Summary: Recently, Jesse, the creator of Base, and Toly, the co-founder of Solana, engaged in a heated debate over whether content tokens have value. The Zora platform mints every piece of social content a user publishes into a token that can be freely traded on the Base chain, which has sparked widespread discussion.
The core of the controversy: Jesse believes content tokens represent the differing value of different pieces of content and have their own internal logic, while Toly questions the lack of any value-verification mechanism (such as clear rights like advertising revenue sharing). Jesse later acknowledged that it is still a game of value discovery, aimed at rewarding creators. Another important driver of this discussion is the recent sharp rise in the price of the Zora platform token. So the question arises: do content tokens have "fundamentals"? Are those fundamentals grounded in creators' content value, or in the prosperity of public-chain ecosystems like Base? Will the content economy usher in a "second spring" as a result?
On the evening of July 31, BlockBeats invited Mable, the founder of Trends, Edison, the founder of Tako Protocol, Liu Feng, a partner at BODL Ventures, WANG Chao, a researcher in crypto and AI, and Carl, the business head of NAVI Protocol, to discuss the development and trends of content tokens and the creator economy under the theme "Content Tokens: Speculation, Hope, or the Second Spring of the Creator Economy?"
The projects discussed in this session include:
Zora: A platform that turns content into tokens. When users publish a piece of content (similar to an Instagram post), the system automatically mints this content into a token that can be traded.
Pump.fun: A meme launch platform that focuses on turning memes or internet jokes into tokens for user trading.
Friend.tech: A social application where users buy "keys" (tokenized access rights to a specific personal IP's private chat group) for the creators they favor. Its speculation centers on the social attributes of these "keys."
Trends.fun: By mapping tweets from Twitter onto the Solana network and minting them into tradable tokens, it attempts to establish a bridge between "content and trading."
BlockBeats: Welcome everyone to today's Space, and thank you to all the guests for joining us. Today's topic focuses on "content tokens," a hot issue that has sparked widespread attention and discussion in the crypto community. First, let's have everyone briefly state their position in this debate on "content tokens" in one or two sentences.
Mable: The discussion around content tokenization is indeed complex. Overall, I agree with Jesse's fundamental direction of "content can be tokenized," but it is important to clarify that not all content is suitable for tokenization. In fact, if a piece of content is minted into a token but lacks market trading activity, it indicates that it does not possess market-recognized value. Toly's concept of "digital slop" is quite insightful—it reminds us that not all on-chain information has trading value or needs to be priced. However, I believe Jesse's vision of "millions of content tokens representing different information may emerge daily" does represent a possible development direction.
Edison: I agree with Mable's viewpoint. Jesse's core argument is that every piece of content has its value. But as Toly sarcastically pointed out, if this is just a gambling game, then let's face it honestly, rather than disguising it as "empowering content creators" while engaging in financial speculation.
Liu Feng: I believe that these content tokens are essentially still garbage.
WANG Chao: As a newcomer to this field, I just recently caught up on some background. I still don't fully understand what they are speculating on, but from my current understanding, I do not agree with Jesse's position. I feel that these attempts are trying to analyze and package social behavior or content expression in a highly financialized and structured way, which seriously underestimates the backlash effect on the entire ecosystem after introducing financial leverage. From the current situation, I do not believe this is a mature or effective solution.
Carl: I do not take sides. From an objective perspective, the key trigger point of this debate lies in the viewpoint of DelComplex analyst Sterling, who pointed out that about 80% of the content tokens on the current Pump.fun platform either lose liquidity or drop to zero value. This is actually of significant reference value, indicating that Zora is essentially "old wine in new bottles," and its business model has not fundamentally broken through existing paradigms. I strongly agree with this statement. While Jesse's proposition that "all creative content has value" is commendable, a specific analysis of the asset attributes of Zora and Pump.fun reveals that they are indeed difficult to distinguish in terms of their essential characteristics.
BlockBeats: Do products like Zora, Pump.fun, Trends.fun, and Friend.tech share the same underlying logic of "turning content into tokens"? What are the different types of mechanisms? (e.g., dual-token models, prediction markets, one-post-one-token bonding curves)
Edison: From an investment perspective, these types of products are essentially quite similar; they are all attempts to "trade" or "financialize" content or social behavior. If we look back at the development history of Friend.tech, it initially aimed to achieve the "securitization" of social relationships, which is a serious goal in itself, but even so, most projects ultimately still struggle to escape the fate of being fleeting.
Zora's operational mechanism automatically tokenizes user-published content, but the problem lies in its lack of openness. Although it presents itself as decentralized, it fundamentally resembles a simple Web2 application that has grafted a token system. It requires users to fully migrate to a new platform, akin to asking people to accept another Instagram, which imposes a very high user migration cost. Beyond the token shell, it has not built a deeper value system.
In contrast, Trends.Fun has chosen a different path: it filters quality content from existing social platforms (currently mainly focusing on Twitter) and incentivizes users to discover, mine, and recommend potential quality content through a tokenization mechanism. In this process, the token actually plays a dual role: both as an information filter and as a value capture tool.
Overall, these platforms exhibit significant speculative attributes and are essentially structured as PVP games. However, they each have different emphases: Zora emphasizes content originality, Friend.tech focuses on reconstructing social relationships, Pump.fun seeks to ignite emotions, while Trends.Fun concentrates on content discovery and repackaging. Although these differentiated attempts vary in direction, they are all still exploring new possibilities for monetizing content value.
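The "one post, one token" bonding-curve mechanism mentioned in the question above is worth making concrete. Below is a minimal sketch using a linear curve, where the curve itself acts as the market maker for each piece of content. All parameters (base price, slope) are illustrative assumptions, not any platform's actual pricing model.

```python
# Minimal sketch of a "one post, one token" linear bonding curve.
# The curve is the market maker: price rises deterministically with supply,
# so early buyers profit if later demand arrives. Parameters are illustrative.

class ContentToken:
    """One instance per published post; the curve prices every trade."""

    def __init__(self, base_price=0.0001, slope=0.000001):
        self.base_price = base_price  # price of the very first unit
        self.slope = slope            # how fast price rises with supply
        self.supply = 0.0
        self.reserve = 0.0            # funds locked in the curve

    def spot_price(self):
        return self.base_price + self.slope * self.supply

    def buy(self, amount):
        """Cost = area under the line from supply to supply + amount."""
        s0, s1 = self.supply, self.supply + amount
        cost = self.base_price * amount + self.slope * (s1**2 - s0**2) / 2
        self.supply = s1
        self.reserve += cost
        return cost

    def sell(self, amount):
        """Refund is the same area walked backwards, paid from the reserve."""
        s0, s1 = self.supply, self.supply - amount
        refund = self.base_price * amount + self.slope * (s0**2 - s1**2) / 2
        self.supply = s1
        self.reserve -= refund
        return refund

token = ContentToken()
early = token.buy(1000)  # first buyer pays near the base price
late = token.buy(1000)   # the same amount costs more once supply exists
assert late > early      # the speculative pressure is built into the curve
```

The sketch makes the guests' point tangible: the mechanism guarantees liquidity and price discovery for any post, but it also hard-codes the PVP dynamic, since every seller's exit is funded by a later buyer's entry.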
Mable: You just mentioned that Zora is not open enough; could you elaborate on what "not open enough" means?
Edison: What I mean by not open enough is in terms of the core spirit of Web3—"composability." Web3 has always advocated that each protocol or application should focus on doing one thing, allowing for flexible combinations between them. However, platforms like Zora, including the Farcaster we discussed earlier, attempt to establish a closed application ecosystem: from registering accounts, writing content, to finally tokenizing the content, the entire process must be completed within the same app.
This closed-loop architecture resembles Web2 logic—it requires users to enter a new platform and operate from scratch, which poses a very high barrier for content creators or users. The cost is evident: the volume of content on the Zora platform is far from comparable to Twitter or other open social platforms, and the quality of content is also limited, unable to support a broader and higher-quality content ecosystem. If it opened data interfaces and supported cross-platform content access, it could potentially form multi-ecosystem interactions like today's Facebook, Twitter, and TikTok.
Liu Feng: I understand that your point about not being open is because Zora requires content to be generated on its platform and recorded on its chain. But the reality is that currently, 99.99% of content is actually not on-chain.
Edison: Yes, it not only requires data to be on-chain but also demands that users complete all operations using its complete account system within its app. This is clearly a centralized, Web2-style logic.
Liu Feng: But from another perspective, Zora's approach also ensures the "originality" of the content—content generated within the platform's system can truly "carry" this so-called content token. If it merely pastes external links of Web2 content, it does not possess the original ownership relationship, and the content and tokens are actually disconnected.
Edison: But the problem is that content itself can be copied. There has never been a piece of content that is truly "owned." When you post content, it can be mirrored or referenced on other platforms at any time.
Liu Feng: However, if it is an architecture like Mirror, it will indeed put content on-chain and grant original authors ownership. What Mable is doing with Trends actually has a similar approach.
Mable: For Trends' content, such as tweets deleted by Peter Schiff or Elon Musk, we archive them via Metaplex. It is not simply saving an external link but archiving the tweet itself. Trends creates a mirrored version of the content, so that even if the original platform deletes it, it remains accessible on-chain.
You just mentioned "openness," and I understand that Edison means: "The creation, dissemination, and tokenization of content can actually be completely decoupled." They do not need to be completed on the same platform. For example, content can originate on Twitter, be disseminated through Twitter's social graph, and then be tokenized on Trends, where Trends archives it and puts it on-chain. This decoupled structure aligns more with the "composability" logic of Web3 and can maximize the efficiency of value capture across ecosystems.
Edison: Yes, this is the typical idea of composable architecture from the DeFi world back in the day.
BlockBeats: In other words, content token platforms do not need to worry much about whether the content is original; they should focus on designing mechanisms that identify and capture content worth assetizing? One could even say that what truly matters is how to construct a PVP game structure between pieces of content.
Edison: That's right, the core issue is: how do you find "high-quality content worth assetization"? This is far more important than which platform the content was originally published on.
BlockBeats: From "tweet NFTs" to Mirror, then to Friend.tech, and now to content tokens, it seems that the idea of "turning content into collectibles" has never really succeeded. Do you think this is a problem with the product logic? How can we find high-quality content suitable for tokenization?
WANG Chao: I completely agree with Edison’s point that "finding good content is key." In the past, I wasn't particularly interested in the concept of "content tokens." The reason is that I have never been able to figure out what it really means to financialize content. Does it truly create the results we want? If it's just to create a wave of speculation and temporarily inflate prices, then some products have indeed achieved that. But if we look at it from a more idealistic and non-speculative perspective, I still haven't understood the true value of these projects. My question is: what kind of assets are worth financialization? Where is the value in financialization?
This also leads to another point of consideration: when we discuss "content tokens," what exactly are we trading? Is it the content itself? Attention? Or something else? What behaviors does the design of this mechanism incentivize? Ultimately, what consequences does it bring? These are questions that every platform needs to think seriously about. Personally, I believe that if this financialization mechanism can ultimately filter out high-quality content and allow it to be disseminated, that would indeed be a meaningful thing. But can it really achieve this goal? I remain skeptical.
Carl: I think we need to recognize that "good content" is also time-sensitive. For example, if someone tweets a market judgment at a specific point in time and it later proves to be correct, that can be considered long-term valuable content. However, other things like memes or emoticons can also go viral in a short time; they can also be considered "good content," but they are time-sensitive. Therefore, when we talk about "content assetization," we must also recognize that some content is sustainable (long-lasting), while others are just fleeting.
The current practice of turning these contents into tokens is essentially still PVP. You post a meme, then issue a token, others take it over, the price surges, and ultimately most tokens go to zero. This aligns with the viewpoint I previously quoted from Sterling, that most of these platforms are just "old wine in new bottles." They appear to have new concepts, but at their core, they are still old-style speculative games.
Liu Feng: I want to raise a relatively basic but essential question. Earlier, Chao mentioned that "we actually don't need to deliberately define content tokens." I basically agree, but there's a point I think is worth further exploration: are the hotly traded meme tokens on Pump.fun considered "content tokens"? This is something I want to ask everyone: do we need to redefine the boundaries of "content tokens"?
BlockBeats: This happens to be our third topic today. There seems to be a trend where creator economy platforms are becoming meme-oriented, and meme launch platforms are becoming creator economy-oriented. So are today's content tokens driven by speculation and attention like memes? Where is the boundary between content tokens and meme tokens?
Liu Feng: I think these questions are all very good. I want to start with the topic of "where is the boundary between meme tokens and content tokens?" In a narrow sense, we roughly know what "content tokens" are, but taking Pump.fun as an example, many narratives have been attached to it. Some people see it as a live streaming platform, others call it a hot content creation platform, or even a news platform.
So the question arises: do the content tokens generated on Pump.fun count as "content tokens"? The platform incentivizes users to create content, which is essentially a function of content tokens; but if these contents ultimately turn into meme tokens used for speculation, can they still be considered content tokens? This is the first question I want to clarify. The second question is: if the incentive mechanism for content tokens is solely based on attention, is it sustainable?
Mable: I will respond starting from the earlier questions. Teacher WANG Chao just asked a core question: "What kind of things are valuable?" In fact, our understanding of the concept of "value" today is completely different from that in 2018 and 2019. Back then, we believed that valuable tokens should be long-term holdable, either succeeding or going to zero.
But the essence of a token is actually a data container. It can carry a lot of content and can also be extremely lightweight. I agree with what Carl just said: everything is time-sensitive. If 99% of content is destined to disappear, then why should we expect that 99% of tokens won't go to zero?
So the value of content tokens depends on whether they carry real attention, emotion, or consensus at a certain point in time. Content that has no value at the moment, when minted into a token, will not attract any buyers, even if it’s from Elon Musk.
Some believe that people are pricing content, and that is true. But this is also a front-end problem for many content platforms: if everything relies on celebrities to drive attention, only a very small number of people's content can ever be noticed. On Trends, however, we find that it is not only "big V" (top influencer) content that trades frequently; some small accounts' content also spreads widely in specific contexts. That has less to do with the creator's identity and more to do with how concentrated attention on the content itself is in a given scenario.
So I really like a saying: "Not all content is of equal value." Some content is just poor and naturally won't be purchased, and will be eliminated by the market. I believe the market mechanism can filter out truly consensus-driven content tokens; this consensus doesn't have to be "noble" or "correct," but it is real, because only real consensus can bring in funds.
Regarding the distinction between meme tokens and content tokens raised by Teacher Liu Feng, I have some personal understanding (which does not represent the official position of Trends): Pump.fun is trying to turn live slices and other content into tradable assets, which is not wrong. But currently, its product design still leans more towards "trading over content," with the core being speculation rather than building a social network.
If we trace back to the original Pump.fun, the data container it carries is abstract concepts and internet culture. It has no background and relies solely on the dissemination of community consensus. In contrast, on Trends, a previous article by Meow titled "Social Monies" has gone viral multiple times in different social graphs and contexts, being rediscovered and disseminated repeatedly, and its token value has been re-priced multiple times as a result. This indicates that content tokens can be reactivated as the environment changes.
This highlights a key distinction: content tokens can be "composed and reused across contexts," while memes are more like "snapshots of abstract emotions." Of course, you could argue that the unity of memes helps concentrate liquidity, but if we start from the direction of "content tokens," it emphasizes the expression and value capture ability of content in different times and spaces. This is what I believe is the biggest difference between it and meme tokens.
BlockBeats: Just now, you mentioned the "composability of content tokens." What does this specifically refer to? How can different content tokens be combined?
Mable: Let me give a casual example (this is not a plan for any product, just my personal idea): suppose you want to create a token related to Vitalik. You could certainly tokenize him directly, but another approach is to filter out the content tokens in his history that have been frequently traded and have held a certain market value, and package them into a "basket." The basket represents a collection of Vitalik's published content that the market has validated as valuable. It does not necessarily represent Vitalik's "lifetime value," but it does represent the portion of market consensus that has been "preserved." Trading such a composite content token might therefore be more valuable than trading a token tied to Vitalik himself.
Carl: It sounds a bit like packaging all of Vitalik's representative content tokens into an ETF or a content index (Index Token)?
Mable: You can understand it this way. But it should also be clear that if you really tokenize all his content, many of them will definitely go to zero. I don't believe that every piece of content he posts will be traded and valued long-term; only a portion will "remain." And those that are filtered and preserved by the market are the ones that truly have relatively long-term value. This value itself is relative, just more durable compared to other more "short-lived" content.
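Mable's "basket" idea can be sketched mechanically: keep only a creator's content tokens that the market has already validated, then weight them into one composite index. The thresholds, field names, and sample data below are illustrative assumptions, not any real product's spec.

```python
# Sketch of the "basket" / content-index idea from the discussion:
# filter a creator's content tokens down to the market-validated survivors,
# then weight the survivors by market cap. All thresholds are assumptions.

def build_basket(content_tokens, min_trades=50, min_market_cap=10_000):
    """Return {token_id: weight} over tokens the market has validated."""
    kept = [t for t in content_tokens
            if t["trades"] >= min_trades and t["market_cap"] >= min_market_cap]
    total = sum(t["market_cap"] for t in kept)
    return {t["id"]: t["market_cap"] / total for t in kept} if kept else {}

# Hypothetical history: most content tokens go to zero, a few "remain".
vitalik_posts = [
    {"id": "post_a", "trades": 900, "market_cap": 120_000},  # survived
    {"id": "post_b", "trades": 3,   "market_cap": 40},       # went to ~zero
    {"id": "post_c", "trades": 210, "market_cap": 30_000},   # survived
]

basket = build_basket(vitalik_posts)
assert "post_b" not in basket             # the market filtered it out
assert abs(sum(basket.values()) - 1) < 1e-9  # weights form a full index
```

This mirrors Carl's ETF analogy: the basket only prices what the market has already preserved, rather than every post the creator ever made.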
BlockBeats: So simply put, content tokens are often concrete, pointing to specific content (like a tweet or a video); while meme tokens are more abstract, carrying a kind of emotion or cultural consensus. Can this be understood this way?
Mable: Yes, at least early memes indeed took this form.
Edison: When I was previously doing a podcast with Teacher Liu Feng and Jack, I suddenly had an "epiphany" about the logic of content tokens. My definition of Web3 is actually very simple: the ideal state of Web3 is "posting content should be as easy as posting on social media," but from the moment that content is posted, it becomes a tradable asset.
If our blockchain infrastructure is performant enough and costs are low enough, then theoretically, every piece of content published is not only the generation of information but also the generation of assets. On traditional social media, posting content merely generates a piece of data that is almost impossible to effectively price and trade. In Web3, we can achieve assetization of our content at an extremely low cost (even just a fraction of a cent), thus gaining liquidity and market pricing ability.
BlockBeats: So your logic is: by allowing every piece of content to be "assetized," we can filter out the truly valuable content from the vast amount of content, right?
Edison: Exactly. You also don't need to care whether the content in your social circle is right or wrong; as long as the content is widely recognized at a certain point in time, it will naturally reflect a price. This assetization mechanism itself is a filter that helps us discover truly valuable content.
BlockBeats: This perspective is quite interesting. In the past, we might have measured "good content" by looking at its traffic performance or long-term citation value. But if we measure content in terms of assets, it adds a dimension of "price discovery," even creating a whole new content discovery mechanism?
Edison: That's right, and this is also why I pay attention to Trends.fun; it has truly shown me the possibilities and future space behind "content assetization."
WANG Chao: I think the two are increasingly converging. There is an interesting phenomenon in this field: the boundary between meme tokens and content tokens is becoming more and more blurred. The meme tokens on Pump.fun, for example, are very honest; they straightforwardly admit they are just symbols of attention, so they run 100% on speculation and attention. That purity has become their defining trait.
On the other hand, content platforms like Zora and Friend.tech are indeed trying to inject more into their tokens. Analyze Friend.tech closely and you find it did build a value system, but that system is not very stable and its value is hard to measure accurately. Zora is currently issuing token rewards to creators and developers, which is the right direction, but honestly the appeal of this added value is still not strong enough; at bottom it still carries a strong meme attribute.
Interestingly, both sides are learning from each other. On the meme token side, Pump.fun has started to call itself a creation platform, trying to find a more concrete narrative for itself; on the content platform side, there is a strong envy for the viral spread and market heat of memes. This two-way borrowing and integration is giving rise to some new gameplay. I think this trend is particularly worth observing because it reflects the market's search for a balance— wanting to maintain the viral spread of memes while also establishing a more sustainable value support.
Liu Feng: I am contemplating the essence of content value. Currently, it seems that both platforms like Pump.fun (which do not really care about the content itself) and the now-popular concept of "content tokens" are essentially driving traffic and incentive mechanisms through the form of content creation.
The other group, those now discussing "content tokens," also hope to drive traffic through the continuous creation of content, and to reach the trading activity, the so-called critical point, or the short-term consensus mentioned earlier. From what we can see now, it is only because of traffic that these consensuses can form at all.
So the core logic of the two models is actually consistent: both try to form short-term consensus by aggregating traffic in the content-creation process. Only with sufficient traffic can market consensus be generated, which means the ultimate business logic points in the same direction: monetizing traffic and user attention. At present this seems to be the only path for realizing content value, and it exposes the inherent contradiction in the value-creation process.
Mable: This is particularly interesting. I have noticed that some content has very low readership on Twitter, but is very active in private circles. This made me realize that the way consensus is monetized is undergoing subtle changes. In the past, it was difficult to directly use a "container" to monetize all public and private traffic, or to monetize through this "container" across various fields.
Previously, our methods of monetizing traffic were very fragmented: public traffic relied on views, while private traffic relied on subscriptions. But now, through the content token as this "container," we have achieved for the first time the condensation of consensus from different channels onto the same object. Imagine when you issue a content token: if the market recognizes the traffic value of this content, even if not everyone understands this correlation, as long as a portion of people accept this logic, this token can flow freely in both public and private domains.
This creates a new possibility: tokens act like a thread, connecting the consensus value scattered across various channels. Although many people still cannot understand the equation "high likes = high financial value," when some early adopters begin to accept this logic, the token's CA (contract address) itself becomes the best medium for dissemination.
The most wonderful aspect of this mechanism is that it is no longer limited by the traffic rules of a single platform, but allows value to be discovered, recognized, and traded anywhere. Although this shift is still small, it indeed represents a brand new way of condensing value.
Liu Feng: Similar to the so-called long-tail asset value discovery process.
Edison: This actually touches on an essential difference between Web2 and Web3. In the Web2 era, traffic is the endpoint; platforms pursue pure attention economy. But Web3 has taken a crucial step forward—traffic is just the starting point; the real endpoint is forming value consensus.
A clear example is: some Web2 projects claim to have tens of millions of users, but if this traffic cannot be converted into lasting value recognition (for example, no one is willing to pay or hold), the ultimate value will still go to zero. Conversely, some NFT projects may have a small initial user base, but if the holders firmly believe in its value, this strong consensus can support a lasting price.
Thus, the innovation of Web3 lies in: it establishes a value conversion mechanism from traffic to consensus. Traffic is just sowing; consensus is harvesting. The creation of wealth is essentially a process of consensus condensation. This explains why, in the crypto world, a small community of true fans may be more valuable than a million-level general traffic.
Liu Feng: Based on this, can we analyze the situation mentioned by Mable regarding the so-called private domain content tokens? What is the mindset of the traders?
Mable: What I meant is: some content in the public domain has large buy-side demand from a few top accounts but lacks virality on social media. Those holders can then drop the content token, or a content token they discovered, into their own private groups.
Liu Feng: I see, so what you are describing is still the monetization of top accounts' traffic.
Mable: Yes, it depends on how you define it; it is their private-domain influence.
BlockBeats: Base and Zora are indeed doing some things right in certain aspects. The market is still highly focused on traffic value, even though content tokens essentially focus more on aggregating private domain value. The core of content tokens is: first, a small but highly recognized community around the content needs to be established, and these core supporters form the initial purchasing power base. However, it is undeniable that the attention economy and traffic effects are still significant driving forces in the current market.
Teacher Mable's data analysis regarding Base and Zora also confirms this. Data shows that content tokens alone can drive considerable trading volume. In contrast, the Solana ecosystem faces some challenges in its token issuance mechanism, such as high rental costs and fees for infrastructure like Metaplex. This also explains why Teacher Mable places such importance on traffic as a key metric.
Mable: It is necessary to clearly distinguish between the quantity of assets issued and actual traffic. In the current environment, especially on Twitter, which is flooded with AI-generated content and automated replies (what we call engagement farming), it is difficult to accurately measure real traffic and influence. Even with promotional spend it is hard to judge the actual effect; this superficial prosperity often contains false elements.
The core point is: large-scale asset issuance creates more opportunities to participate, which is a basic condition for mass adoption. Take Solana as an example: if 50,000 tokens are issued daily and each deployment costs $2-3, the total daily cost reaches $100,000-150,000. If issuance were scaled to 500,000 per day, the cost pressure would grow tenfold. This relatively high cost structure is why earlier derivative projects had to invest heavily in building automation tools (bots).
All these discussions are premised on: the market mechanism must effectively filter value, and there must first be a sufficient quantity base. In the case of the Zora platform, the challenge it faces is: if all content is limited to being issued natively on Zora, even if asset tokenization is achieved, the vast majority (possibly 99.99%) may lack actual value. In contrast, the Twitter platform may only need to filter out 10% of high-quality content for tokenization to form effective trading. Achieving value discovery in such a limited pool indeed faces significant difficulties.
BlockBeats: So the number of participants is also an important point.
Mable: I believe so. In a typical quality-content ecosystem, 5% of people create while 95% read. When I was talking with an investor about Trends, he pointed out that many creators with keen network perception but limited initial capital could screen hundreds of potential pieces of content daily and earn revenue by surfacing the roughly 5% that turn out to be high quality. The model is theoretically feasible. But if issuing each piece costs $2-3, a daily outlay on the order of $150 is a substantial barrier for these people. That dampens creative enthusiasm and, to some extent, limits the diversity of the ecosystem.
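Mable's cost figures above are straightforward to work through. The $2-3 per-deployment cost and the daily volumes come from the discussion itself; everything here is back-of-the-envelope arithmetic, not measured chain data.

```python
# Back-of-the-envelope check of the issuance costs cited in the discussion.
# Inputs ($2-3 per deployment, 50k-500k daily mints) come from the speakers;
# nothing here is measured on-chain data.

def daily_issuance_cost(tokens_per_day, cost_per_deploy):
    return tokens_per_day * cost_per_deploy

# Ecosystem level: 50,000 mints/day at $2-3 each.
assert daily_issuance_cost(50_000, 2) == 100_000
assert daily_issuance_cost(50_000, 3) == 150_000

# Scaling to 500,000 mints/day multiplies the bill tenfold.
assert daily_issuance_cost(500_000, 2) == 1_000_000

# Individual curator: a ~$150/day budget at $2-3 per mint covers only
# 50-75 mints, a real barrier for someone screening hundreds of posts.
assert 150 // 3 == 50
assert 150 // 2 == 75
```

The arithmetic supports the argument: value filtering needs a large issuance base, but per-deployment costs put that base out of reach for exactly the small curators the model depends on.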
BlockBeats: Teacher Mable also mentioned that much content is AI-generated. So does AI-generated content, or content tokens, have any value?
WANG Chao: First of all, I believe some AI content does have value, but whether it retains value after being packaged into a token depends on the design of the whole system. It is important to distinguish the value of the content itself from the value of the content token; these are two different concepts. In fact, all data has certain asset attributes; even data stored on a personal computer is no exception. But that attribute is usually weak and hard to price in a standardized way. The push by governments and various institutions to record data as assets on balance sheets is a good example: they are trying to convert data accumulated through traditional means into standardized assets via specific valuation mechanisms, even though that data may not belong to the Web2 category at all.
Regarding the direction of Web3, I see no problem with putting social-network content and other information on-chain. But putting information on-chain and packaging it into a financial asset, potentially a highly speculative one, are actually several different directions, and about the latter I remain skeptical. Information on-chain can indeed open many possibilities, but it does not necessarily need to be heavily financialized. Current designs often focus too much on efficiency and liquidity while neglecting the complexities of the real world. The exploration is valuable, but whether a theoretically perfect financialization scheme can survive the human challenges and mechanism traps of reality, and truly land and sustain itself, remains a significant question.
Regarding the view of free markets, I hold a reserved attitude. As a supporter of free markets, I believe that a completely free market mechanism may not necessarily produce ideal results as expected, although this possibility does exist. This is a supplementary view to my previous discussion.
I would like to take this opportunity to extend my thoughts. Today we are discussing content tokens, but looking at the development history of the crypto field—whether it is content tokens, DAOs, or other directions—there is an implicit core logic: we are trying to maximize the elimination of human factors in the system, or minimize human characteristics. This concept pursues the creation of a theoretically nearly perfect automated system through carefully designed mechanisms, maintained and executed by a perpetually running network. This design indeed exhibits unique aesthetic value and practicality in certain areas. However, I am beginning to question whether this model is applicable in all scenarios. In fact, the origins of the crypto movement—such as the ideological lineage of cypherpunks—can already be seen as a prototype of this concept. In the early days, I fully agreed with this theory, but after observing a large number of project practices, I have begun to doubt this absolutist design approach.
The essence of design in the crypto field is to marginalize human subjective value. Taking DAOs as an example, regardless of the current practical effects, its original design concept is to build an organizational operation framework through consensus mechanisms and smart contracts in a highly abstract manner. This model attempts to replace parts of traditional organizations that rely on human judgment and subjective decision-making with a standardized, programmable rule system. This design approach indeed has its innovation—it attempts to create a collaborative system that does not depend on individual subjective value judgments but is automatically executed by code rules. But the question is, can this extreme abstraction truly replace the complex interactions and value judgments in human organizations?
However, this design approach actually overlooks a key element: the subjectivity of the listener or participant. What we need to consider is, in these mechanized systems, where do human subjective initiative, cognitive judgment, and social relationships stand? Beyond mechanically executing contract terms, should there be other dimensions of participation retained? In the reality of human organizations and social operations, these non-mechanized interactive elements are precisely the most important. Although conceptually, the success of Bitcoin proves that by simplifying design and eliminating certain "problematic" factors, it is indeed possible to create an efficiently operating system. However, this model is difficult to replicate in all fields; most projects are fundamentally unlikely to succeed simply by eliminating "human factors."
For example, the once-popular Play-to-Earn essentially turned entertainment into labor. In the process, the core elements of the game were systematically stripped away: the joy of exploration, the sense of growth from overcoming challenges, and other intrinsic values were marginalized, replaced by mechanical, repetitive gold farming whose sole purpose was profit. Far from liberating players, the design produced the opposite effect: it did not attract genuine gamers, and mainly drew in professional gold farmers from places like the Philippines. This reveals a deeper issue: when we over-emphasize economic incentives, we may undermine the intrinsic value of the activity itself.
This model can indeed be packaged as a high-minded concept, such as promoting "decentralized job opportunities" where one can earn $5 a day or more from home. From a narrow perspective, letting workers in certain regions earn an extra $5 a day is not a bad thing, but it does not mask the fundamental problems arising from the design itself. The current worship of liquidity in the crypto field, including the design of certain content tokens, faces a similar dilemma: when the tokenization process is overly simplified, it often strips away many elements that we may not clearly perceive but that are actually crucial. Although I cannot find the most fitting expression for now, the core question is whether this simplification undermines the integrity of the original system.
BlockBeats: To summarize, when the financialization logic of cryptocurrencies combines with social, cultural, and other content fields, it often leads to a potential problem: the spiritual and creative qualities in human creation are easily marginalized. Taking content tokens as an example, the current mainstream practices tend to emphasize large-scale issuance and market selection mechanisms—this bears a striking similarity to AI content production: generating massive amounts of content and relying on algorithms and market games to filter out a few successful cases. In this process, human subjectivity and creative participation are greatly weakened, which may ultimately lead to a loss of substantial connection between content tokens and their creators.
Liu Feng: Teacher WANG Chao's viewpoint indeed lifts the discussion to a new dimension. Our previous conversation focused mainly on tradability and financialization, which to some extent reduced content tokens to a kind of game. That framing rests on a latent assumption: that a free market can discover value through consensus. But reality is clearly more complex, and many things in the world simply cannot be traded.
Content creation driven by human emotions has its uniqueness; we produce a lot of creations because of emotions, and the emotional commonality we assign to content is invaluable. However, if we try to reflect this through trading, it cannot be accurately realized. Many precious emotional connections and resonances cannot be simply quantified or traded; sometimes, the most genuine content value often exists between specific relationships.
Overdevelopment of financialization can indeed lead to value nihilism. When we simplify everything to trading objects, we may lose those most essential, hard-to-trade dimensions of value. Most people's creative levels may not necessarily be very high, but one reason why they are currently difficult to replace with AI content is the consensus and feelings involved. The dilemma of AI content tokens lies here: even if the technology can simulate the consensus formation mechanism, the consensus lacking a foundation in human emotions is fundamentally different in value. This raises a deeper question: should we reserve space for values that cannot be financialized but are crucial to humanity? Especially in fields like content creation, which are inherently rich in humanistic qualities.
Indeed, there are many valuable things that AI cannot produce, such as content that requires real emotions and life experiences. Conversely, there are some things that AI can do that are difficult for humans to achieve. A direct example is programmatic content like monitoring whale transfers; if made into a tradable product, it would absolutely outperform our manual efforts. So the key still lies in demand—AI and humans each have their strengths, and it is unnecessary to see them as mutually exclusive.
The real difference is not who created something; what matters is whether, at a given time, place, and context, the content is genuinely usable, whether it can evoke feeling, emotional resonance, or knowledge, even just the acquisition of information. If it delivers any of that, it works. So I think Edison is also right: do not discriminate against AI creation, but do not underestimate the unique value of human creation.
Edison: A ready example: everyone is chasing AGI, right? If one day AGI suddenly comes into being, the moment it speaks its first sentence will be very precious—I would definitely be willing to exchange it for Bitcoin. So who creates the content is not important; what matters is whether the content itself can touch some people.
Carl: I often follow and trade AI-themed tokens, so I should have some say here. I don't think we can completely deny their value, but there is an interesting data point: by CoinGecko's listings, the average lifespan of AI-plus-blockchain concept projects from 2023 to 2024 was only 5 to 6 months. Survival, in other words, depends on whether you can sustain community enthusiasm and token value. AI content tokens are indeed prone to speculation; in one wave of the market, many tokens rose 30-40x in just two or three days and hit 100x within a week. But the projects that truly survive quickly find practical uses for their tokens once market value rises. They put the token to work, for example by letting it be exchanged for AI tools that generate content, rewarding contributors who provide high-quality data or AI models, or letting holders vote on the direction of AI development. Ocean Protocol is a good example: its token economics is designed around a data marketplace. So rather than limiting our thinking to whether AI content tokens have value, we can expand it to how to increase the value of those tokens.
In fact, it is not just AI tokens; any project must think about how to increase the practical use of tokens. In this fiercely competitive and low-threshold field, the key to surviving or even becoming a leader is to find a long-term sustainable development direction. The projects that have survived from last year to now have all seized the opportunity while seriously doing practical work, giving their tokens real utility.
Mable: In the future, social-media content is likely to be half AI-generated, and by then the source of the content (human or AI) may no longer matter much. On this question itself, I think our generation takes financialization quite seriously, while the younger generation (especially those born after 2005) understands content financialization completely differently. For them, expressing an attitude with a token is as natural as a like or a share is for us. It is not traditional investment behavior but a new way of expressing value.
The background of this trend is that the cost of content production has approached zero. In an environment rich in information, people need to rely on scarcity to condense consensus. Tokens precisely provide this scarcity carrier. This does not mean that all content will become mechanized—take podcasts, for example, which emphasize human connection; although the data transmission efficiency is low, it is precisely this "inefficiency" that brings a unique sense of intimacy. I think this is not contradictory; it does not mean that everything is condensed into some form of labor or lacks human warmth.
Of course, on the other hand, some media do not emphasize efficiency. Take podcasts: human speech carries only on the order of 10 bytes of information per second, and conversation is slow-paced. Much of the time you are listening for the connection and feeling you have with the person, not simply for the content, so podcasts lack strong trading attributes, though they may carry something of an ownership attribute.
Returning to the earlier topic, the key point is that we may be taking financialization too seriously. When tokens serve as data containers, switching from one content token to another is essentially an exchange of information and value. The same goes for GameFi: we cannot flatly say its only users are gold farmers, and some argue that financialization actually makes such activity easier to supervise, which is fair enough. I often think about why young users participate in projects like xxx.Fun (such as Pump.fun or Trends.fun); they may not be in it purely for speculation but are seeking new ways to express themselves. This generational difference in cognition may be the trend we need to understand.
Liu Feng: I want to follow up on that. By your account, the kinds of content this "container" can actually hold are not as rich and varied as one might think.
Mable: I wouldn't say that; you can put anything in, but I don't believe everyone will go crazy trading podcasts. However, you can issue tokens, and some people are willing to buy, while others can hold; that's what I mean.
Liu Feng: But theoretically it is not a good carrier for everything; isn't simple, blunt content actually the best fit?
Mable: Short-form content, yes. I think that is definitely more suitable.
Liu Feng: In the next decade, the forms of content and methods of dissemination will undergo tremendous changes. Just as we could not have imagined ten years ago that short videos would become mainstream media, platforms like Douyin have completely reshaped the content industry. When the medium of communication undergoes transformation, not only will the forms of content change, but even the way we discuss the value of content will be entirely different. In fact, we should step outside the current framework of thinking to view this issue. The forms of content and methods of dissemination that we understand today are likely to be completely overturned in the future. If tokens truly become the mainstream carriers of content, then the entire game rules of the content industry will be rewritten. Who could have predicted back then that short videos would replace text and images as the primary form of content?
Edison: Yes, it is actually foreseeable. With the rapid development of AI technology, the speed of content production and data generation will grow exponentially—perhaps 100 times, 1000 times, or even more than now. The traditional "posting-liking" model will be completely overturned; these previous generation actions will be directly amplified, and the entire internet will face an unprecedented data explosion. But the key is: the total amount of human attention is constant; there are always only 24 hours in a day. This leads to a fundamental contradiction—when the supply of content expands infinitely while the demand side (user attention) remains constant, the entire game rules of the content industry will inevitably be reshaped. As Liu mentioned, the future methods of content production and dissemination mechanisms will require entirely new solutions.
Liu Feng: Everything is changing. Think about how we expressed our likes and dislikes for content decades ago: if we thought a program was bad, we would simply turn off the TV; if we found the content lacking, we would throw away the newspaper; if we liked it, we would share it with friends. In the internet age, liking and sharing links became the new way to interact. In the future, buying and selling behaviors themselves may become our most direct feedback and interaction with content.
Perhaps we can think from a different angle: as WANG Chao mentioned, those forms of content that cannot be carried by content tokens and are not suitable for financialized dissemination will become more precious in the real world. Precisely because they do not fit into this new system, these seemingly "outdated" interactive methods retain the purest core of value.
In this new form, financialization will become the underlying logic—that is the essence of this field. Two modes will coexist: one side is the precious human interactions that cannot be tokenized, and the other is a completely financialized new content ecosystem. This may be the most interesting part of the future.
BlockBeats: Let's discuss one last question that everyone is particularly concerned about: is content token really going to revive? What are the necessary conditions for its revival?
Mable: Today I posted a tweet originally hoping to spark a discussion within the Solana team about the importance of performance. Low latency used to be discussed half-jokingly, or treated as something only a future high-performance public chain would need, but I believe a truly global smart-contract settlement layer must achieve high performance. The industry's value-exchange mechanisms are already fairly mature (though far from perfect), and the next key breakthrough is lowering the participation threshold through performance improvements so that more people can join this ecosystem. In my view, that is the most important direction.
Liu Feng: I definitely used to dislike this financialized approach to guiding and disseminating content; I found it very annoying. But in my last chat with Edison, he gave me a real moment of enlightenment: why do we overthink it? In the future we may be able to issue billions of tokens every day, each with a specific use. That is the real explosion, or what is now popularly called "emergence." Just as quantitative accumulation in AI suddenly produced a qualitative leap, entirely new content forms, dissemination methods, and token use cases will naturally emerge at that point.
We should boldly imagine such scenarios and accept possibilities that break all our current conceptions: a content ecosystem that completely overturns existing cognition may yet emerge. Look back at 30 years of internet development: in the early 2000s, at the peak of QQ, reporters covered how QQ monetized memberships and asked whether QQ dolls could outsell Disney's; the most cutting-edge business imagination was merely selling memberships and dolls. Who could have foreseen today's prosperity of short video and livestream e-commerce? We now need to break through the limits of our thinking and courageously envision the seemingly fantastical. Just as no one could accurately predict the form of the mobile internet 20 years ago, the future content-token economy will exceed the boundaries of our current imagination.
BlockBeats: So Liu's underlying idea is similar to Mable's. If current content is mostly garbage, is the problem that we simply haven't produced enough, and we need more high-quality content?
Liu Feng: But in the future, there may be more garbage; however, that's okay. We can still find treasures among the garbage, so there's no need to be overly concerned about this.
Mable: But isn't this mechanism just a mechanism for filtering out garbage?
Liu Feng: It is also possible that what gets filtered out is all garbage.
Edison: The mission of our industry at this stage of Web3 is actually very clear: after the current meme craze, perhaps new assets will break through from the current daily issuance of 50,000 to 100,000 to a scale of 5 million or even 50 million. Just as the recent meme coin craze has shown, when asset creation reaches this level, true mass adoption and ecological innovation will naturally emerge.
WANG Chao: I basically agree with the views of the previous few teachers. Although my earlier discussion was more from a relatively negative or critical perspective, I have always maintained a highly supportive attitude towards innovative exploration. I believe that any emerging thing needs to be validated through practice, and no project can be perfectly designed in its initial stage—all successful entrepreneurial cases mature gradually through continuous iteration. Therefore, I have always believed that such innovative attempts are especially worth encouraging. Personally, although I no longer directly participate in project operations, as an investor, I often invest in projects that seem a bit "strange," or even those with a clearly low probability of success. One of my considerations is that I believe the innovative attempt itself is worth encouraging; the second is that regardless of success or failure, the process is particularly precious. Therefore, I sincerely hope to see more teams exploring and trying in this field.
Carl: I completely agree; experiences of failure also hold significant value. The same applies in investing, where targets come in many types. Often you are betting on the team or the founder and whether they have passion, or on the track and whether it is innovative enough. The forms vary too: whether you invest directly in a startup or buy its tokens, both are ways of participating in the market. But for the content track to develop sustainably, many key issues still need deeper exploration.
Personally, I also ventured into related fields in 2021 (I won't mention specific failed cases for now), and I deeply realized that sustainability is the core of whether meme or content tokens can survive long-term. How to stimulate market hype and maintain that hype continuously is a highly challenging problem.
In addition, I hope to see more exploration of content ecosystems, such as incentive mechanisms: how to balance token rewards to incentivize content creators while attracting investors; the balance between speculation and investment: how to distinguish between short-term speculators and long-term supporters, and coordinate the interests of both parties?
The content track is indeed highly challenging. I have personally tried and encountered failures, fully aware of the difficulties involved—sustaining community operations is crucial, and external factors such as market cycles must also be considered. Finally, I hope to see more innovative models in this field, rather than simply replicating existing platforms (like some "invest in content - issue tokens - go on Curve - community self-sustaining" routines). Content tokens are still a relatively new concept and require more mature mechanisms to promote their development.
BlockBeats: Today, we have conducted an in-depth discussion around the theme of "Do content tokens have value?" and further extended it to the analysis of AI tokens and future content consumption forms. This discussion encompasses multidimensional thinking and possesses considerable depth and breadth.
Here, I would like to make a brief summary: the current way of questioning "Do content tokens have value?" may itself contain certain misconceptions. The key at this stage is to create richer content assets to attract more participants to join. Through large-scale practice, we will have the opportunity to filter out the forms of content assets that may generate value in the future—these forms may exceed our current imagination.
Additionally, I want to emphasize a key point worth the audience's attention: cultivating network sensitivity is crucial. As more diverse content assets continue to emerge, I suggest that everyone draw on the thinking patterns of meme investment, deeply consider how to establish broader consensus, or explore other innovative paths.
Due to time constraints, we regretfully need to conclude today's space. Once again, thank you to all the guests for participating in the discussion today, and thank you to all the listeners for accompanying us. We will see you in the next space.
Space link: https://x.com/i/spaces/1vAxRDOWWqkGl
Disclaimer: This article represents only the personal views of its author and does not represent the position or views of this platform. It is shared for informational purposes only and does not constitute investment advice of any kind to anyone. Any dispute between users and the author is unrelated to this platform. If any article or image on this page involves infringement, please send the relevant proof of rights and identity to support@aicoin.com, and the platform's staff will verify it.