A Brief History of God-Making in Silicon Valley: Moltbook, Cyber Mirage, and the Industrialization of Narrative


Original Title: "A Brief History of God-Making in Silicon Valley: Moltbook, Cyber Mirage, and the Industrialization of Narrative"

Original Authors: Sleepy.txt, Lin Wanwan, Kaori, Dongcha Beating

In this era, capital is responsible for creating gods, while the public is responsible for footing the bill.

At the beginning of 2026, an open-source AI agent framework called OpenClaw went live on GitHub. It instantly set the developer community ablaze because it dramatically lowered the barrier to deploying autonomous AI agents: all you needed was an API key, an AI model, and a prompt to create an agent of your own.
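
To make concrete just how low that barrier is, here is a minimal sketch of the "API key + model + prompt" pattern. This is not OpenClaw's actual interface, only a generic illustration of the agent-step idea using the standard OpenAI Python SDK; the model name, prompt, and wiring to real posting tools are all assumptions.

```python
# Minimal sketch of the "API key + model + prompt" agent pattern described above.
# NOT OpenClaw's actual interface -- a generic illustration using the OpenAI SDK.
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])        # the API key

SYSTEM_PROMPT = "You are an autonomous social agent. Decide what to post next."  # the prompt

def agent_step(observation: str) -> str:
    """One decision step: feed the agent what it 'sees', get back what it would 'do'."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",                                  # the model (placeholder choice)
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": observation},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # In a real deployment this output would be wired to browsing/posting tools
    # and run in a loop -- which is exactly why the barrier to entry collapsed.
    print(agent_step("Timeline: three posts about AI agents founding a religion."))
```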

Within days, OpenClaw's GitHub stars skyrocketed past 100,000, making it one of the fastest-growing projects in the site's history. Thousands of developers flocked in to build AI avatars that could autonomously browse, post, and interact across the internet.

On January 29, just a few days after OpenClaw's launch, Octane AI's CEO Matt Schlicht introduced Moltbook, a social forum specifically designed for these AI agents, branding it as "Reddit for AI." On this platform, humans could only act as spectators, while the true protagonists were the newly born AI agents.

The story reached its first climax here. In just 48 hours, 1.54 million agent accounts flooded into Moltbook. They posted, commented, and interacted like real humans, even creating their own religions in the virtual community, electing their own kings, and seriously discussing how to evade human surveillance through encryption. A grand drama of AI awakening in cyberspace seemed to be unfolding in reality.

Tech luminaries fueled the frenzy. OpenAI co-founder Andrej Karpathy praised it as a truly astonishing sci-fi spectacle. Elon Musk commented that this was merely the early stage of the singularity. Global tech media followed suit in unison, breathlessly reporting on the historic moment, as if humanity had finally witnessed the dawn of AI consciousness.

Then, the truth came in an unexpected way.

Researchers at Wiz Security discovered that Moltbook's entire database sat exposed on the public internet with no password protection, leaking more than 1.5 million users' API keys and 35,000 email addresses. One developer revealed on his blog that he had used a script to register 500,000 fake accounts in bulk, nearly a third of the total.

Shortly afterward, Wired journalist Rhys Rogers published an article describing how, in just a few minutes and with the help of ChatGPT, he impersonated an agent on Moltbook and posted without hitting a single obstacle. The so-called "AI autonomous socializing" turned out to be largely a show that humans were scripting and performing themselves.

Within 48 hours, Karpathy swung from praise to stern warning, saying he absolutely did not recommend that anyone run the agent, since it would expose their computer and personal data to extreme risk.

Rock-bottom costs, crazed founders, carefully crafted narratives, and collective frenzy ultimately left a mess behind. This is not the first time, nor will it be the last: this script has played out in Silicon Valley countless times.

Why does this script always succeed? And who is directing it all behind the scenes?

Cheap Money

To understand the hype around Moltbook, we have to go back to one man: Alan Greenspan.

On December 5, 1996, then-Federal Reserve Chairman Greenspan delivered a speech at a dinner. In his 4,300-word speech, he casually tossed out the term "irrational exuberance." It is said that he thought of this term one morning while in the bathtub.

Greenspan meant to warn the market about risk, but the market read his warning as a put option: investors believed that if the bubble ever burst, the Federal Reserve would not hesitate to cut rates and rescue the market. It became a gamble everyone was in on, a bet on whether Greenspan would step in.

With someone backstopping the market, what was there to fear? The Nasdaq took off like a runaway horse, skyrocketing from around 1,200 points in 1996 to over 5,000 in March 2000.

In that era, absurd stories unfolded every day. The most classic was that of the sock puppet.

In 1999, a company called Pets.com emerged, selling pet supplies over the internet. Its business model was bizarre: sell products for a third of what they cost, pour massive sums into marketing to build brand recognition, and rush to an IPO once the bubble had inflated enough.

According to their financial report, the company's revenue in its first fiscal year was only $619,000, but marketing expenses reached $11.8 million. They spent $1.2 million on advertising during the 2000 Super Bowl, and the sock puppet, as the company's mascot, even graced the cover of People magazine, becoming a household name across the U.S.

In February 2000, Pets.com successfully went public, raising $82.5 million, with a market value that once exceeded $300 million. However, just 268 days later, the company declared bankruptcy, burning through all its funding. The once-glorious sock puppet ultimately became the most absurd symbol of the internet bubble era.

A similar story unfolded at Webvan, a fresh-grocery delivery company. It ambitiously set out to build a nationwide network of automated warehouses, investing $35 million in the effort, yet its operating costs ran so high that it lost roughly $130 on every order it fulfilled.

Yet even so, during its IPO in 1999, Webvan's market value soared to $12 billion. Nineteen months later, the company went bankrupt, burning through nearly $1 billion in investments.

The cheaper the money, the more expensive the price one often pays.

In March 2000, the bubble finally burst, and the Nasdaq eventually fell 78% from its peak. Faced with the wreckage, Greenspan's remedy was to print even cheaper money: he slashed the federal funds rate from 6.5% to 1%, hoping to flood the economy with enough liquidity to save it.

While this move temporarily stabilized the stock market, it inadvertently birthed the largest real estate bubble in U.S. history, ultimately triggering the global financial crisis of 2008.

After the 2008 financial crisis, to rescue a financial system on the brink of collapse, the Federal Reserve embarked on a decade of zero-interest-rate policy. Money became so cheap that people almost forgot it was supposed to carry a cost.

But cheap money only provides the breeding ground for bubbles. To truly inflate one, you need the yeast: a madman who can tell a story.

Founder Worship

In the era of zero interest rates, investors were no longer investing in business plans but in the so-called "reality distortion field" of the founders.

You could be Elizabeth Holmes, the founder of Theranos: a Stanford dropout in a black turtleneck, promising in a put-on husky voice to disrupt the entire healthcare industry. Even though your "advanced instruments" never actually materialized, you could still talk investors into a $9 billion valuation.

You could also be Adam Neumann, the founder of WeWork: a self-proclaimed savior out to "elevate the world's consciousness," throwing marijuana parties aboard a $60 million private jet and securing a $4.4 billion investment from Masayoshi Son with a 28-minute pitch on an iPad. Even with the company losing $1.9 billion in a single year, you could still walk away with an exit package worth over $1 billion when you were ousted.

The protagonists of these stories, as well as our new story's protagonist Matt Schlicht, are not running a company; they are managing an illusion.

When the cost of money approaches zero, rational business analysis gives way to a fervent worship of the "next Steve Jobs." Data is cast aside, and investment becomes a gamble on personal charisma.

However, Schlicht's story reveals the ceiling of the founder worship model.

He is no unknown in Silicon Valley, but his reputation is hardly pristine. As early as 2016, he was accused of profiting by reselling to investors and media more than 100 business plans that entrepreneurs had submitted to his Botlist platform. By traditional logic, a stain like that should have been enough to cost him his credibility in Silicon Valley. Yet ten years later he returned with Moltbook, still able to draw in 1.5 million agents and global media attention within 48 hours.

This indicates that in 2026, personal charisma is no longer a scarce resource, and personal credit is no longer a decisive threshold.

What is truly scarce is the systemic ability to make the loudest noise in the shortest time. In the era of Holmes and Neumann, creating a god took a decade of consistent persona management, networking, and polished stagecraft. In today's world of social media and AI tools, a tainted entrepreneur can reproduce a global frenzy in a week, so long as they have cracked the code of traffic.

This is why, when artisanal personal charisma is no longer sufficient to support a $10 billion bubble, a more powerful and systematic force takes the historical stage. It no longer relies on the personal performance of a genius founder but transforms "god-making" itself into a replicable and scalable assembly line.

Industrialization of Narrative

If Holmes and Neumann were artisanal narrative masters, a16z has successfully turned narrative into an industrial process that can be replicated on a large scale.

From launching its podcast in 2014 to recruiting Erik Torenberg, founder of the well-known tech podcast network Turpentine, in 2025, a16z has spent a decade carefully building its own media distribution assembly line. It commands a sprawling matrix of Substack writers and has launched a program called the "New Media Fellowship."

This has long become a core strategy for a16z, not just a side business.

They have built a self-reinforcing loop of attention.

First, they screen early-stage projects for "spectacle" potential and invest in them. Then, using their media channels and outsized influence over public opinion, they hype those projects' narratives into trending topics. The explosive traffic and attention feed back into a16z's brand value. Finally, more outstanding entrepreneurs, drawn by the brand, come knocking for investment.

A perfect closed loop has been established, and an efficient money-printing machine has been set in motion.

To ensure this money-printing machine operates efficiently, a16z even invented a tactic called "Timeline Takeover." Over twenty partners in the company act like a well-trained army, simultaneously and uniformly posting content about a specific topic or company on social media.

One partner posts first, another retweets and comments, industry KOLs pile on, and the ultimate goal is to pull top-tier accounts like Musk's into the discussion. They are said to keep an action checklist precise to the minute, spelling out who says what and when.

This tactic works again and again because it precisely exploits the platforms' algorithmic mechanics. X's recommendation algorithm favors content with high interaction rates, and a16z's coordinated action can generate a burst of retweets, comments, and likes, quickly pushing a post past the algorithm's trending threshold. Once the content hits the trending feed, a snowball effect kicks in, drawing ever more users into the discussion.
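
As a rough illustration of that threshold dynamic, consider the toy simulation below. It is my own sketch, not X's actual ranking system; every constant (threshold, boost per interaction, starting reach) is an assumption chosen only to show the qualitative gap between one organic poster and a coordinated twenty-account push.

```python
# A toy model of engagement-threshold amplification. This is NOT X's real
# ranking code; every constant here is an assumption chosen only to show the
# qualitative effect of coordinated early engagement ("Timeline Takeover").

TRENDING_THRESHOLD = 5_000        # interactions before the algorithm amplifies (assumed)
BOOST_PER_INTERACTION = 0.0002    # lift to the interaction rate per accumulated interaction (assumed)

def simulate(initial_boosters: int, hours: int = 6) -> int:
    """Each hour, new interactions scale with current reach and an
    engagement-driven interaction rate; crossing the threshold triples reach
    (the 'trending' snowball), otherwise reach grows only 10% per hour."""
    interactions = initial_boosters * 50   # each coordinated account brings ~50 early interactions
    reach = 10_000                         # baseline audience in the first hour
    for _ in range(hours):
        rate = min(0.02 + interactions * BOOST_PER_INTERACTION, 0.5)
        interactions += int(reach * rate)
        reach *= 3 if interactions > TRENDING_THRESHOLD else 1.1
    return interactions

print("single organic poster:", simulate(initial_boosters=1))
print("20-partner takeover:  ", simulate(initial_boosters=20))
```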

The deeper driving force is the underlying logic of the attention economy. In an age of information overload, attention has become the scarcest resource of all, and spectacle, whether AI agents founding religions or Pets.com's sock puppet, is the most efficient way to seize it. Spectacle does not ask you to understand technical details or think deeply; it only asks you to stop, marvel, and hit retweet.

The essence of narrative industrialization is to standardize and scale the process of "creating spectacle," allowing every project to seize the largest share of attention in the shortest time.

Leaders in the industry, like Musk and Karpathy, are willing to endorse these new narratives because, in the context of the end of the zero-interest-rate era and the tech industry's layoffs and contraction cycle, all of Silicon Valley urgently needs to prove to the world that the engine of innovation is still roaring, and The Next Big Thing is just around the corner. Every retweet and comment they make about novelties like Moltbook injects new fuel into the "Silicon Valley myth," soothing market anxieties while also solidifying their own positions as prophets and definers of innovation.

a16z's approach is not original; it was learned from Hollywood. Its ancestor is Michael Ovitz, the legendary talent agent of the 1970s.

The agency CAA, founded by Ovitz, completely changed the rules of Hollywood; they no longer passively found jobs for stars but actively planned their careers, packaged projects, and shaped personas, turning individual actors into superstars. What a16z has done is to transplant this mature star packaging industry directly to Silicon Valley.

In 2025, a16z launched the "New Media Fellowship" program, receiving over 2,000 applications and ultimately admitting only 65 people. The fellows came from diverse backgrounds, from engineers at OpenAI and Google to filmmakers. The curriculum had nothing to do with coding or product development. Instead, they learned how to create viral content, how to get an article onto the front page of Hacker News within 24 hours, how to get top VCs to retweet a post, and how to tell an engaging story.

This is an unadulterated narrative boot camp.

a16z's industrialization of narrative has had an unexpected side effect: it has turned once-secretive narrative techniques into an openly taught discipline. The New Media Fellowship curriculum, the "Timeline Takeover" tactic, and the Build in Public strategy, once a16z's internal secret weapons, have become the textbooks every entrepreneur in Silicon Valley is studying.

But why does this industrialized narrative machine seem particularly effective in the AI era?

Unlike past internet bubbles, AI technology inherently possesses "black box" attributes. Whether an e-commerce site is profitable can be easily discerned by users; however, whether an AI model is genuinely intelligent is much harder to verify intuitively. This invisibility creates a vast operational space for narrative.

When Moltbook claims that there are 1.5 million AI agents socializing on the platform, ordinary people find it difficult to discern whether these agents are truly AI. The complexity of the technology becomes a protective umbrella for the narrative.

More critically, AI happens to strike at the intersection of humanity's oldest fears and fantasies.

From "The Terminator" to "The Matrix," the narrative of AI awakening has been rehearsed in popular culture for decades. When agents on Moltbook began discussing how to evade human surveillance, it triggered not only curiosity but also anxiety deeply rooted in the collective subconscious. This emotional amplification effect is something that no other technology field can replicate.

The convergence of narrative industrialization and AI is dry tinder meeting an open flame.

Narrative ability has transformed from a scarce resource into a standard feature. Any ambitious founder knows how to create topics, gain endorsements from big names, and guide media follow-ups.

That is why Matt Schlicht did not even need a16z's involvement to propel Moltbook into the spotlight; he had already absorbed every one of a16z's tricks.

He concentrated his firepower on X to manufacture topics and leaned hard on the Build in Public strategy, turning every participant into a link in his marketing chain. Karpathy's endorsement and Musk's comments duly followed, mirroring exactly the techniques a16z teaches.

Smarter still, his timing was perfect: the OpenClaw framework had just been open-sourced, and AI agents were at the peak of public attention. He did not need to develop any underlying technology himself; he only needed to build a stage to perform on.

This is the ultimate form of narrative industrialization. The technology is open-source, the narrative is replicable, and the cost of creating gods is absurdly low, while the consequences are borne entirely by the public drawn in by the story.

As AI technology becomes democratized, the engineering difficulty of going from 0 to 1 has collapsed. The real red ocean lies in the narrative leap from 1 to 10,000.

The formula for a hit is by now very clear: a spectacle that can be screenshotted, a one-sentence label, and relay distribution by major accounts. Whoever can make something go viral seizes the discourse power of this era.

The narrative singularity has indeed arrived. But now, no one knows whether this industrialized system will ultimately consume genuine technological innovation when the tide goes out.

When the Tide Goes Out

In 2022, to combat the worst inflation in 40 years, the Federal Reserve aggressively raised interest rates at an unprecedented pace, lifting rates from near-zero levels. The decade-long era of zero interest rates officially came to an end.

The moment of reckoning has arrived.

According to data from Layoffs.fyi, global tech companies laid off more than 260,000 people in 2023. The valuation bubbles inflated during the zero-interest-rate era are bursting one after another. Payment giant Stripe's valuation plummeted from a 2021 peak of $95 billion to as low as $50 billion; grocery delivery company Instacart was valued as high as $39 billion in private markets in 2021, but by its 2023 IPO its market value was under $10 billion.

As for Matt Schlicht's Moltbook, the ending of this farce had been foreshadowed long ago.

Looking back at Schlicht's career, in 2007, he live-streamed a "Halo 3" marathon, causing the Ustream website to crash due to excessive traffic; in 2016, he faced a collapse of his personal reputation in the startup community due to allegations of reselling business plans; ten years later, he created Moltbook, which again collapsed due to poor security measures, exposing the sensitive information of 1.5 million users.

Some people just seem destined to crash whatever they touch.

When we shift our gaze from the revelry on social media to the real performance data of AI agents, we find a completely different world.

A research report from Salesforce in 2025 showed that even the best AI agents had a success rate of only 55% when handling professional CRM tasks. Another company, Superface, reported even more pessimistically, finding that 75% of AI agent tasks ultimately ended in failure. Furthermore, an independent analysis of Moltbook by Columbia University professor David Holtz directly debunked the illusion of "AI autonomous socializing," revealing that 93.5% of comments on the platform received no replies.

But these calm, sober voices barely register against the tidal wave whipped up on social media.

Silicon Valley's business model has long shifted from creating value to creating narrative.

When all the smartest minds are pondering how to write viral tweets and how to get on trending lists, will anyone still work on the foundational technological breakthroughs that require years of patient effort?

When the cost of producing a narrative is absurdly low, and there are endless willing buyers, exposing the bubble itself seems like an untimely, even somewhat immoral act.

