The impact of DeepSeek on the upstream and downstream protocols of Web3 AI

Original Author: Kevin, Researcher at BlockBooster

TLDR

  • The emergence of DeepSeek shatters the computing-power moat, with open-source models setting the new direction for compute optimization;

  • DeepSeek benefits the model and application layers of the industry supply chain, while negatively impacting computing power protocols at the infrastructure layer;

  • DeepSeek's advantages inadvertently burst the last bubble in the Agent track, and DeFAI is the most likely place for new growth to emerge;

  • The zero-sum game of project financing is expected to come to an end, with community launches and a small amount of VC funding becoming the norm.

The impact triggered by DeepSeek will reverberate across the upstream and downstream of the AI industry this year. DeepSeek makes it possible for consumer-grade graphics cards to handle large-model training tasks that previously required high-end GPUs. The first moat around AI development, computing power, is beginning to collapse. With algorithm efficiency improving at a reported annual rate of 68% while hardware performance advances along the slower cadence of Moore's Law, the valuation models entrenched over the past three years no longer apply. The next chapter of AI will be opened by open-source models.
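As a rough, back-of-the-envelope illustration of that gap (taking the article's 68% annual algorithmic-efficiency figure at face value, and assuming a Moore's-Law-style doubling of hardware performance every two years), the compounding difference looks like this:

```python
# Back-of-the-envelope comparison of compounding rates, using the article's
# 68% annual algorithmic-efficiency figure and an ASSUMED Moore's-Law-style
# doubling of hardware performance every two years.
algo_rate = 1.68            # yearly multiplier from algorithmic improvements (article's figure)
hw_rate = 2 ** (1 / 2)      # ~1.41x per year if hardware doubles every two years (assumption)

for years in (1, 3, 5):
    algo_gain = algo_rate ** years
    hw_gain = hw_rate ** years
    print(f"{years} yr: algorithms x{algo_gain:.1f}, hardware x{hw_gain:.1f}, "
          f"combined x{algo_gain * hw_gain:.1f}")
```

Under these assumptions, algorithmic gains outpace hardware gains by several multiples over a five-year horizon, which is the intuition behind the claim that valuation models anchored to GPU scarcity are losing relevance.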

Although Web3's AI protocols are entirely different from Web2's, they inevitably feel the influence of DeepSeek. That influence will give rise to entirely new use cases across the upstream and downstream layers of Web3 AI: the infrastructure layer, middleware layer, model layer, and application layer.

Sorting Out the Collaborative Relationships of Upstream and Downstream Protocols

Through the analysis of technical architecture, functional positioning, and practical use cases, I have divided the entire ecosystem into: infrastructure layer, middleware layer, model layer, and application layer, and sorted out their dependencies:

[Figure: Impact of DeepSeek on Web3 AI Upstream and Downstream Protocols]

Infrastructure Layer

The infrastructure layer provides decentralized underlying resources (computing power, storage, L1), where computing power protocols include: Render, Akash, io.net, etc.; storage protocols include: Arweave, Filecoin, Storj, etc.; and L1 includes: NEAR, Olas, Fetch.ai, etc.

The computing power layer protocols support model training, inference, and framework operation; storage protocols preserve training data, model parameters, and on-chain interaction records; L1 optimizes data transmission efficiency and reduces latency through dedicated nodes.

Middleware Layer

The middleware layer serves as a bridge connecting the infrastructure and upper-layer applications, providing framework development tools, data services, and privacy protection. Data labeling protocols include: Grass, Masa, Vana, etc.; development framework protocols include: Eliza, ARC, Swarms, etc.; and privacy computing protocols include: Phala, etc.

The data service layer provides fuel for model training, while the development framework relies on the computing power and storage of the infrastructure layer, and the privacy computing layer protects data security during training/inference.

Model Layer

The model layer is used for model development, training, and distribution, with open-source model training platforms like Bittensor.

The model layer relies on the computing power of the infrastructure layer and the data from the middleware layer; models are deployed on-chain through development frameworks; the model market delivers training results to the application layer.

Application Layer

The application layer consists of AI products aimed at end users, where Agents include: GOAT, AIXBT, etc.; DeFAI protocols include: Griffain, Buzz, etc.

The application layer calls pre-trained models from the model layer; relies on privacy computing from the middleware layer; and complex applications require real-time computing power from the infrastructure layer.
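To make the dependency relationships above concrete, here is a minimal sketch in Python. The structure and function names are purely illustrative (not any protocol's actual API); the protocol examples are simply the ones listed in the preceding sections.

```python
# Minimal sketch of the layer dependency graph described above.
# Names and structure are illustrative only, not an actual protocol API.
LAYERS = {
    "infrastructure": {"compute": ["Render", "Akash", "io.net"],
                       "storage": ["Arweave", "Filecoin", "Storj"],
                       "l1": ["NEAR", "Olas", "Fetch.ai"]},
    "middleware": {"data": ["Grass", "Masa", "Vana"],
                   "frameworks": ["Eliza", "ARC", "Swarms"],
                   "privacy": ["Phala"]},
    "model": {"training_platforms": ["Bittensor"]},
    "application": {"agents": ["GOAT", "AIXBT"],
                    "defai": ["Griffain", "Buzz"]},
}

# Each layer consumes services from the layers listed here.
DEPENDS_ON = {
    "middleware": ["infrastructure"],
    "model": ["infrastructure", "middleware"],
    "application": ["model", "middleware", "infrastructure"],
}

def stack_for(layer: str) -> list[str]:
    """Return every layer that `layer` ultimately relies on."""
    seen, stack = [], list(DEPENDS_ON.get(layer, []))
    while stack:
        dep = stack.pop()
        if dep not in seen:
            seen.append(dep)
            stack.extend(DEPENDS_ON.get(dep, []))
    return seen

print(stack_for("application"))  # ['infrastructure', 'middleware', 'model']
```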

DeepSeek May Have a Negative Impact on Decentralized Computing Power

According to a sample survey, about 70% of Web3 AI projects actually call OpenAI or centralized cloud platforms, with only 15% of projects using decentralized GPUs (such as Bittensor subnet models), and the remaining 15% employing a hybrid architecture (sensitive data processed locally, general tasks on the cloud).

The actual usage rate of decentralized computing power protocols is far below expectations and does not match their market valuations. There are three reasons for the low usage rate: Web2 developers carry over their original toolchains when migrating to Web3; decentralized GPU platforms have yet to achieve a price advantage; and some projects evade data-compliance checks under the banner of "decentralization" while still relying on centralized clouds for the actual computing.

AWS and GCP together hold over 90% of the AI computing power market, while Akash's comparable capacity is only about 0.2% of AWS's. The moat of centralized cloud platforms rests on cluster management, RDMA high-speed networking, and elastic scaling. Decentralized cloud platforms offer Web3 adaptations of these technologies, but two flaws remain unaddressed: latency, since communication between distributed nodes is roughly six times slower than within centralized clouds; and toolchain fragmentation, since PyTorch and TensorFlow do not natively support decentralized scheduling.
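To illustrate the toolchain point, below is a minimal sketch of how standard PyTorch data-parallel training is initialized. The rendezvous assumes every worker can reach a known master address over a low-latency network, an assumption a permissionless GPU marketplace cannot easily satisfy without extra middleware; the address and values here are hypothetical.

```python
# Minimal sketch: standard PyTorch data-parallel setup (not a decentralized scheduler).
# It assumes a fixed set of co-located workers reachable at a known master address,
# which is exactly the assumption a decentralized GPU marketplace struggles to meet.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

os.environ.setdefault("MASTER_ADDR", "10.0.0.1")  # hypothetical cluster-internal address
os.environ.setdefault("MASTER_PORT", "29500")

def init_worker(rank: int, world_size: int) -> None:
    # NCCL rendezvous: every rank must reach MASTER_ADDR with low latency,
    # or collective operations (all-reduce, broadcast) stall the whole job.
    dist.init_process_group(backend="nccl", rank=rank, world_size=world_size)
    model = torch.nn.Linear(1024, 1024).cuda(rank)
    ddp_model = DDP(model, device_ids=[rank])
    # ... training loop: each gradient all-reduce waits for the slowest node.
    dist.destroy_process_group()
```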

[Figure: Impact of DeepSeek on Web3 AI Upstream and Downstream Protocols]

DeepSeek reduces computing power consumption by roughly 50% through sparse training, and dynamic model pruning enables consumer-grade GPUs to train models with billions of parameters. Short-term market expectations for high-end GPU demand have been lowered significantly, and the market potential of edge computing has been re-evaluated. As shown in the figure, before DeepSeek emerged the vast majority of protocols and applications in the industry ran on platforms like AWS, with only a handful of use cases deployed on decentralized GPU networks; those use cases valued the latter's price advantage in consumer-grade computing power and were not sensitive to latency.
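For illustration only, the snippet below sketches what magnitude-based pruning looks like with PyTorch's built-in pruning utilities. This is a generic sparsification technique, not DeepSeek's actual training method; the 50% sparsity figure is simply taken from the claim above.

```python
# Generic magnitude-pruning sketch with PyTorch's built-in utilities.
# Illustrates training with a large fraction of weights zeroed out;
# this is NOT DeepSeek's actual method, just a standard example.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU(), nn.Linear(4096, 4096))

# Zero out the 50% of weights with the smallest magnitude in each linear layer.
for module in model:
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)

# The pruning mask is re-applied on every forward pass, so subsequent training
# effectively operates on the surviving (unmasked) weights only.
masks = [buf for name, buf in model.named_buffers() if name.endswith("weight_mask")]
total = sum(m.numel() for m in masks)
active = sum(int(m.sum()) for m in masks)
print(f"active weights: {active}/{total} ({active / total:.0%})")
```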

The reliance on centralized platforms may deepen with the arrival of DeepSeek. DeepSeek removes the constraints on long-tail developers, and low-cost, efficient inference models will spread at unprecedented speed. Many centralized cloud platforms, and several countries, have already begun deploying DeepSeek, and the sharp drop in inference costs will spawn a large number of front-end applications with huge demand for consumer-grade GPUs. Facing this impending market, centralized cloud platforms will enter a new round of competition for users, fighting not only the leading platforms but also countless small centralized clouds. The most direct weapon is price cuts: it is reasonable to expect rental prices for cards like the 4090 on centralized platforms to fall, which would be a disaster for Web3's computing power platforms. When price is no longer their only moat and the industry's computing power platforms are forced to cut prices in turn, io.net, Render, Akash, and the rest will be unable to withstand it. A price war would erode the last support under their valuations, and the downward spiral of falling revenue and user churn may force decentralized computing power protocols to pivot in a new direction.

The Specific Significance of DeepSeek to Industry Upstream and Downstream Protocols

[Figure: Impact of DeepSeek on Web3 AI Upstream and Downstream Protocols]

As shown in the figure, I believe DeepSeek will have different impacts on the infrastructure layer, model layer, and application layer. From a positive perspective:

The application layer will benefit from the sharp drop in inference costs, which lets Agent applications stay online for extended periods and complete tasks in real time at low cost;

At the same time, DeepSeek-level model costs allow DeFAI protocols to form more complex swarms, with thousands of Agents devoted to a single use case, each with a narrowly defined role (a minimal sketch of this kind of role decomposition appears after this list), greatly improving user experience and preventing the model from misinterpreting and mis-executing user inputs;

Developers in the application layer can fine-tune models on price feeds, on-chain data and analytics, and governance data for DeFi-related AI applications, without paying high licensing fees.

The significance of the open-source model layer has been proven with the advent of DeepSeek, as high-end models are opened to long-tail developers, stimulating a widespread development boom;

The computing power high walls built around high-end GPUs over the past three years have been completely shattered, giving developers more choices and establishing a direction for open-source models. In the future, the competition among AI models will no longer be about computing power but about algorithms, and this shift in belief will become the cornerstone of confidence for open-source model developers;

DeepSeek-specific subnets will keep emerging, model parameter counts will rise under the same computing power, and more developers will join the open-source community.
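Below is a minimal sketch of the role decomposition mentioned in the list above, with entirely hypothetical agent roles, routing rules, and handlers. It only illustrates how a compound user instruction might be split into narrow, single-purpose Agent tasks; it does not describe how any existing DeFAI protocol actually works.

```python
# Hypothetical sketch of decomposing one user instruction into narrow Agent roles.
# Roles, routing rules, and handlers are illustrative only.
from dataclasses import dataclass
from typing import Callable

@dataclass
class AgentTask:
    role: str          # narrowly scoped responsibility of one Agent
    instruction: str   # the sub-task it should execute

def plan(user_input: str) -> list[AgentTask]:
    """Naive planner: split a compound DeFi instruction into single-role tasks."""
    tasks = []
    if "price" in user_input:
        tasks.append(AgentTask("price_oracle", "fetch current pool prices"))
    if "swap" in user_input:
        tasks.append(AgentTask("swap_executor", "simulate and submit the swap"))
    if "risk" in user_input:
        tasks.append(AgentTask("risk_monitor", "check slippage and exposure limits"))
    return tasks

HANDLERS: dict[str, Callable[[str], str]] = {
    "price_oracle": lambda s: f"[price_oracle] done: {s}",
    "swap_executor": lambda s: f"[swap_executor] done: {s}",
    "risk_monitor": lambda s: f"[risk_monitor] done: {s}",
}

if __name__ == "__main__":
    for task in plan("check price and risk, then swap ETH for USDC"):
        print(HANDLERS[task.role](task.instruction))
```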

From a negative perspective:

The latency inherent in infrastructure-layer computing power protocols cannot simply be optimized away;

Moreover, a hybrid network mixing A100s and 4090s places higher demands on coordination algorithms, an area where decentralized platforms hold no advantage.

DeepSeek Bursts the Last Bubble in the Agent Track, DeFAI May Give Birth to New Life, and Industry Financing Methods Will Change

Agents are the industry's last hope for AI. The emergence of DeepSeek lifts the constraints of computing power and paints an expectation of an application explosion. This should have been a significant boon for the Agent track, but because of its strong correlation with the broader industry, the US stock market, and Federal Reserve policy, the last remaining bubble has burst and the track's market value has plummeted.

In the wave of integration between AI and the industry, technological breakthroughs and market games have always gone hand in hand. The chain reaction triggered by swings in Nvidia's market value is a mirror reflecting the deeper dilemmas of the industry's AI narrative: from on-chain Agents to DeFAI engines, beneath the seemingly complete ecological map lies the harsh reality of weak technological infrastructure, hollowed-out value logic, and capital dominance. The superficially prosperous on-chain ecosystem hides deep ailments: large numbers of high-FDV tokens compete for limited liquidity, aging assets survive on FOMO sentiment, and developers are trapped in PVP infighting that drains innovative momentum. When incremental funds and user growth hit a ceiling, the whole industry falls into the innovator's dilemma: eager for breakthrough narratives yet struggling to break free of path dependence. This tension creates a historic opportunity for AI Agents: not merely an upgrade of the technical toolbox, but a reconstruction of the value-creation paradigm.

Over the past year, more and more teams in the industry have found that the traditional financing model is failing: the old routine of handing VCs a small allocation, keeping tight control, and waiting for them to pump the market is no longer sustainable. With VCs tightening their purses, retail investors refusing to buy the top, and major exchanges setting high listing thresholds, a new playbook better suited to bear markets is emerging under this triple pressure: partnering with leading KOLs and a small number of VCs, launching with a large proportion of tokens in the community's hands, and cold-starting at a low market capitalization.

Innovators represented by Soon and Pump Fun are opening new paths through "community launches": partnering with leading KOLs for endorsement, distributing 40%-60% of tokens directly to the community, and launching at valuations as low as a $10 million FDV while still raising millions of dollars. This model builds consensus-driven FOMO through KOL influence, letting teams lock in profits early while trading high circulating liquidity for market depth. It sacrifices the short-term advantages of control, but allows tokens to be repurchased at low prices during bear markets through compliant market-making mechanisms. In essence, it is a paradigm shift in the power structure: from a VC-led game of hot potato (institutions take positions, the token lists and is sold, retail buys the top) to transparent pricing by community consensus, forming a new symbiotic relationship between project teams and their communities around the liquidity premium. As the industry enters a cycle of transparency, projects that cling to the old logic of control may become relics swept away by this migration of power.

The short-term pain in the market in fact confirms that the long-term technological wave is irreversible. When AI Agents cut on-chain interaction costs by two orders of magnitude and adaptive models keep optimizing the capital efficiency of DeFi protocols, the industry may finally see the long-awaited Massive Adoption. This transformation does not rely on conceptual hype or capital acceleration but is rooted in the technological penetration of real demand; just as the electricity revolution did not stall because light bulb companies went bankrupt, Agents will become the true golden track once the bubble bursts. DeFAI may be the fertile ground for that new growth: as low-cost inference becomes routine, we may soon see hundreds of Agents combined into a single swarm use case. Under equivalent computing power, the large increase in model parameters means Agents in the open-source era can be fine-tuned more thoroughly, so even complex user instructions can be decomposed into task pipelines that each individual Agent can execute in full. Each Agent optimizing its on-chain operations may lift overall activity and liquidity in DeFi protocols, and more sophisticated DeFi products led by DeFAI will emerge; this is precisely where new opportunities arise after the last bubble bursts.

About BlockBooster

BlockBooster is an Asian Web3 venture studio supported by OKX Ventures and other top institutions, dedicated to being a trusted partner for outstanding entrepreneurs. We connect Web3 projects with the real world through strategic investments and deep incubation, helping quality entrepreneurial projects grow.

Disclaimer: This article/blog is for reference only, representing the author's personal views and does not represent the position of BlockBooster. This article does not intend to provide: (i) investment advice or recommendations; (ii) offers or solicitations to buy, sell, or hold digital assets; or (iii) financial, accounting, legal, or tax advice. Holding digital assets, including stablecoins and NFTs, carries high risks, with significant price volatility, and they may even become worthless. You should carefully consider whether trading or holding digital assets is suitable for you based on your financial situation. For specific questions, please consult your legal, tax, or investment advisor. The information provided in this article (including market data and statistics, if any) is for general reference only. Reasonable care has been taken in compiling this data and charts, but we accept no responsibility for any factual errors or omissions expressed therein.

