Author: RWA Research Institute
On April 16, 2026, the National Data Bureau publicly solicited opinions on the "Implementation Plan for Promoting the Construction of High-Quality Industry Data Sets (Draft for Comments)." According to reports from the People’s Daily, a noteworthy concept appeared for the first time in the plan—"Token Trading," which explicitly proposes "to explore new data set trading models such as token trading, and to construct a data set value system based on quantifiable and priceable tokens."

In Guizhou's Gui'an New District, a platform named TopenRouter.com is processing millions of tokens per minute. According to publicly disclosed information from the platform's operator, Guizhou Data Treasure Network Technology Co., Ltd., its daily average token usage has surged from just under 3 billion to nearly 12 billion, with cumulative orders numbering in the hundreds of thousands. Guizhou Data Treasure is a state-owned enterprise, and its platform practice is seen as an early example of token trading moving from concept to reality.
While the top-level design has just begun to explore the economic outline of "tokens," this invisible and intangible element is already seeking its circulation path at the grassroots level of the market.
What makes this topic impossible to ignore is a number that is simple to the point of being blunt.
According to Ma Shengyong, deputy director of the National Bureau of Statistics, at a news conference held by the State Council Information Office on April 17, 2026, as of March 2026, China's daily average token usage had exceeded 140 trillion, a more than 40% increase compared to the end of 2025. Looking back two years to early 2024, daily token usage in China was only 100 billion.
From then to now, that is a 1,400-fold increase. In the history of the digital economy, it would be hard to find another fundamental element that has achieved such exponential expansion in so short a period.
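These headline figures are easy to sanity-check. A minimal sketch, using only the numbers cited in the text above (variable names are illustrative):

```python
# Back-of-the-envelope check of the growth figures cited in the article.
# All input numbers come from the text; nothing here is measured data.

early_2024_daily = 100e9    # ~100 billion tokens/day, early 2024
end_2025_daily = 100e12     # ~100 trillion tokens/day, end of 2025
march_2026_daily = 140e12   # >140 trillion tokens/day, March 2026

# Overall growth since early 2024
overall_multiple = march_2026_daily / early_2024_daily
print(f"Growth since early 2024: {overall_multiple:.0f}x")  # 1400x

# Growth in the single quarter from end-2025 to March 2026
quarter_growth = march_2026_daily / end_2025_daily - 1
print(f"Growth since end of 2025: {quarter_growth:.0%}")    # 40%
```

Both of the article's claims (a roughly 1,400-fold rise over two years and a 40% rise in one quarter) fall out of the same three data points.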
Is this merely a fleeting market upsurge or an early signal of some deeper infrastructural transformation?
1. From "Information Unit" to "Settlement Unit"
When we type a sentence to an AI, ask it to translate a passage, or let it manage tasks in the background like a tireless execution unit, every unit of information behind these everyday actions is fragmented, encoded, computed, and reassembled, and along the way it is being measured, priced, and circulated.
This is the token.
According to the national standard "Basic Terminology for Data" (Draft for Comments) overseen by the National Technical Committee for Data Standardization, a token is a basic symbol that represents certain semantics in the information storage, processing, and exchange of intelligent devices in the AI field, serving as the minimum unit for information processing and exchange in large models. To put it informally, it resembles a fundamental building block in the AI world—unseen by the human eye but supporting the operation of the entire system.
On March 23, 2026, Liu Liehong, director of the National Data Bureau, provided a further qualitative description of tokens at the China Development Forum annual meeting. He pointed out that tokens are not only the minimum unit for large models processing information but also possess measurable, priceable, and tradable characteristics in the intelligent era, with a new value system rapidly evolving around token invocation, distribution, and settlement. Liu further positioned the token as a "value anchor point of the intelligent era" and a "settlement unit" that connects technological supply with commercial demand.
From "information unit" to "settlement unit," this semantic leap reflects a logical adjustment occurring at the grassroots level of China's AI industry. The focus has shifted from whose model is smarter to whose intelligent service can be accurately measured, reasonably priced, and compliantly circulated. This is a cognitive reconstruction that is currently taking place.
When we discuss token trading, a question that must be answered is: what exactly is being circulated?
On the surface, it appears to be computing power: the stronger the computing power, the more tokens produced per unit of time. But that answer is incomplete. From the perspective of the industrial chain, a token service is a composite pricing carrier for a bundle of capabilities. Behind it lie the stability of the power supply, the throughput efficiency of computing clusters, the degree of model-architecture optimization, and the quality and compliance of the underlying data. The western region holds a natural advantage in power and land costs, while compute-scheduling technology and model-optimization capability differentiate the various service providers. More critical still is the data dimension: whether high-quality vertical-industry corpora can be effectively infused into models bears directly on improving large-model performance and reducing inference costs.
Therefore, when an enterprise invokes one million tokens, what it gains is not just computing power output but also robust support from high-quality corpus, algorithm optimization, security compliance, and intelligent execution results aligned with business objectives. Tokens function more like a pricing voucher for a bundle of services rather than a commodity defined by a single dimension.
2. Structural Inflation and Competition for Efficiency
The underlying driving force for the demand for token trading comes from a rapidly expanding torrent of demand.
According to data disclosed by the National Data Bureau at its routine press conference in April 2026, as of March this year China's daily average token usage had surpassed 140 trillion, roughly 1,400 times the 100 billion of early 2024 and over 40% above the 100 trillion recorded at the end of 2025, just three months earlier. According to Morgan Stanley's research report "China AI Token Economy Outlook," released in March 2026, China's token consumption is projected to grow at a compound annual rate of 330% from 2025 to 2030, a 400-fold increase over five years. Additionally, statistics from OpenRouter, a global AI large-model aggregation platform, indicate that from March 16 to March 22, 2026, token usage by AI large models in China exceeded that of the United States for the third consecutive week.

A 1,400-fold explosion does not come from nowhere. As AI evolves from a "dialogue tool" into an "intelligent agent" that executes tasks, token consumption has undergone a qualitative leap: to complete a single task, an agent must continuously plan, invoke, verify, and correct in the background, consuming tokens at every step. Actual consumption therefore far exceeds earlier expectations, sometimes by orders of magnitude. This is not linear growth but structural inflation.
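Why an agent inflates consumption non-linearly can be made concrete with a toy model. All numbers below are hypothetical, chosen only to illustrate the "structural inflation" described above:

```python
# Toy model: token consumption of single-turn chat vs. an agent loop.
# Every figure here is hypothetical, for illustration only.

def chat_tokens(prompt=500, reply=800):
    """One question, one answer."""
    return prompt + reply

def agent_tokens(steps=20, context=3000, plan=400, tool_call=600, verify=300):
    """An agent re-reads its growing transcript at every plan/invoke/verify step."""
    per_step = plan + tool_call + verify
    total = 0
    for step in range(steps):
        # the context grows as earlier steps are appended to the transcript
        total += context + step * per_step
        total += per_step
    return total

print(chat_tokens())   # 1300 tokens for a single exchange
print(agent_tokens())  # 333000 tokens for one 20-step task
```

With these made-up defaults, one agent task consumes over 250 times what a single chat exchange does, because each step re-reads an ever-longer transcript: consumption grows roughly quadratically in the number of steps, not linearly.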
However, this torrent of growth has also exposed structural shortcomings on the supply side. AI computing infrastructure still faces bottlenecks in token production efficiency: simply stacking open-source models onto inference frameworks, without deep co-optimization from the physical hardware layer up through system scheduling, leaves computing resources degrading at one bottleneck after another. A growing industry consensus holds that the core of AI competition is shifting from MaaS (Model as a Service) to TaaS (Token as a Service), from "comparing the size of computing clusters" to "comparing token production efficiency per watt."
Token production efficiency per watt—these words may be the key entry point to understanding the underlying logic of the token economy.
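The metric itself is plain arithmetic. A minimal sketch, with hypothetical cluster figures (no real hardware is being described):

```python
# "Token production efficiency per watt" as a simple ratio.
# The throughput and power figures are hypothetical illustrations.

def tokens_per_watt(tokens_per_second: float, power_watts: float) -> float:
    """Sustained token throughput divided by sustained power draw."""
    return tokens_per_second / power_watts

# Two hypothetical clusters delivering the same throughput:
naive = tokens_per_watt(tokens_per_second=50_000, power_watts=40_000)
optimized = tokens_per_watt(tokens_per_second=50_000, power_watts=25_000)

print(f"naive:     {naive:.2f} tokens/s per watt")      # 1.25
print(f"optimized: {optimized:.2f} tokens/s per watt")  # 2.00
```

The point of the metric is that it rewards co-optimization: the "optimized" cluster produces the same output on less power, so it earns more tokens per unit of energy and, ultimately, per yuan of electricity.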
The torrent of 140 trillion on the demand side and the efficiency bottlenecks on the supply side are forming pressure at the same point in time, and the result is a change in price signals.
According to public market information, since mid-March 2026 leading companies such as Alibaba Cloud, Tencent Cloud, and Baidu Smart Cloud have raised their AI service prices by 5% to 34%. The large-model provider Zhizhun AI adjusted its API calling prices three times in two months. The core driver of the increases is the rapid rise in inference token usage: as AI applications shift from "experimental tools" to "productive tools," and especially with the explosion of AI agents, the power, bandwidth, and hardware-depreciation costs of massive concurrent calls have outgrown vendors' earlier capacity to subsidize them. Guojin Securities, in a computing power industry research report published in April 2026, projected that under this squeeze from both supply and demand, the boom in the computing power chain will radiate from core chips to data centers, cloud computing, and computing power services, as well as to supporting power equipment and servers.
Every increase in token usage essentially provides demand-side support for the domestic computing power chain.
3. Guizhou and Guangzhou: The Intersection of Two Exploration Paths
If the National Data Bureau's draft for public comments provides an exploratory framework for token trading, then the two practical cases in Guizhou and Guangzhou present early paths from concept to implementation.
The TopenRouter.com platform under Guizhou Data Treasure is one of the earliest cases exploring token trading in China. According to public information from the platform, it leverages Guizhou's advantages as a national computing power hub in terms of electricity and network to encapsulate heterogeneous computing resources into tokens, achieving a peak output capacity of 5 million tokens per minute. The core logic lies in changing the traditional extensive model of computing power leasing—previously, leasing was more akin to a fixed daily or monthly billing model, with costs incurred even during idle time. In contrast, token services are closer to a pay-as-you-go model, where users pay for actual consumption. This adjustment in pricing reflects a cognitive shift from "infrastructure usage rights" to "intelligent service output quantities."
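The contrast between the two billing models described above can be sketched as follows. The rates and usage figures are hypothetical, not actual platform prices:

```python
# Fixed-term leasing vs. pay-as-you-go token billing.
# All prices and usage figures are hypothetical illustrations of the
# contrast described in the text, not real platform rates.

def fixed_lease_cost(days: int, daily_rate: float) -> float:
    """Billed per day, whether or not the capacity is used."""
    return days * daily_rate

def pay_as_you_go_cost(tokens_used: int, price_per_million: float) -> float:
    """Billed only for tokens actually consumed."""
    return tokens_used / 1_000_000 * price_per_million

# A bursty workload that is active on only 10 of 30 days:
lease = fixed_lease_cost(days=30, daily_rate=1000.0)  # idle days still cost
usage = pay_as_you_go_cost(tokens_used=200_000_000,
                           price_per_million=2.0)     # pay for consumption only
print(lease, usage)  # 30000.0 400.0
```

The made-up numbers exaggerate the gap, but the structural point holds at any scale: under leasing, cost tracks reserved time; under token billing, cost tracks actual intelligent output, which is what makes budgets measurable against consumption.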

On April 3, 2026, two significant events occurred in Guangzhou. Four computing power projects were officially launched with a total investment of 4.839 billion yuan, adding more than 40,000 petaFLOPS (P) of intelligent computing capacity. On the same day, the Guangzhou Municipal Bureau of Government Services and Data Management, in collaboration with Guangzhou Digital Science Group, released the country's first comprehensive city-level computing power operation and service platform based on token-level scheduling: Guangzhou's Integrated Computing Power Network Monitoring and Scheduling Platform. According to the bureau, the platform uses tokens as a unified measurement standard to support flexible billing, including per-unit and periodic billing; pools and manages heterogeneous computing power under unified administration; and supports standardized orchestration and token-level scheduling. Its launch marks the entry of Guangzhou's computing resource management into a new stage of intelligent, centralized operation, aiming to provide one-stop computing power services for enterprises and to ease such practical problems as the difficulty and high cost of obtaining computing power.
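One way such a platform might use tokens as a unified measure across heterogeneous pools can be sketched as follows. The pool names and conversion rates are entirely hypothetical; the platform's actual scheduling rules are not public:

```python
# Hedged sketch: tokens as a common metering unit across heterogeneous
# compute pools. Pool names and throughput figures are hypothetical.

# Assumed sustained token throughput per device-hour for each pool
TOKENS_PER_DEVICE_HOUR = {
    "gpu_pool_a": 9_000_000,
    "gpu_pool_b": 6_500_000,
    "npu_pool": 4_000_000,
}

def token_capacity(pool: str, devices: int, hours: float) -> float:
    """Express a heterogeneous resource grant as a single token budget."""
    return TOKENS_PER_DEVICE_HOUR[pool] * devices * hours

# A mixed allocation, expressed in one unit regardless of hardware type:
budget = (token_capacity("gpu_pool_a", devices=10, hours=2)
          + token_capacity("npu_pool", devices=20, hours=2))
print(f"{budget:,.0f} tokens")  # 340,000,000 tokens
```

The design point is that once every pool is calibrated to a token rate, allocations and bills across otherwise incomparable hardware collapse into one additive number, which is what makes "token-level scheduling" administrable.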
Both the Guangzhou plan and Guizhou practice point in one direction. Tokens are gradually becoming a scheduling unit for government management of computing resources. With tokens serving as the minimum pricing unit, costs are linked to actual AI consumption, making budgets measurable and expenditures transparent. From national ministries to local governments, from state-owned platforms to market enterprises, a new governance system centered on tokens is taking shape.
It should be noted that token trading is still in the policy exploration stage, with its trading rules, rights confirmation mechanisms, and pricing systems yet to be fully established, and relevant regulations still need further clarification. Core issues such as the quality grading standards for tokens, cross-platform settlement agreements, and the definition of data ownership are still under ongoing discussion in the industry.
4. Establishing a Pricing Unit: A Prologue to Industry Maturity
In the industrial era, the unit of measurement for electricity was the kilowatt-hour. In the information era, the unit of measurement for data was the gigabyte. In the intelligent era, will the unit of measurement for intelligence be the token? That is becoming a proposition worth watching.

When something has a standardized unit of measurement, it gains the basic conditions for entry into the economic system. It can be accounted for, audited, and integrated into a more formal economic narrative framework. The China Center for Information Industry Development has posited in its publicly released industry research report that the appearance of the term "token" signifies that artificial intelligence is beginning to be included in a more mature economic discourse.
Of course, the evolution of the token economy is not without its challenges. The Shanghai Securities Journal pointed out in its April 2026 analysis report that the industry is facing a structural contradiction of "surging usage but lagging revenue growth," with some companies caught in a situation where their data looks impressive but monetization is weak. The valuation of tokens is shifting from simple "usage-based pricing" to refined "quality-based pricing." Building a commercial closed loop requires layering value capabilities on the foundation of precise measurement. The Securities Daily also noted that currently over 95% of token consumption comes from free subsidized users, with paid calls that genuinely hold commercial value remaining relatively low. The rapid growth in token usage poses higher demands for industry norms, data governance, and security guarantees. To continuously unleash the value of tokens, it is necessary not only to ensure stability in quantity but also to continuously improve token quality, facilitate data circulation, establish sound pricing mechanisms, and balance innovation with security.
However, the direction is gradually becoming clearer. From the National Data Bureau's draft for public comments to Guizhou Data Treasure's trading platform exploration, from Guangzhou's city-level computing power scheduling center to China's changing position in the global token utilization market, an industrial chain linking top-level design with grassroots practice and domestic markets with global competition is accelerating its construction centered around tokens.
The establishment of a pricing unit is often the first step for an industry to mature from immaturity.
Look back just two years. In early 2024, China's daily average token usage was only 100 billion. At that time, "token" was merely a technical term used by AI engineers; it appeared in no National Data Bureau policy document and was no focal point of industrial discussion. Today, a daily average of 140 trillion is quietly announcing a change.
AI is transitioning from a laboratory exhibit to a foundation for societal operations.
The era where every dialogue is being priced has already arrived.
Disclaimer: This article represents only the personal views of the author and does not represent the position or views of this platform. It is provided for information sharing only and does not constitute investment advice to anyone. Any dispute between users and the author is unrelated to this platform. If any article or image on this page involves infringement, please send the relevant proof of rights and proof of identity by email to support@aicoin.com, and the platform's staff will verify the matter.
