Over three years, the United States has implemented four rounds of export controls, covering 24 categories of semiconductor equipment and more than 140 entities, in an attempt to cut off China's access to advanced AI chips. Yet according to a report released on March 24 by the U.S.-China Economic and Security Review Commission (USCC), 80% of American AI startups are using Chinese open-source models.
The wall is built at the hardware layer. The door opens at the software layer.
This contradiction is not an abstract policy debate. Just last week, the AI programming tool Cursor, valued at $29.3 billion, was found to have built its flagship Composer 2 feature on Kimi K2.5 from Moonshot AI. A model from a Chinese company is driving one of the leading AI development tools in the United States.
Meanwhile, the Pentagon has labeled Anthropic, an American company, as a "supply chain risk."
The direction of regulation and the direction of actual reliance are completely opposite.
Since BIS imposed the first round of export restrictions on A100/H100 chips in October 2022, U.S. chip controls have continued to escalate. In 2023, the H800 loophole was closed and performance-density control metrics were expanded. In December 2024, another round of new rules followed, introducing restrictions on 24 categories of semiconductor equipment and blacklisting 140 Chinese entities; even high-bandwidth memory (HBM) and DRAM were swept in. In January 2025, the Department of Commerce went further with an "AI Diffusion Framework" that attempted to establish a global regulatory system at the model level, but the framework was withdrawn two days before it was to take effect. By December 2025, Trump reversed course again, allowing H200 chips to be exported to approved customers in China.

In the latter half of this regulation timeline, the pace of releases for Chinese open-source models has been accelerating. In 2024, DeepSeek-V2 and Qwen 2.5 series were successively open-sourced. On January 20, 2025, DeepSeek-R1 and Kimi K1.5 were released on the same day, with the former once topping the U.S. App Store download charts, surpassing ChatGPT. In the second half of 2025, Kimi-K2 and GLM-4.5 followed suit. By early 2026, ByteDance's Doubao 2.0 had 155 million weekly active users, and Kimi K2.5 was directly adopted by Cursor. The tighter the regulation, the more models emerge.
According to official HuggingFace data, the share of Chinese open-source models in global downloads soared from about 1.2% at the end of 2024 to roughly 30% by early 2026. Cumulative downloads of Alibaba's Qwen series surpassed 700 million in January 2026, officially overtaking Meta's Llama. Chip controls have not stopped China's AI software output; if anything, they have accelerated the strategic shift toward open source.
This is not a coincidence. The USCC report used an apt framework to describe the phenomenon: "dual circulation." In the hardware circulation, China is constrained by a chip supply bottleneck. In the software circulation, China is reverse-infiltrating global AI infrastructure through open-source models, creating downstream dependencies. The two circulations push in opposite directions but reinforce each other. Regulations limit our access to top-tier computing power, but they also force the development of a technical path that does more with less compute. DeepSeek-R1, which achieved cutting-edge performance at a fraction of GPT-4o's inference cost, is a product of this path.

The changes on HuggingFace are plain to see. According to platform statistics, at the end of 2024 models derived from Llama accounted for about 60% of newly added language models, while Qwen accounted for just over 10%. By mid-2025 the lines had crossed: according to the official HuggingFace blog, the share of Qwen-derived models surged past 40%, while Llama's dropped to around 15%. By early 2026, Qwen-derived models were approaching half, while Llama continued shrinking to about 12%.
The speed of this crossover has exceeded most expectations. Two years ago, open-source AI was almost synonymous with Meta's Llama ecosystem: developers worldwide fine-tuned, deployed, and built products on Llama. Now the same story is replaying in the Qwen ecosystem, only faster and with broader reach.
This means that when building AI applications, global developers are increasingly choosing Chinese models as the foundation. Not because of political stance, but because of performance and openness. The Qwen 2.5 series spans parameter counts from 0.5B to 72B, and developers can fine-tune and deploy it on their own hardware without paying OpenAI or Anthropic for API calls. Open source eliminates vendor lock-in and transcends national borders.
A noteworthy detail: according to a February report by MIT Technology Review, Chinese AI companies are differentiating their open-source strategies. DeepSeek pursues extreme cost-efficiency, Kimi focuses on long context and coding capability, while Qwen aims for full parameter coverage. This multi-pronged advance gives global developers increasingly rich options. Our open-source models are redefining the global AI supply chain through sheer capability.
But what does the endpoint of this supply chain look like?
On March 19, developer @fynnso discovered the model ID accounts/anysphere/models/kimi-k2p5-rl-0317-s515-fast in Cursor's code. Cursor co-founder Aman Sanger subsequently confirmed that Composer 2 was built on Kimi K2.5. According to Cursor VP Lee Robinson, "the base model only contributed about a quarter of the compute load, with the rest coming from our own training." But the base is the base: the product, valued at $29.3 billion, rests on a base model from Moonshot AI, a Chinese company backed by Alibaba and HongShan.

Set this dependency chain alongside the Pentagon's actions and an even more absurd reality emerges. On March 5, the Pentagon officially designated Anthropic a "supply chain risk." According to NPR, the reason is that Anthropic CEO Dario Amodei refused to compromise on two red lines: the use of AI for autonomous weapons and for mass surveillance of American citizens. Trump gave the military six months to remove Claude, even though Claude has already been deeply integrated into military and national-security platforms. Anthropic responded by filing suit against the Pentagon on March 9.
On one side, the U.S. government labels its own company as a "supply chain risk," while on the other, 80% of American startups are relying on Chinese models. The former is a political game, while the latter is a technological reality. There is no intersection between the two.
80% of American startups rely on Chinese models, while the Pentagon's risk label is attached to an American company. Regulations keep stacking up at the hardware layer, while dependence quietly grows at the software layer. On the far side of three years of chip walls is an emerging new reality: Chinese open-source AI is no longer a "follower" but the supply side of global AI infrastructure.
