
Nvidia's inference market share has plummeted.

BTCdayu · 5 hours ago

Nvidia's share of the inference market has plummeted. Where are the opportunities in the new phase of the AI revolution? This is the ninth article in the AI Investment Research 100 series.

In the previous articles, we looked at Intel, AMD, and ARM. All three stocks have risen sharply over the past year: AMD has doubled, Intel has tripled, and ARM has also reached all-time highs. After such a run-up, a simple question arises:

Can we still hold the stocks that have already risen? Are there still opportunities among those that haven't?

To answer this question, we must work through one core term: inference. In the analyses of the companies above, this word appears again and again.

So, how big is the inference market? What stage are we currently in? Which companies will benefit, which have already been priced in by the market, and which have not?

This ninth installment of the AI Investment Research 100 series runs about 15,000 words: dense in content but easy to read, and worth saving for later.

1. How big is the market?

Model training is "writing the program," while inference is "that program being called every day." After GPT was trained, hundreds of millions of people query it daily, and every Q&A consumes inference compute. When Claude Code runs a task, the agent loops a hundred rounds on its own, and each round is inference.

Multiple industry studies and media references point in the same direction: once models enter production, inference becomes the dominant component of lifecycle compute cost, commonly estimated at 80-90%. In other words, in the AI era ahead, 80 of every 100 dollars spent on compute will be burned by inference.
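The "program written once, called every day" framing can be put into a toy cost model. All numbers below are illustrative assumptions, not figures from the article; the point is only that a recurring inference bill eventually dwarfs a one-time training bill.

```python
# Toy cost model: one-time training spend vs. recurring daily inference spend.
# Both dollar figures are hypothetical, chosen purely for illustration.
TRAINING_COST = 100e6          # assumed one-time training spend, USD
DAILY_INFERENCE_COST = 0.5e6   # assumed daily inference spend, USD

def inference_share(days: int) -> float:
    """Fraction of lifetime compute spend consumed by inference after `days`."""
    inference_total = DAILY_INFERENCE_COST * days
    return inference_total / (TRAINING_COST + inference_total)

# Under these assumptions, after ~3 years in production inference
# accounts for roughly 85% of lifetime spend, in line with the
# commonly cited 80-90% range.
print(round(inference_share(3 * 365), 2))  # → 0.85
```

The crossover point shifts with the assumed numbers, but the shape of the curve does not: training is a fixed cost, inference grows without bound as long as the model stays in service.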

However, over the past three years the market has mostly discussed training, because training is the more "attractive" story: who has more H100s, whose parameter counts are larger, who trains the next-generation model first. Inference has been treated as an afterthought, something that simply happens once training is complete.

This cognitive bias is being corrected, and that is the fundamental reason a group of semiconductor companies has been revalued over the past year.

So the inference market is large, but just how large? We can quantify it from five angles.

First, the number of users. ChatGPT has roughly 900 million weekly active users, about 50 million of them paid subscribers. The comparison on the Chinese side is even more direct: daily token usage has grown from 100 billion in early 2024 to 140 trillion by 2026, a 1,400-fold increase. This dimension is still far from saturation.

Second, usage intensity. OpenAI processed 6 billion tokens per minute in October 2025; by April 2026 that had risen to 15 billion, a 2.5x increase in six months. Enterprise revenue now exceeds 40% of the total, and enterprise users consume tokens at dozens of times the rate of consumers.

Third, conversation length. Context windows have grown from a few hundred tokens in the early days to, per DeepSeek's current API documentation, 1M tokens for V4 Pro / Flash with a maximum output of 384K. The longer the input, the more memory and compute each inference pass consumes.

Fourth, the models themselves are becoming more compute-intensive. Reasoning models such as OpenAI o1, DeepSeek R1, and Claude's extended thinking will "think" internally for thousands or even tens of thousands of tokens before answering a question. Jensen Huang cited DeepSeek R1 as an example, noting that reasoning models may require far more compute, potentially an order of magnitude more.

In the past, you asked an AI a question and it answered immediately; now, for a hard question, it thinks for half a minute before answering. That half minute of "thinking" is all new compute consumption.
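The "thinking for half a minute" effect can be put into rough numbers. Both token counts below are assumptions chosen for illustration, not measurements of any particular model:

```python
# Illustrative per-query token consumption: a direct answer vs. a
# "thinking" model that emits hidden reasoning tokens before the
# visible answer. All counts are assumptions, not measurements.
DIRECT_ANSWER_TOKENS = 500   # assumed typical direct response
THINKING_TOKENS = 8_000      # assumed hidden reasoning tokens
FINAL_ANSWER_TOKENS = 500    # the visible answer itself

reasoning_total = THINKING_TOKENS + FINAL_ANSWER_TOKENS
multiplier = reasoning_total / DIRECT_ANSWER_TOKENS
print(multiplier)  # → 17.0, roughly an order of magnitude more compute
```

The exact ratio varies by model and question, but under any plausible assumption the hidden reasoning tokens, not the visible answer, dominate the cost of the query.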

Fifth, agents. A typical agent task may call the model 10-100 times. OpenAI Codex has already surpassed 4 million weekly active users (as of April 22, 2026), and that is just one product from one company. One industry estimate puts the overall compute consumption of AI agents at more than ten times that of a comparably sized chat workload.

Multiplying these five factors together suggests total inference demand will expand dramatically in scale within three to five years. That is not hyperbole but a judgment increasingly close to mainstream consensus.
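As a rough illustration of how the five factors compound, the sketch below multiplies them together. Every factor value is an assumption chosen for illustration, not a forecast from the article; the takeaway is the multiplicative structure, not the specific product.

```python
# Hedged back-of-envelope: compounding the five inference growth
# factors discussed above. Each value is an illustrative assumption.
factors = {
    "more users": 2.0,              # user base still far from saturation
    "higher usage intensity": 2.5,  # cf. OpenAI's 6B -> 15B tokens/min
    "longer contexts": 3.0,         # longer inputs cost more per query
    "thinking models": 5.0,         # hidden reasoning tokens per answer
    "agents": 10.0,                 # 10-100 model calls per agent task
}

total = 1.0
for name, factor in factors.items():
    total *= factor

print(total)  # → 750.0 under these assumptions
```

Even if each individual factor is estimated conservatively, their product grows quickly; that multiplicative compounding is why "orders of magnitude in three to five years" is a structural claim rather than a single bold forecast.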

Nvidia's inference share has plummeted; where are the opportunities as the AI revolution enters its second phase?

https://mp.weixin.qq.com/s/YiMfIcf9L2RepjuG-b2B8A


Disclaimer: This article represents the author's personal views only and does not reflect the position of this platform. It is shared for informational purposes only and does not constitute investment advice to anyone. Any dispute between users and the author is unrelated to this platform. If any article or image on this page infringes your rights, please send proof of rights and identity to support@aicoin.com, and platform staff will review it.
