From electricity to chips, how ordinary people can participate in the wealth opportunities of the AI era.

律动BlockBeats
Original Title: If you don't understand AI by the end of this, the next decade will confuse you
Original Author: Anish Moonka
Translated by: Peggy, BlockBeats

Editor's Note: When people talk about AI, the attention is often focused on the most obvious places: chatbots, AI assistants, and various new applications. However, behind these products, a deeper industrial restructuring is taking place. From electricity, chips, to data centers, and then to models and applications, AI is essentially a technology stack made up of multiple layers of infrastructure, and the flow of capital and profits is much more complex than it appears on the surface.

This article systematically outlines this value chain from the perspective of the "Five Layers of AI": why hundreds of billions of dollars are flowing into energy, chips, and cloud infrastructure; why model companies are burning massive amounts of cash while growing rapidly; and where the real value in this technological revolution may be concentrated first.

By comparing AI with historical cycles such as the electricity revolution and internet infrastructure construction, the author attempts to answer a key question: where exactly is capital flowing in this wave of technology that may reshape the global industrial structure.

Below is the original article:

Most people think of AI as just a chatbot.

I can understand that thinking. You open ChatGPT, ask it to help you modify an email, and it can complete the task instantly. It feels like magic. So you close the page, thinking you understand what AI is about. But that's like swiping a Visa card at a restaurant and thinking you understand how Visa makes money. You’re just using the product without seeing the underlying system.

For most of last year, I was trying to figure out where the real profits of AI are flowing. And one somewhat embarrassing fact is that it took me a long time to realize I had been looking at the wrong level. I had been fixated on ChatGPT, Claude, Gemini—things you can interact with directly.

Meanwhile, $700 billion was quietly flowing into a whole set of infrastructure that I could hardly name: chips I had never heard of, technical acronyms that sounded made up, cooling systems, power plants. In Texas, Iowa, and Hyderabad, a lot of concrete is being poured to build data centers.

About a year ago, hardly anyone I knew was talking about these things. But now, everyone has started to talk about them.

This article will be relatively long. If you don’t have time to read it all now, you can save it to read later.

I want to take you through the complete value chain of AI: starting with the power supplying data centers, all the way to applications on your phone.

I will explain it so that even if you've never read a public company's annual report in your life, you'll still understand. I will explain every term; I will back every judgment with real data; and where I'm still uncertain, I will say so, because there are indeed such places.

So let’s get started.

1. The Five-Layer Cake (Why No One Discusses the Bottom Four Layers)

AI is infrastructure. Just like the internet, just like electricity, it requires factories. — Jensen Huang

Most people's understanding of AI is this: a smart computer answering questions.

This is like saying the internet is "a place to watch videos." Technically correct, but completely misses the point.

At the World Economic Forum in January 2026, Jensen Huang described AI as a five-layer system:

1. Energy
2. Chips
3. Cloud
4. Models
5. Applications

He refers to this entire system as "the largest infrastructure construction in human history."

First, think about the word: infrastructure.

Highways. Power grids. Water supply systems. These things keep modern civilization running, but people usually only notice them when something goes wrong.

AI is becoming the same—it’s invisible, indispensable, and extremely costly to build. I refer to this entire structure as the AI Stack. It consists of five layers, each stacked on top of the other, with each layer supporting the one above it and capital flowing bidirectionally between these layers.

The simplest version I can give you is this:

Energy: you need power to run computers, and a lot of it.

Chips: you need specialized processors for computation. This is not the CPU in your laptop.

Cloud: you need enormous warehouse-type data centers filled with these chips, connected by extremely high-speed networks.

Models: you need the actual AI software, the "intelligent brain" that learns patterns from data.

Applications: you need products that people actually use, like ChatGPT, Google Search, or bank anti-fraud systems.

Any AI discussion that only focuses on the fifth layer (the application layer) overlooks a whopping 80% of reality. And if you are an investor, entrepreneur, or just someone who wants to understand the future direction of the world, the key point is that money will not be evenly distributed among these five layers. It will concentrate, compound, and flow toward a very few critical nodes.

And today, that funding is concentrating in places that most people aren't even aware of.

2. Tracking the Flow of Capital (The Answers Are Not Where You Think They Are)

People's attention tends to concentrate almost entirely on the application layer. ChatGPT, GitHub Copilot, Claude, Perplexity.

These are products you can use directly, so it's easy to feel that the story of AI is simply about these applications.

But most people overlook something. By 2026, the four largest hyperscale tech companies (Amazon, Microsoft, Google, Meta) are expected to reach a combined capital expenditure (CapEx) of $650 billion to $700 billion in a single year.

This is just one year, four companies combined.

This figure is roughly equivalent to the entire GDP of Switzerland. About 75%, or approximately $450 billion, will be directly invested in AI infrastructure.

Not in chatbots, not in applications. But in buildings, chips, fiber optics and networks, cooling systems—topics that hardly anyone talks about at cocktail parties. This precisely indicates where the money is.

Because think about it: before anyone could use ChatGPT, someone had to build a data center the size of a shopping mall, install tens of thousands of specialized processors, connect them with networking equipment worth more than most companies' market value, and supply enough power for a small city. And it has to run like this every single day.

This is the first to third layers: energy, chips, cloud infrastructure—these are the invisible layers where the real capital is being deployed.

Someone might ask, "What about OpenAI? Haven’t they made billions yet?"

That’s true.

By the end of 2025, OpenAI's annual recurring revenue (ARR) is expected to reach $20 billion. A year ago it was $6 billion, and the year before that it was only $2 billion.

Growing tenfold in two years is a feat few companies in human commercial history have achieved at this scale.

But the problem is, costs are equally staggering.

In 2025: OpenAI will burn approximately $9 billion in cash

In 2026: expected to burn $17 billion

Just the inference cost—the cost of the system actually running the model when you ask an AI a question:

In 2025: $8.4 billion

Expected in 2026: $14.1 billion

Under current forecasts, OpenAI may not achieve positive cash flow until 2029 or 2030.
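To make the scale concrete, here is the back-of-envelope arithmetic behind the figures above. This is a sketch: all inputs are the article's estimates, not audited financials.

```python
# Back-of-envelope arithmetic on the OpenAI figures quoted above.
# All inputs are the article's estimates, not audited financials.
arr = {2023: 2e9, 2024: 6e9, 2025: 20e9}        # annual recurring revenue, USD
cash_burn = {2025: 9e9, 2026: 17e9}             # net cash burned, USD
inference_cost = {2025: 8.4e9, 2026: 14.1e9}    # cost of serving queries, USD

# "Growing tenfold in two years" expressed as a compound annual growth rate:
revenue_cagr = (arr[2025] / arr[2023]) ** (1 / 2) - 1        # ~216% per year

# Costs are growing fast too, though slower than revenue:
inference_growth = inference_cost[2026] / inference_cost[2025] - 1   # ~68%
burn_growth = cash_burn[2026] / cash_burn[2025] - 1                  # ~89%

print(f"Revenue CAGR 2023-2025: {revenue_cagr:.0%}")
print(f"Inference cost growth:  {inference_growth:.0%}")
print(f"Cash burn growth:       {burn_growth:.0%}")
```

The point of the arithmetic: even with revenue compounding at over 200% a year, the burn keeps growing alongside it, which is why break-even sits years away.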

So the question arises: where is all this burned money going?

The answer is: flowing down through the AI technology stack.

Flowing to:

Microsoft Azure (OpenAI is contractually required to pay Microsoft 20% of its revenue through 2032)

Nvidia's GPUs

Engineering firms constructing data centers

And energy companies supplying power

If you look at this system long enough, you'll see a nearly circular structure:

Microsoft invests in OpenAI

OpenAI uses this money to purchase Azure cloud services

Azure uses the revenue to buy Nvidia chips

Nvidia announces record profits

Everyone applauds

Then, capital continues to flow downward.

There is a very important structural fact in the AI technology stack:

Most users are at the top layer (application layer)

Most profits are at the bottom layer (infrastructure layer)

And this misalignment between user position and profit position is the core of the entire AI investment logic.

This is the first law of the AI value chain: revenue flows upward, capital settles downward.

3. You've Actually Seen This Scene

All human problems are essentially engineering problems, and engineering problems can ultimately be solved. — Buckminster Fuller

If you want to truly understand what is happening with AI, you can look back at the history of the electricity revolution from 1880 to 1920.

In 1882, Thomas Edison built the first commercial power station on Pearl Street in Manhattan, New York. At the time, most people thought electricity was just a novelty, a "more advanced" way of lighting. After all, gas lamps worked just fine. Who really needed this?

But in just 40 years, electricity completely reshaped almost every industry: manufacturing, transportation, communications, healthcare, entertainment.

Those who truly won this revolution were not the ones who invented the light bulb but those who built the infrastructure: General Electric, Westinghouse Electric, power companies, copper mining enterprises, engineering construction firms.

AI is now repeating the same pattern, only the speed has been compressed into a few years instead of several decades.

Contrast two chains:

AI system: AI → Data Center → Chips → Raw Materials → Energy

Electricity system: Electricity → Factories → Machines → Raw Materials → Coal / Hydro

The two paths are almost identical. And the winners once again are not primarily in the application layer but in the infrastructure layer.

I refer to this phenomenon as Infrastructure Gravity. Whenever a new computing platform emerges, the first to create wealth are always the "shovel sellers."

Applications will follow, applications will get all the media attention. But infrastructure takes most of the profits.

For example, Nvidia's total revenue in fiscal year 2026 (which ends in January 2026) was $215.9 billion, up 65% year-over-year. The data center business alone generated $62.3 billion in the most recent quarter, up 75% year-over-year, and now accounts for 91% of Nvidia's total revenue.

In other words, a company generated $68 billion in revenue in a single quarter, with 90% coming from one business line.
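A quick cross-check shows the two numbers quoted above are consistent with each other (figures as given in the article):

```python
# Cross-checking the Nvidia quarter quoted above (article's figures).
quarter_total = 68e9      # total revenue in the quarter, USD
data_center = 62.3e9      # data center segment revenue, USD

dc_share = data_center / quarter_total
print(f"Data center share of quarterly revenue: {dc_share:.1%}")  # ~91.6%
```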

Look at chip manufacturing. TSMC holds about 70% of the global foundry market in 2025, with sales of $122.5 billion. Second-place Samsung Electronics holds only 7.2%. This level of dominance makes even Standard Oil at its peak look modest.

Infrastructure always wins first. The real question is just how long this window will last.

Ask anyone what the internet revolution was, and they'll say Google, Amazon, Facebook.

But if you ask where the early money was made, the answer is actually Cisco Systems, Corning, the companies that laid fiber-optic networks.

The same story; just in a different era.

4. The Part No One Wants to Hear

The stock market is a machine that transfers money from the impatient to the patient. — Charlie Munger

I have to confess something. When I first started paying attention to AI as an investor, I made the same mistake as most people: I looked at the application layer. I saw ChatGPT's growth. I noticed Anthropic raising billions in funding. So I thought AI companies would win, and I invested in AI companies.

Later, three things changed my perspective, and they happened in sequence.

The First Thing: The Hottest Companies Are Burning Cash

I discovered that almost all "AI companies" are burning cash like crazy. OpenAI, Anthropic, Mistral AI, xAI. They are all spending money much faster than they are making it. The reason is not that their business models are poor but that the cost of computing power is structural.

Every time you ask AI a question, the system must perform real computations. Computation requires GPUs, and GPUs need power. And the more powerful the model, the higher the demand for computation, so the operating costs will only increase.

In other words: the companies people assume are AI's winners are actually the ones spending the most money.

The Second Thing: The Most Profitable Are at the Bottom

I noticed that infrastructure companies are printing money. Nvidia's gross margin is close to 75%, and TSMC is raising prices while expanding production because demand far exceeds supply.

These companies do not have a "when will we turn a profit" problem. Their problem is that they simply cannot build fast enough. Those are two completely different kinds of problems.

The Third Thing: Don't Think Like a "Consumer" (Also the Most Uncomfortable One)

I realized I had been thinking about AI like a consumer.

Consumers see applications. Engineers see the tech stack. Once you see the whole tech stack, you can no longer ignore it.

Every AI release becomes a capital expenditure (CapEx) announcement. Every model upgrade becomes a new chip order. Every new feature becomes a new data center lease.

The entire industry starts to resemble concentric circles: the closer to the center, the more profits are concentrated.

Maybe you are: a software engineer focused on AI models, a retail investor who bought Nvidia at $300, or someone observing this revolution from afar in India (or perhaps you are all three—that’s the most interesting position).

No matter where you are, the principle is the same. Consumers see products, investors see supply chains. And the best investors see the supply chain that has already formed before product releases.

5. Investor Map: Layer-by-Layer Breakdown of the AI Technology Stack

The article is already lengthy, so I will speed up.

Below is the structure, major players, and potential opportunities for each layer of the AI Stack.

Layer 1: Energy

AI data centers are extremely power-hungry. A single large model training session may consume the electricity of a small town for a year. By 2026, global AI data centers are expected to consume about 90 terawatt hours of power annually. This is about ten times more than in 2022.
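The implied growth rate behind those power numbers can be worked out directly. A sketch using the article's estimates (about 90 TWh in 2026, roughly ten times the 2022 level):

```python
# Implied annual growth of AI data center power draw (article's estimates).
twh_2026 = 90.0               # projected consumption in 2026, TWh/year
twh_2022 = twh_2026 / 10      # "about ten times more than in 2022" -> ~9 TWh

years = 2026 - 2022
power_cagr = (twh_2026 / twh_2022) ** (1 / years) - 1
print(f"Implied growth: {power_cagr:.0%} per year")   # ~78% per year
```

Demand compounding at roughly 78% a year is the reason grid connections, not chips, are increasingly the binding constraint.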

This brings about a very straightforward investment logic: whoever can provide stable power to data centers will benefit. This includes nuclear power companies, gas companies, renewable energy companies, and grid companies, particularly energy companies near data center clusters.

Jensen Huang noted in October 2025 that building power generation on-site may be faster than connecting data centers to the grid. In fact, many tech companies are already building generation facilities directly next to their data centers, bypassing the grid entirely.

This shocked me. These tech companies are becoming their own power companies.

Beneficiaries include utility companies, independent power producers, and power equipment manufacturers (transformers, switchgear, etc.). In Asia, for example, in India, as hyperscaler data centers expand, power equipment and transmission companies will also benefit.

Layer 2: Chips

This is the layer the public is most familiar with because of Nvidia. But in reality, this layer is much more complex than one company.

The chip layer can be further divided into several sub-layers:

Design Companies

Nvidia (GPU), AMD, Broadcom, Qualcomm

And an increasing number of cloud providers developing their own chips: Google TPU, Amazon Trainium, Microsoft Maia

Manufacturing Companies

Almost monopolized by TSMC, with approximately 70% market share; the second place, Samsung Electronics, holds only 7.2%. Intel is trying to rebuild its foundry business, but that will take years.

Equipment Companies

The machines used to manufacture chips come from ASML (the only company producing EUV lithography machines) as well as Applied Materials, Lam Research, and Tokyo Electron.

Memory Companies

AI models require a large amount of high-bandwidth memory (HBM). Main players: SK Hynix, Samsung, Micron Technology.

Packaging Technology

Advanced packaging technologies (such as TSMC's CoWoS) have become new bottlenecks.

The most shocking aspect of this layer is actually its concentration:

Nvidia: about 92% market share of AI GPUs

TSMC: virtually manufacturing all AI chips

ASML: the only supplier of EUV equipment

One company designs. One company manufactures. One company produces the manufacturing machines. This concentration presents both investment opportunities and geopolitical risks.
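One conventional way to quantify the concentration described above is the Herfindahl-Hirschman Index (HHI), the sum of squared market shares. The HHI framing is my addition; the shares are the article's figures.

```python
# Herfindahl-Hirschman Index (HHI) on a 0-1 scale: sum of squared shares.
# Values above ~0.25 are conventionally treated as "highly concentrated".
# Shares are the article's figures; smaller players are omitted, so these
# numbers are lower bounds on the true HHI.
def hhi(shares):
    return sum(s ** 2 for s in shares)

foundry_hhi = hhi([0.70, 0.072])   # TSMC, Samsung Electronics
ai_gpu_hhi = hhi([0.92])           # Nvidia alone

print(f"Foundry HHI (lower bound): {foundry_hhi:.2f}")   # ~0.50
print(f"AI GPU HHI (lower bound):  {ai_gpu_hhi:.2f}")    # ~0.85
```

Both values sit far above the usual 0.25 threshold, which is exactly the dual nature the text describes: pricing power for investors, single points of failure for everyone else.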

Layer 3: Cloud and Data Centers

This is where the chips really operate.

Massive warehouse-like facilities:

Thousands of servers

High-speed network connections

Liquid cooling systems (which have gone from optional to standard)

The market is dominated by three major cloud providers:

Amazon Web Services (31%)

Microsoft Azure (24%)

Google Cloud (11%)

Oracle is also rapidly expanding, planning $50 billion in capital expenditure by 2026. But this layer goes beyond just hyperscalers.

For example:

Foxconn assembles 40% of AI servers

Arista Networks provides networking equipment

Credo Technology (stock price up 117% in 2025)

Vertiv provides liquid cooling

Data center real estate companies:

Equinix

Digital Realty

Even concrete suppliers are in the mix; there is a complete supply chain at every layer.

According to Bank of America estimates, hyperscalers will allocate 90% of their operating cash flow to capital expenditures by 2026. This ratio was 65% in 2025.

Morgan Stanley predicts these companies will issue over $400 billion in debt this year to build data centers. In 2025, this number was $165 billion.

When I first read this number, I paused. $400 billion in debt in one year, just to build more warehouses filled with computers.

Layer 4: Models

This layer is the "brain layer," responsible for training and building true AI models.

Main players include:

OpenAI (GPT series, over $20 billion in annual revenue)

Anthropic (Claude, reportedly around $19 billion in annual revenue by early 2026)

Google DeepMind (Gemini)

Meta AI (Llama, open-source model)

Mistral AI

xAI (developing Grok)

This layer fascinates me because it is both the most sought-after and the least profitable.

For example:

OpenAI's revenue growth rate is unprecedented, yet it is still expected to burn $17 billion in cash in 2026.

Anthropic is also growing at a rapid pace but is highly reliant on funding—$5 billion in a round of financing at the beginning of 2026, with a valuation of about $170 billion.

The problem is that there is a structural contradiction in the business model of this layer. Models are getting stronger and require more computing power, while the growth rate of computing power costs often outpaces revenue growth.

This is somewhat akin to running a restaurant where every new dish requires more expensive ingredients, but customers wish for prices to stay the same.

The result is that profit margins are continuously squeezed.

When will this change? I’m not sure; perhaps not in the short term.

For investors, this layer is high risk, high reward. The problem is that most companies are still privately held.

Therefore, the investment exposure in the public market primarily comes from two types of channels:

Cloud Computing Companies

For example, Microsoft holds a significant stake in OpenAI and provides computing power through Microsoft Azure.

Chip Companies

Because model companies consume enormous amounts of the chip makers' hardware during training.

Layer 5: Applications

This is the layer you see every day. For example, ChatGPT, Google Search powered by Gemini, Microsoft Copilot features in Office, banking AI fraud detection systems, Netflix recommendation algorithms, AI image enhancement in phones.

The application layer is the widest and most crowded layer. Thousands of startups and large enterprises are competing here. In the long term, it could become the layer with the largest market: some forecasts suggest that by the early 2030s, the application-layer market could exceed $2 trillion.

Yet, at the current stage, this layer also has the thinnest profits and the most uncertain competition.

In this layer, true differentiation comes from data. Companies with unique proprietary data will establish lasting advantages.

For example:

Salesforce—enterprise CRM data

Bloomberg—financial market data

Epic Systems—medical record data

Companies that hold this data moat can perform deep fine-tuning on AI models, which general chatbots cannot achieve.

For investors, the application layer may ultimately provide the greatest profit potential, but it will also destroy the most capital.

Most AI startups will fail, and only a few survivors will generate exponential compounding growth.

The most likely investment logic over the next 3 to 5 years is: bet on infrastructure now, and on applications later. The smartest money is already positioned this way.

The companies that will truly win in Layer 5 will likely be those that possess data that others cannot access.

Interestingly, many of these companies do not even refer to themselves as AI companies.

6. AI Risks: "Isn't This Just a Bubble?"

The biggest enemy of investors is likely themselves. — Benjamin Graham

Let's confront that most common question directly. "What about the internet bubble? Isn’t this the same thing? Huge infrastructure investments, no profits, everyone immersed in hype."

This is a great question and worth a serious response.

The key difference is that during the internet bubble era, when companies were building infrastructure, demand had not yet truly materialized. At that time, companies were frantically laying fiber-optic networks and building server rooms, but real internet users were still using dial-up.

The result was that the infrastructure was built, but the demand did not really appear for another 5 to 7 years. During that time, many companies went bankrupt.

But by 2026, the demand for AI already exists. Nvidia’s chips are in short supply, TSMC's advanced packaging capacity is sold out, and cloud computing rental prices are rising rather than falling. Meanwhile, OpenAI gained 400 million weekly active users between March and October 2025. Models are being used.

Computing power is being consumed. Customers are paying. This does not mean there are no risks. In fact, the risks are very significant, and I probably think about this issue more often than I care to admit.

Three points are particularly worth noting.

Capital Mismatch Risk

In 2026, tech companies will spend over $650 billion on data centers.

If the growth rate of AI service revenue is not sufficient to support these investments, many companies will face severe profit margin compression. Even Amazon’s free cash flow could turn negative this year.

That’s Amazon, the company that virtually invented the cloud computing business model.

Supply Chain Concentration Risk

The AI supply chain is highly concentrated.

TSMC produces about 70% of the world's chips

ASML is the only supplier of EUV lithography machines

Nvidia designs 92% of AI data center GPUs

Any major shock—geopolitical events, natural disasters, changes in competitive landscape—could impact the entire AI industry chain.

For example, a major earthquake in Hsinchu, Taiwan, could set global AI development back several years. This idea should be unsettling.

DeepSeek Variable

In January 2025, China's AI lab DeepSeek released a model whose performance approaches frontier models, at a training cost that is only a small fraction of what frontier labs spend.

This challenges a core assumption: that more investment in computing power always leads to better AI.

If future open-source models and high-efficiency models continue to narrow the gap, then the investment logic in infrastructure will be weakened.

I don’t think DeepSeek has overturned the entire AI investment logic. But it has introduced a previously nonexistent variable. And once this variable appears, it won’t disappear.

But I will always return to a larger framework.

Consulting firms provide long-term forecasts: McKinsey & Company predicts global data center investments will reach $6.7 trillion by 2030; PwC expects AI to contribute $15.7 trillion to global GDP by 2030; and the International Data Corporation (IDC) predicts the cumulative economic impact of AI-related solutions will reach $22.3 trillion.

Even if these numbers are overestimated by 50%, we still face the largest technological-driven economic transformation since the internet. The question is not the direction but the scale.
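The "overestimated by 50%" scenario above, worked through on the article's three forecast figures:

```python
# The article's 2030 forecasts, haircut by 50% as a sensitivity check.
forecasts_usd = {
    "McKinsey: data center investment":    6.7e12,
    "PwC: AI contribution to global GDP": 15.7e12,
    "IDC: cumulative economic impact":    22.3e12,
}
HAIRCUT = 0.50   # the article's "even if overestimated by 50%" scenario

for name, value in forecasts_usd.items():
    discounted = value * (1 - HAIRCUT)
    print(f"{name}: ${value / 1e12:.1f}T -> ${discounted / 1e12:.1f}T")
```

Even after the haircut, every figure remains in the multi-trillion-dollar range, which is the author's point: the debate is about scale, not direction.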

I often hear people say, "I am skeptical about AI."

That’s perfectly fine.

You can doubt model capabilities, doubt the timeline of development, but don’t ignore the structure of the supply chain.

These are two completely different things. One is healthy, rational skepticism; the other will cause you to miss opportunities.

Five years from now, the winners of this cycle will undoubtedly be very clear.

History is always like this. And the key to this game now is: understand the structure before others do.

7. Participating in This Game at the Right Level

Imagine AI as a five-layer video game. Each layer is a different level.

Level 1: Energy

This is the beginner tutorial level. Important, straightforward, and nearly guaranteed to pay off as long as you execute competently. Low risk, stable returns.

Just like the quest NPC in a game: won’t die, but keeps giving rewards.

Level 2: Chips

This is the Boss battle. Power is most concentrated, profits are highest. But at the same time, both technical and geopolitical risks are greatest.

Huge rewards, but on Hard mode.

Level 3: Cloud Computing

This is a multiplayer server where all players are active. Hyperscalers act like server administrators, taking a cut from every transaction.

Level 4: Models

This is the PVP arena. Competition is extremely fierce, and the pace of innovation is rapid.

Most players will be eliminated; only the best equipped will survive.

Level 5: Applications

This is the open-world map. Endless possibilities but no fixed rewards. You must seek out tasks yourself.

The real Meta Strategy is simple. You don’t need to play through every level.

Most people will head to Level 5 because it is the most visible. But right now, the smartest money is gaining experience in Level 2 and Level 3 because that’s where the highest returns are at this stage.

Your position in the tech stack determines what you should focus on.

For Non-Technical People

You don’t need to understand how GPUs work. You only need to know that someone has to manufacture GPUs, someone has to build data centers for them, and someone has to supply them with power. And all these companies are public; you can read their financial reports.

For Technical Personnel

You already know the models are getting stronger. But you might be underestimating one thing: the real bottleneck is shifting to the physical world of power, cooling, and chip packaging. The competition in AI over the next decade may be more about engineering problems than about the model architectures found in papers.

For Investors

The AI value chain is actually five distinctly different transactions. Different risks, different time cycles, different winners. Treating AI as a single industry is like treating "tech" as a single industry in 1998. The internal differences are enormous.

This situation will not last forever. One day, infrastructure building will mature, the application layer will consolidate, and value will shift back upward.

The internet era was like this too. Ultimately, the ones who made the most money were Amazon, Google, Facebook, not the fiber-optic and server manufacturers.

But AI has not reached that stage yet. We are still in the infrastructure phase, the phase of selling shovels.

And right now, shovels are making crazy profits. Those who understand the complete tech stack will see the signals before the turning point occurs.

Others will repeatedly be surprised at where the money is actually flowing.

In ten years, understanding the AI technology stack will be as foundational as understanding a balance sheet.

Remember three things: understand the tech stack. Draw out the hierarchical structure. Track capital flow.

This is the game.

[Original Link]

Disclaimer: This article represents the personal views of the author only and does not represent the position or views of this platform. It is shared for informational purposes only and does not constitute investment advice to anyone. Any dispute between users and the author is unrelated to this platform. If any article or image on this page infringes your rights, please send the relevant proof of rights and identity to support@aicoin.com, and the platform's staff will investigate.
