Author: imToken
As 2025 drew to a close, the Ethereum community greeted the completion of the Fusaka upgrade with relatively little fanfare.
Looking back over the past year, although discussions about underlying technology upgrades have gradually faded from the market spotlight, many on-chain users have likely felt a significant change: Ethereum L2s keep getting cheaper.
Currently, on-chain interactions, whether for transfers or complex DeFi operations, often incur Gas fees of just a few cents or even negligible amounts. Behind this, the Dencun upgrade and the Blob mechanism have played a crucial role. At the same time, with the official activation of the core feature PeerDAS (Peer Data Availability Sampling), Ethereum is also bidding farewell to the era of "full data download" for data verification.
It can be said that what truly determines whether Ethereum can sustainably support large-scale applications over the long term is not just Blobs themselves but, more importantly, the next step they lead to: PeerDAS.
1. What is PeerDAS?
To understand the revolutionary significance of PeerDAS, we cannot just talk about concepts; we must first revisit a key milestone on Ethereum's scalability journey, namely the Dencun upgrade in March 2024.
At that time, EIP-4844 introduced blob-carrying transactions (embedding large amounts of rollup data into blobs), allowing L2s to stop relying on the expensive calldata storage mechanism and instead use temporary Blob storage.
This change cut Rollup costs to roughly one-tenth of their previous level, letting L2s offer cheaper, faster transactions without compromising Ethereum's security and decentralization, and ushering users into the "low Gas fee era."
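To get a feel for the cost gap between calldata and blobs, here is a back-of-the-envelope sketch in Python. The gas constants (16 gas per non-zero calldata byte under the classic pricing, and 131,072-byte blobs billed as 131,072 units of blob gas) match the published specs, but the two fee levels are purely illustrative assumptions, since both fee markets float with demand:

```python
# Rough comparison: posting 128 KB of rollup data as calldata vs. as one
# EIP-4844 blob. Gas constants follow the specs; fee levels are assumptions.

CALLDATA_GAS_PER_BYTE = 16       # classic price of a non-zero calldata byte
BLOB_SIZE_BYTES = 131_072        # 4096 field elements * 32 bytes
GAS_PER_BLOB = 131_072           # blob gas billed per blob

EXECUTION_BASE_FEE_GWEI = 10     # assumed regular base fee (illustrative)
BLOB_BASE_FEE_GWEI = 0.01        # assumed blob base fee (separate market)

calldata_cost_eth = BLOB_SIZE_BYTES * CALLDATA_GAS_PER_BYTE * EXECUTION_BASE_FEE_GWEI / 1e9
blob_cost_eth = GAS_PER_BLOB * BLOB_BASE_FEE_GWEI / 1e9

print(f"as calldata: ~{calldata_cost_eth:.4f} ETH")
print(f"as one blob: ~{blob_cost_eth:.6f} ETH")
```

With these illustrative numbers the blob route is dramatically cheaper; in practice the exact ratio moves with demand in both fee markets.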
However, while Blobs are very useful, the number of Blobs each block on the Ethereum mainnet can carry has a hard cap in the single digits (initially a target of 3 and a maximum of 6 per block). The reason is very practical: physical bandwidth and disk space are limited.
In the traditional verification model, every validator in the network, whether operated by professional institutions or ordinary computers at home, must still download and propagate the complete Blob data to confirm its validity.
This creates a dilemma:
- If the number of Blobs is increased (to scale up): the data volume surges, home nodes' bandwidth gets saturated and their disks fill up, forcing them offline; the network centralizes rapidly and ends up as a giant chain that only large data centers can run;
- If the number of Blobs is kept limited (to preserve decentralization): L2 throughput stays capped and cannot meet the explosive growth in demand to come.
In simple terms, Blobs have only taken the first step, solving the problem of "where to store data." When the data is small, everything is fine, but if the number of Rollups continues to increase in the future, with each Rollup frequently submitting data and Blob capacity constantly expanding, then the bandwidth and storage pressure on nodes will become a new centralization risk.
If the traditional full-download model is kept, the bandwidth pressure does not go away, and Ethereum's scaling path will hit a wall imposed by physical bandwidth limits. PeerDAS is the key to breaking this deadlock.
In summary, PeerDAS is essentially a brand new data verification architecture that breaks the iron rule that verification must involve full data downloads, allowing the expansion of Blobs to exceed current physical throughput levels (for example, jumping from 6 Blobs/block to 48 or even more).
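Some rough arithmetic shows why sampling changes the picture. The 128 KB blob size and 12-second slot time are protocol facts and the 6 and 48 blob counts are the figures quoted above, while the 2x erasure-coding extension and the 1/8 per-node sampling fraction are simplified assumptions for illustration rather than the exact PeerDAS custody parameters:

```python
# Per-node bandwidth under "everyone downloads everything" vs. sampling.

BLOB_SIZE_KB = 128
SLOT_SECONDS = 12

def full_download_kbps(blobs_per_block: int) -> float:
    """Every node downloads every blob in full."""
    return blobs_per_block * BLOB_SIZE_KB / SLOT_SECONDS

def sampled_kbps(blobs_per_block: int, coding_overhead: float = 2.0,
                 sample_fraction: float = 1 / 8) -> float:
    """Each node downloads only a random slice of the erasure-coded data."""
    return blobs_per_block * BLOB_SIZE_KB * coding_overhead * sample_fraction / SLOT_SECONDS

for blobs in (6, 48):
    print(f"{blobs:>2} blobs/block: full ~{full_download_kbps(blobs):5.0f} KB/s, "
          f"sampled ~{sampled_kbps(blobs):5.0f} KB/s per node")
```

Under these assumptions, a node sampling 48 blobs' worth of data uses about the same bandwidth it would need to fully download a dozen blobs, which is the whole point: total throughput grows much faster than any individual node's burden.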
2. Blob solves "where to store," PeerDAS solves "how to store efficiently"
As mentioned above, Blobs have taken the first step in scaling, solving the problem of "where to store data" (moving from expensive calldata to temporary Blob space), while PeerDAS aims to solve the question of "how to store more efficiently."
The core issue it addresses is how to handle exponential data growth without overwhelming the physical bandwidth of nodes. The approach is straightforward: based on probability and distributed collaboration, "not everyone needs to store the full data to confirm that the data truly exists with high probability."
This can be inferred from the full name of PeerDAS, "Peer Data Availability Sampling."
This concept may sound obscure, but we can use a simple analogy to understand this paradigm shift. For instance, in the past, full verification was like a library acquiring a multi-thousand-page "Encyclopedia Britannica" (Blob data). To prevent loss, every administrator (node) was required to make a complete copy of the book as a backup.
This meant that only those with money and time (large bandwidth/storage) could be administrators. Especially as the "Encyclopedia Britannica" (Blob data) continued to expand, with more and more content, ordinary people would eventually be eliminated, and decentralization would vanish.
Now, PeerDAS pairs sampling with techniques like erasure coding: the book is torn into countless fragments and mathematically encoded, and each administrator no longer needs to hold the entire book, only a few randomly chosen pages.
Even during verification, no one needs to present the entire book. Theoretically, as long as the network gathers any 50% of the fragments (regardless of whether they hold page 10 or page 100), we can use mathematical algorithms to instantly reconstruct the entire book with 100% certainty.
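The "any half of the fragments is enough" property is exactly what Reed-Solomon-style erasure coding provides. The toy sketch below encodes four data chunks as the coefficients of a degree-3 polynomial over a small prime field and rebuilds them from any four of eight published fragments; the real protocol uses KZG commitments over a much larger field and different parameters, so this is only meant to show that the math works:

```python
# Toy erasure coding: k data chunks become the coefficients of a
# degree-(k-1) polynomial; publish 2k evaluations; ANY k of them are enough
# to reconstruct the original chunks via Lagrange interpolation.

import random

P = 257  # small prime field, for illustration only

def encode(chunks, n):
    """Evaluate the polynomial defined by `chunks` at points 1..n."""
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(chunks)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares, k):
    """Recover the k coefficients from any k (x, y) shares."""
    xs, ys = zip(*shares[:k])
    coeffs = [0] * k
    for j in range(k):
        # Build the Lagrange basis polynomial for x_j, one factor at a time.
        basis, denom = [1], 1
        for m in range(k):
            if m == j:
                continue
            # Multiply the running basis polynomial by (x - xs[m]).
            basis = [((basis[i - 1] if i else 0)
                      - xs[m] * (basis[i] if i < len(basis) else 0)) % P
                     for i in range(len(basis) + 1)]
            denom = denom * (xs[j] - xs[m]) % P
        scale = ys[j] * pow(denom, -1, P) % P
        for i, b in enumerate(basis):
            coeffs[i] = (coeffs[i] + scale * b) % P
    return coeffs

data = [42, 7, 99, 250]                  # k = 4 original chunks (values < P)
shares = encode(data, n=2 * len(data))   # publish 2k = 8 coded fragments
random.shuffle(shares)                   # it does not matter which 4 survive
recovered = reconstruct(shares, k=len(data))
assert recovered == data
print("reconstructed:", recovered)
```

In the real design, each fragment also comes with a KZG proof, so a node can check an individual piece against the blob commitment without trusting whoever served it.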
This is the magic of PeerDAS—it offloads the burden of downloading data from individual nodes and distributes it across a collaborative network of thousands of nodes.
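Concretely, the confidence comes from a short probability argument. Assuming the data is erasure coded so that an attacker has to withhold at least half of the coded fragments to make the original unrecoverable, every uniformly random query lands on a missing fragment with probability at least 1/2, so the chance that k independent queries all get answered shrinks like (1/2)^k. A minimal sketch:

```python
# Detection guarantee of random sampling: an attacker who withholds at least
# half of the erasure-coded fragments can satisfy every one of k random
# queries only with probability (1/2)^k.

def max_fooling_probability(num_samples: int, withheld_fraction: float = 0.5) -> float:
    """Chance that all `num_samples` random queries hit published fragments."""
    return (1 - withheld_fraction) ** num_samples

for k in (1, 4, 8, 16):
    print(f"{k:>2} samples -> fooled with probability <= {max_fooling_probability(k):.6f}")
```

And since thousands of peers sample independently, the network as a whole notices withheld data with near certainty even though each node checks only a handful of fragments.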
Image source: @Maaztwts
From a purely data perspective, before the Fusaka upgrade the number of Blobs was firmly capped in the single digits. PeerDAS tears that limit open, opening a path for the Blob target to climb from 6 per block toward 48 or even more.
When a user transacts on Arbitrum or Optimism and the batched data is posted back to the mainnet, no single node has to download the complete data package anymore, letting Ethereum scale without a linear increase in node costs.
Objectively speaking, Blob + PeerDAS is the complete DA (Data Availability) solution. From a roadmap perspective, this is also a key transition for Ethereum from Proto-Danksharding to complete Danksharding.
3. The new normal on-chain in the post-Fusaka era
As we all know, in the past two years, third-party modular DA layers like Celestia gained significant market space due to the high costs of the Ethereum mainnet. Their narrative logic was built on the premise that Ethereum's native data storage is expensive.
Now, with Blobs and the newly activated PeerDAS, Ethereum has become both cheap and extremely secure: the cost for L2s to publish data to L1 has been cut by more than half, and Ethereum commands the industry's largest validator set, with security far exceeding that of third-party chains.
Objectively, this is an overwhelming blow to third-party DA solutions like Celestia: Ethereum is reclaiming sovereignty over data availability and sharply squeezing their room to survive.
You might ask, how does all this relate to me using wallets, making transfers, and DeFi?
The relationship is very direct. If PeerDAS delivers as designed, L2 data costs can stay low for the long term, Rollups will not be forced to raise fees by a rebound in DA costs, on-chain applications can confidently design around high-frequency interactions, and wallets and DApps will no longer have to keep compromising between functionality and cost.
In other words, the affordable L2 we can use today is thanks to Blobs, and if we can continue to afford it in the future, it will rely on the silent contributions of PeerDAS.
This is why, in Ethereum's scalability roadmap, PeerDAS, though low-key, has always been regarded as an indispensable stop. In my view, this is technology at its best: you benefit from it without noticing it, yet everything would be hard to sustain without it.
Ultimately, PeerDAS proves that blockchain can carry Web2-level massive data through ingenious mathematical designs (such as data sampling) without excessively sacrificing the vision of decentralization.
Thus, the data highway of Ethereum has been fully paved, and what vehicles will run on this road next is a question for the application layer to answer.
Let us wait and see.