Parallel EVM Envisions a Crypto World: Reshaping the Landscape of dApps and User Experience

Original Author: Reforge Research

Original Translator: Deep Tide TechFlow

Benjamin Franklin once said, "In this world, nothing can be said to be certain, except death and taxes."

The original title of this article is Death, Taxes, and Parallel EVM.

As parallel EVM becomes an inevitable trend in the crypto world, what will a crypto world using parallel EVM look like?

Reforge Research has explored this idea from a technical and application perspective, and the following is the full translation.

Introduction

In today's computer systems, making things faster and more efficient often means parallel processing tasks rather than sequential. This phenomenon, known as parallelization, is catalyzed by the emergence of modern computer multi-core processor architectures. Tasks traditionally executed step by step are now handled from a perspective of simultaneity, maximizing the capabilities of processors. Similarly, in blockchain networks, this principle of executing multiple operations at once is applied at the transaction level, not by using multiple processors, but by leveraging the collective validation capabilities of numerous validators in the network. Some early implementation examples include:
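As a toy illustration of the principle (not any blockchain client's actual code), transactions whose effects do not overlap can have their results computed concurrently and then applied together:

```python
from concurrent.futures import ThreadPoolExecutor

def apply_tx(snapshot, tx):
    """Compute the effect of one transfer against a read-only snapshot."""
    account, delta = tx
    return account, snapshot.get(account, 0) + delta

state = {"alice": 10, "bob": 5, "carol": 0}
# These transfers touch disjoint accounts, so they are independent
# and their effects can be computed at the same time.
txs = [("alice", -3), ("bob", 2), ("carol", 7)]

with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(lambda tx: apply_tx(state, tx), txs))

for account, new_balance in results:
    state[account] = new_balance

print(state)  # {'alice': 7, 'bob': 7, 'carol': 7}
```

The interesting case, covered later in this piece, is what happens when transactions are not independent and touch the same state.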

  • In 2015, Nano (XNO) implemented a block-lattice structure, where each account has its own blockchain, allowing for parallel processing and eliminating the need for network-wide transaction confirmation.

  • In 2018, the paper on Block-STM (Software Transactional Memory) parallel execution engine for blockchain networks was published, Polkadot approached parallelization through a multi-chain architecture, and EOS introduced their multi-threaded processing engine.

  • In 2020, Avalanche introduced parallel processing for its consensus (rather than a serialized EVM c-chain), and Solana introduced a similar innovation called Sealevel.

For EVM, since its inception, transactions and smart contract execution have always been carried out sequentially. This single-threaded execution design limits the throughput and scalability of the entire system, especially during periods of high network demand. As network validators face increased workloads, the network inevitably slows down, and users face higher costs as they compete to prioritize their transactions in congested network environments.

The Ethereum community has long discussed parallel processing as a solution, initially starting with Vitalik's EIP in 2017. The initial goal was to achieve parallelization through traditional sharding or fragmentation. However, with the rapid development and adoption of L2 rollups, which offer simpler and more immediate scalability benefits, Ethereum's focus has shifted from sharding to what is now referred to as danksharding. Through danksharding, shards are primarily used as a layer for data availability, rather than for parallel transaction execution. However, with the full implementation of danksharding yet to be achieved, attention has turned to several key alternative parallel L1 networks, which prominently feature EVM compatibility, particularly Monad, Neon EVM, and Sei.

Given the traditional evolution of software system engineering and the successful scalability of other networks, parallel execution of EVM is inevitable. While we are confident in this transition, the future beyond this remains uncertain but highly promising. The impact on the world's largest smart contract developer ecosystem, proud of its total locked value of over $80 billion, is significant. What will happen when gas prices plummet to a fraction of a cent due to optimized state access? How broad will the design space for application layer developers become? The following is our vision of what the post-parallel EVM world might look like.

Parallelization is a means, not an end

Scaling blockchain is a multidimensional problem, and parallel execution paves the way for the development of more critical infrastructure, such as blockchain state storage.

For projects committed to parallel EVM, the main challenge lies not only in enabling computations to run simultaneously, but in ensuring optimized state access and modification in a parallelized environment. The core of the problem lies in two main issues:

  • Ethereum clients and the Ethereum protocol use different data structures for storage (B-trees/LSM-trees versus the Merkle Patricia Trie); embedding one data structure within another leads to poor performance.

  • Asynchronous input/output (I/O) for transaction reads and updates is crucial under parallel execution; without it, processes can deadlock waiting on one another, wasting any speed gains.

Compared to the cost of retrieving or setting storage values, the cost of additional SHA-3 hashes or other computation is secondary. To reduce transaction processing time and gas prices, the database infrastructure itself must be improved. This goes beyond simply swapping in a traditional database architecture (i.e., a SQL database) to replace the original key-value store. Imposing the relational model on EVM state adds unnecessary complexity and overhead, making 'sload' and 'sstore' operations more expensive than with plain key-value storage. EVM state does not need features such as sorting, range scans, or transaction semantics: it only performs point reads and writes, with writes applied in a batch at the end of each block. Improvements should therefore focus on key requirements such as scalability, low-latency reads and writes, efficient concurrency control, state pruning and archiving, and seamless integration with the EVM. For example, Monad is building a custom state database, MonadDB, from scratch. It will leverage the latest kernel support for asynchronous operations while implementing the Merkle Patricia Trie data structure natively both on disk and in memory.
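The access pattern described above (point reads and writes only, with all of a block's writes landing together at block end) can be sketched as a minimal key-value store. This is an illustration of the pattern, not MonadDB's or any client's actual design:

```python
class BlockStateStore:
    """Minimal key-value state: point reads/writes, batched end-of-block commit.

    Illustrative only; real storage engines are far more sophisticated.
    """

    def __init__(self):
        self._committed = {}   # durable state as of the last block
        self._pending = {}     # writes buffered during the current block

    def sload(self, key):
        # Point read: pending writes shadow committed state.
        # Unset keys default to zero, mirroring EVM storage semantics.
        if key in self._pending:
            return self._pending[key]
        return self._committed.get(key, 0)

    def sstore(self, key, value):
        # Point write: buffered, not yet durable.
        self._pending[key] = value

    def commit_block(self):
        # All of a block's writes land together at the end of the block.
        self._committed.update(self._pending)
        self._pending.clear()

store = BlockStateStore()
store.sstore("balance:alice", 100)
print(store.sload("balance:alice"))  # 100 (visible within the block)
store.commit_block()
print(store.sload("balance:alice"))  # 100 (now durable)
```

Because there is no sorting, no range scan, and no cross-key transaction logic, a store like this can be optimized purely for concurrent point access, which is the crux of the argument above.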

We expect further reshaping of the underlying key-value database and significant improvements to third-party infrastructure supporting the majority of blockchain storage capabilities.

Making Programmable Central Limit Order Books (pCLOB) Great Again

As DeFi transitions to higher-fidelity states, CLOB will become the dominant design approach.

Since their debut in 2017, automated market makers (AMMs) have been a cornerstone of DeFi, offering simplicity and unique liquidity-bootstrapping capabilities. By leveraging liquidity pools and pricing algorithms, AMMs fundamentally transformed DeFi, becoming the best available alternative to traditional trading systems such as order books. Central limit order books (CLOBs), a fundamental building block of traditional finance, ran into blockchain scalability limits when introduced on Ethereum: they require a large number of transactions, since every order submission, execution, cancellation, or modification needs a new on-chain transaction. Given Ethereum's immature scaling efforts at the time, the associated costs made CLOBs impractical in the early days of DeFi, dooming early versions such as EtherDelta. Yet even as AMMs gained popularity, they faced inherent limitations of their own, and as DeFi attracted more sophisticated traders and institutions over the years, those limitations became increasingly apparent.

Recognizing the strengths of CLOBs, efforts to bring CLOB-based exchanges into DeFi have multiplied on alternative, more scalable blockchain networks. Protocols such as Kujira, Serum (RIP), Demex, dYdX, Dexalot, and more recently Aori and Hyperliquid aim to provide a better on-chain trading experience than their AMM counterparts. Beyond scalability, however, CLOBs on these alternative networks face their own set of challenges (with exceptions for projects targeting specific niches, such as dYdX and Hyperliquid for perpetual contracts):

  • Liquidity dispersion: The network effects achieved by highly composable and seamlessly integrated DeFi protocols on Ethereum make it difficult for other on-chain CLOBs to attract sufficient liquidity and trading volume, hindering their adoption and availability.

  • Meme coins: Bootstrapping on-chain liquidity for CLOBs requires limit orders, a harder chicken-and-egg problem for new and lesser-known assets such as meme coins.

CLOB with blob


But what about L2s? Existing Ethereum L2 stacks have significantly improved transaction throughput and gas costs compared to mainnet, especially after the recent Dencun hard fork. By replacing gas-intensive calldata with lightweight binary large objects (blobs), costs have fallen dramatically. According to growthepie data, as of April 1st, Arbitrum and OP Mainnet fees were $0.028 and $0.064 respectively, while Mantle was the cheapest at $0.015. This is a dramatic difference from before the Dencun upgrade, when calldata accounted for 70%-90% of costs. Unfortunately, this is still not cheap enough: a $0.01 submission or cancellation fee is still considered expensive. Institutional traders and market makers, for example, often have a high order-to-trade ratio, meaning they place far more orders than trades actually executed. Even at today's L2 fee pricing, paying to submit orders and then modify or cancel them across multiple ledgers can significantly impact the profitability and strategic decision-making of institutional participants. Imagine the following scenario:

Company A: The standard benchmark is 10,000 order submissions, 1,000 trades, and 9,000 cancellations or modifications per hour. If the company operates across 100 ledgers in a day, the total activity could easily exceed $150,000 in fees, even if each on-chain action costs less than $0.01.
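One plausible reading of the arithmetic behind this scenario (the eight-hour trading day is our assumption; the article does not state one):

```python
# Company A's hourly benchmark, per the scenario above.
submissions_per_hour = 10_000
cancels_or_mods_per_hour = 9_000
trades_per_hour = 1_000
actions_per_hour = (submissions_per_hour
                    + cancels_or_mods_per_hour
                    + trades_per_hour)  # 20,000 on-chain actions/hour

hours_per_day = 8        # assumed trading window, not stated in the article
ledgers = 100
fee_per_action = 0.01    # dollars; the "less than $0.01" upper bound

daily_fees = actions_per_hour * hours_per_day * ledgers * fee_per_action
print(f"${daily_fees:,.0f}")  # $160,000 -- consistent with "over $150,000"
```

Even small changes to the per-action fee swing this total by tens of thousands of dollars per day, which is why sub-cent fees matter so much to high order-to-trade participants.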

pCLOB


With the emergence of parallel EVM, we expect a surge in DeFi activity, primarily driven by the feasibility of on-chain CLOBs. But not just any CLOB - programmable Central Limit Order Books (pCLOB). Given that DeFi is inherently composable and can interact with an infinite number of protocols, it can create a large number of transaction permutations. Leveraging this, pCLOB can enable custom logic during the order submission process. This logic can be invoked before or after order submission. For example, pCLOB smart contracts can include custom logic to:

  • Validate order parameters based on predefined rules or market conditions (e.g., price and quantity)
  • Perform real-time risk checks (e.g., ensuring sufficient margin or collateral for leveraged trades)
  • Apply dynamic fee calculations based on any parameters (e.g., order type, trade volume, market volatility, etc.)
  • Execute conditional orders based on specified trigger conditions
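The hook points listed above could be sketched as follows. The class and hook names here are hypothetical, invented for illustration, and do not reflect any real pCLOB's API:

```python
class POrderBook:
    """Toy programmable order book: user-supplied hooks run around submission.

    All names are illustrative; no real pCLOB exposes exactly this interface.
    """

    def __init__(self, pre_hooks=None, post_hooks=None):
        self.pre_hooks = pre_hooks or []
        self.post_hooks = post_hooks or []
        self.orders = []

    def submit(self, order):
        for hook in self.pre_hooks:
            hook(order)              # e.g. validation, risk checks; may raise
        self.orders.append(order)
        for hook in self.post_hooks:
            hook(order)              # e.g. dynamic fees, conditional triggers

def check_params(order):
    # Validate order parameters against predefined rules.
    if order["price"] <= 0 or order["qty"] <= 0:
        raise ValueError("invalid price or quantity")

def dynamic_fee(order):
    # Apply a size-dependent fee (the rate is made up for illustration).
    order["fee"] = 0.001 * order["qty"] * order["price"]

book = POrderBook(pre_hooks=[check_params], post_hooks=[dynamic_fee])
book.submit({"side": "buy", "price": 100.0, "qty": 5})
print(book.orders[0]["fee"])  # 0.5
```

The point is that the logic lives inside the book's submission path rather than in a separate off-chain system, which is what the "programmable" in pCLOB refers to.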

This makes trading a significant step cheaper than existing designs.

The concept of Just-in-Time (JIT) liquidity illustrates this well. Liquidity need not sit idle on any single exchange; it can generate yield elsewhere until an order is matched and the liquidity is pulled from the underlying platform. Who wouldn't want to harvest every bit of yield on MakerDAO before sourcing liquidity for a trade? Mangrove Exchange's innovative "offer-is-code" approach hints at this potential: when a quote in an order is matched, the embedded code executes to find the requested liquidity for the order taker. It also underscores that L2 scalability and cost remain real constraints.

Parallel EVM also greatly enhances the pCLOB matching engine. A pCLOB can now implement a parallel matching engine, using multiple "channels" to process incoming orders and run matching calculations simultaneously. Each channel can handle a subset of the order book, executing only when a match is found, free of global price-time priority constraints. Lower latency between order submission, execution, and modification keeps the order book optimally up to date.
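A toy sketch of the multi-channel matching idea, sharding the book by trading pair so that channels never touch shared state (the data layout and one-unit orders are invented for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

def match_shard(orders):
    """Match buys against sells inside one shard of the book (one 'channel').

    Orders are treated as one-unit quotes to keep the sketch short.
    """
    buys = sorted((o for o in orders if o["side"] == "buy"),
                  key=lambda o: -o["price"])   # best bid first
    sells = sorted((o for o in orders if o["side"] == "sell"),
                   key=lambda o: o["price"])   # best ask first
    fills = []
    while buys and sells and buys[0]["price"] >= sells[0]["price"]:
        fills.append((buys.pop(0)["id"], sells.pop(0)["id"]))
    return fills

# Shard the book by instrument; each channel matches its shard concurrently.
book = {
    "ETH/USDC": [{"id": 1, "side": "buy", "price": 101.0},
                 {"id": 2, "side": "sell", "price": 100.0}],
    "WBTC/USDC": [{"id": 3, "side": "buy", "price": 99.0},
                  {"id": 4, "side": "sell", "price": 100.0}],
}

with ThreadPoolExecutor() as pool:
    fills = dict(zip(book, pool.map(match_shard, book.values())))

print(fills)  # {'ETH/USDC': [(1, 2)], 'WBTC/USDC': []}
```

Because each shard owns its slice of the book outright, no locking is needed between channels; the matching work scales with the number of independent shards.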

Keone Hon, Co-founder and CEO of Monad, stated: AMMs are expected to remain widely used for long-tail assets thanks to their ability to keep making markets when liquidity is thin; for "blue-chip" assets, however, pCLOBs will dominate.

In our discussion, Keone added that we can expect multiple pCLOBs to gain traction across different high-throughput ecosystems, and emphasized that thanks to lower fees, these pCLOBs will have a significant impact on the larger DeFi ecosystem.

Even with just a few of these improvements, we expect pCLOB to have a significant impact on capital efficiency and unlock new categories within DeFi.

More applications are needed, but first…

Existing and new applications need to be architecturally designed to fully leverage underlying parallelism.

Apart from pCLOB, current decentralized applications are not parallel; their interaction with the blockchain is sequential. However, history has shown that technology and applications naturally evolve to take advantage of new advancements, even if they were not initially considered.

Steven Landers, Blockchain Architect at Sei, stated: When the first iPhone was released, the applications designed for it looked like bad computer applications. The situation here is similar. We are adding multi-core to the blockchain, which will lead to better applications.

The evolution from showcasing magazine catalogs on the internet to today's powerful two-sided e-commerce marketplaces is a typical example. With the emergence of parallel EVM, we will witness a similar transformation of decentralized applications. This highlights a key limitation: applications that do not account for parallelism will not benefit from the efficiency gains of parallel EVM. Parallelism at the infrastructure layer alone is not enough without a redesign at the application layer; the two architectures must stay aligned.

State Contention

Even without any changes to the applications themselves, we still expect a performance improvement of 2-4x. But why stop there when far more is available? This shift introduces a key challenge: applications need to be fundamentally redesigned to accommodate the nuances of parallel processing.

Steven Landers, Blockchain Architect at Sei, stated: If you want to leverage throughput, you need to limit competition between transactions.

More specifically, when multiple transactions from decentralized applications simultaneously attempt to modify the same state, conflicts arise. Resolving conflicts requires serializing conflicting transactions, which offsets the benefits of parallelization.
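A minimal sketch of one way engines handle this, optimistic execution with read/write-set conflict detection in the spirit of Block-STM (heavily simplified; real engines differ substantially):

```python
def execute_optimistically(state, txs, num_rounds=10):
    """Toy optimistic parallel execution.

    Each round: run every pending tx against a snapshot, commit those whose
    read/write sets don't collide with state already written this round,
    and defer the rest to the next round (i.e., serialize only conflicts).
    """
    pending = list(txs)
    for _ in range(num_rounds):
        if not pending:
            break
        snapshot = dict(state)
        written = set()
        still_pending = []
        for tx in pending:                  # conceptually runs in parallel
            reads, writes = tx(snapshot)    # tx reports its read/write sets
            if written & (set(reads) | set(writes)):
                still_pending.append(tx)    # conflict: defer and re-execute
            else:
                state.update(writes)
                written |= set(writes)
        pending = still_pending
    return state

def mk_transfer(src, dst, amt):
    def tx(s):
        reads = [src, dst]
        writes = {src: s[src] - amt, dst: s[dst] + amt}
        return reads, writes
    return tx

# Two transfers on disjoint accounts commit in round one; the third touches
# the same keys as the first, so it is deferred and re-run in round two.
state = {"a": 10, "b": 0, "c": 5, "d": 0}
final = execute_optimistically(state, [mk_transfer("a", "b", 3),
                                       mk_transfer("c", "d", 2),
                                       mk_transfer("a", "b", 1)])
print(final)  # {'a': 6, 'b': 4, 'c': 3, 'd': 2}
```

Note how the conflicting transfer costs an extra round: the more transactions pile onto the same keys, the closer execution degrades back to the serial case, which is exactly the developer-facing problem described here.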

There are many conflict-resolution methods, which we won't cover here, but the number of conflicts encountered during execution depends largely on application developers. Even among decentralized applications, the most popular protocols such as Uniswap have not considered or implemented such constraints. 0xTaker, Co-founder of Aori, and I discussed in depth the major state contention that will arise in a parallel world. For an AMM, its peer-to-pool model means many participants may target a single pool at once. Anywhere from a few to over 100 trades may compete for the same state, so AMM designers will have to carefully consider how liquidity is distributed and managed in state to maximize the benefits of pooling.

Steven, a core developer at Sei, also emphasized the importance of considering competition in multi-threaded development and pointed out that Sei is actively researching the implications of parallelization and how to ensure full resource utilization.

Predictability of Performance

Yilong, Co-founder and CEO of MegaETH, also emphasized the importance of decentralized applications seeking predictability of performance. Predictability of performance refers to the ability of decentralized applications to consistently execute transactions over a period of time, unaffected by network congestion or other factors. One way to achieve this goal is through application-specific chains; however, while application-specific chains provide predictable performance, they sacrifice composability.

0xTaker, Co-founder of Aori, stated: Parallelization provides a way to experiment with local fee markets to minimize state contention.

Advanced parallelism and multi-dimensional fee mechanisms can provide more predictable performance for individual applications on a single blockchain while maintaining overall composability.

Solana has an elegant localized fee market: if multiple users access the same state, they pay higher fees for that state (surge pricing) rather than bidding against everyone in a global fee market. This approach is particularly advantageous for loosely coupled protocols that need both performance predictability and composability. To illustrate, consider a highway system with multiple lanes and dynamic tolls. During peak hours, the highway can allocate dedicated express lanes for vehicles willing to pay higher tolls, ensuring predictable, faster travel for those prioritizing speed, while regular lanes remain open to all vehicles, maintaining the overall connectivity of the system.
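The surge-pricing idea can be sketched as follows. The formula and parameters are invented for illustration and are not Solana's actual fee model:

```python
def local_fee(base_fee, access_counts, key, surge_step=0.5):
    """Toy local fee market: the fee for a state key scales with recent
    demand on that key alone, leaving uncontended keys at the base fee.
    """
    return base_fee * (1 + surge_step * access_counts.get(key, 0))

accesses = {"pool:HOT/USDC": 6}   # 6 txs hit the hot pool this slot
base = 0.0001

print(local_fee(base, accesses, "pool:HOT/USDC"))    # 0.0004 (surged)
print(local_fee(base, accesses, "pool:QUIET/USDC"))  # 0.0001 (base fee)
```

The key property is isolation: contention on one pool raises prices only for users of that pool, so the rest of the chain keeps predictable costs, which is the performance-predictability-with-composability trade discussed above.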

Considering All Possibilities

While redesigning protocols to be consistent with underlying parallelism may seem challenging, the design space that can be realized is significantly expanded in DeFi and other verticals. We can expect to see a new generation of applications that are more complex, efficient, and focused on use cases that were previously impractical due to performance limitations.


Keone Hon, Co-founder and CEO of Monad, stated: Going back to 1995, the only internet plan available charged $0.10 for every 1MB downloaded, and you chose carefully which websites to visit. Imagine the shift from that to unlimited data, and notice how people's behavior, and what became possible, changed.

We may well return to a scenario reminiscent of the early centralized-exchange wars for user acquisition, where DeFi applications, especially decentralized exchanges, wield referral programs (i.e., points, airdrops) and superior user experience as weapons. We see a world where every reasonable interaction in on-chain gaming could actually happen on-chain. Hybrid order book-AMMs already exist, but rather than decentralizing CLOB sequencers as standalone nodes through governance, they can now move fully on-chain, gaining decentralization, lower latency, and enhanced composability. Fully on-chain social interaction also becomes feasible. Frankly, anything involving large numbers of people or agents acting simultaneously is now on the table.

Beyond humans, intelligent agents are likely to dominate on-chain transaction flow even more than they do today. AI agents capable of arbitraging and autonomously executing trades have existed for some time, but their involvement will increase exponentially. Our thesis is that every form of on-chain participation will be augmented to some degree by artificial intelligence, and the latency requirements for agent trading will matter more than we imagine today.

Ultimately, technological progress is just a foundational enabling factor. The ultimate winners will depend on their ability to attract users and drive trading volume/liquidity better than their peers. The difference now is that developers have more resources to work with.

Cryptocurrency user experience has been terrible, now it won't be so terrible

User Experience Unification (UXU) is not only feasible but necessary, and the industry is definitely moving towards achieving this goal.

Today's blockchain user experience is fragmented and cumbersome: users must operate across multiple blockchains, wallets, and protocols, wait for transactions to finalize, and risk security vulnerabilities or hacks. The ideal future is for users to interact with their assets seamlessly and securely without worrying about the underlying blockchain infrastructure. We call the transition from today's fragmented user experience to a unified, simplified one User Experience Unification (UXU).

Fundamentally, improving blockchain performance, especially by reducing latency and costs, goes a long way toward fixing user experience. Historically, performance improvements have repeatedly lifted every aspect of our digital experience. Faster internet speeds, for example, not only enabled seamless online interactions but also drove demand for richer, more immersive digital content. Broadband and fiber made low-latency streaming of high-definition video and real-time online gaming possible, raising user expectations of digital platforms. The continuous pursuit of depth and quality spawned a stream of innovation toward the next big, eye-catching thing: from advanced interactive web content to complex cloud-based services to virtual and augmented reality experiences. Faster internet not only improved the online experience itself but also expanded the range of what users demand.

Similarly, improving blockchain performance will not only directly enhance user experience by reducing latency, but will also indirectly enhance it by giving rise to protocols that unify and elevate the overall experience; performance is the precondition for their existence. In particular, the higher throughput and lower gas fees of these networks, especially parallel EVMs, make on- and off-boarding more frictionless for end users, thereby attracting more developers. In a conversation with Sergey, Co-founder of the Axelar interoperability network, he envisioned a world that is not just truly interoperable but symbiotic.

Sergey stated: If you have complex logic on a high-throughput chain (e.g., parallel EVM), and because of its high performance, the chain itself can "absorb" the complexity and throughput requirements of that logic, then you can use interoperability solutions to effectively export that functionality to other chains.

Felix Madutsa, Co-founder of Orb Labs, stated: With scalability issues resolved and increased interoperability between different ecosystems, we will witness the emergence of protocols that bridge the Web3 user experience with Web2. Some examples include second-generation intent-based protocols, advanced RPC infrastructure, chain abstraction capabilities, and open computing infrastructure enhanced by artificial intelligence.

Other Aspects

As performance requirements increase, the oracle market will become lively.

Parallel EVM raises the performance requirements for oracles, a vertical that has lagged badly in recent years. Growing demand at the application layer will shake up a market rife with substandard performance and security, thereby improving DeFi's composability. For example, market depth and trading volume are two powerful inputs for many DeFi primitives, such as money markets. We expect large incumbents like Chainlink and Pyth to adapt relatively quickly as new entrants challenge their market share in this new era. After speaking with senior members of Chainlink, our views align: "Chainlink's consensus is that if parallel EVM becomes dominant, we may want to reshape our contracts to capture value from it (e.g., reduce dependencies between contracts so that transactions/calls do not depend on each other unnecessarily and are therefore not subject to MEV). However, because parallel EVM is designed to increase transparency and throughput for applications already running on the EVM, it should not affect network stability."

This indicates that Chainlink understands the impact of parallel execution on its products, and as previously emphasized, to take advantage of parallelism, they will have to reshape their contracts.

This is not just an L1 party; parallel EVM L2 also wants to join in.

From a technical perspective, building a high-performance parallel EVM L2 is easier than building an L1. In an L2, the sequencer setup is simpler than the consensus-based mechanisms (such as Tendermint and its variants) used in traditional L1 systems: sequencers in a parallel EVM L2 only need to maintain transaction ordering, whereas in a consensus-based L1 many nodes must agree on the order.

More specifically, we expect optimistic parallel EVM L2s to dominate their zero-knowledge peers in the short term. Ultimately, we expect a transition from OP-based rollups to zk-rollups via a general-purpose zero-knowledge framework (such as RISC Zero) rather than the approaches used in traditional zk-rollups; it is only a matter of time.

Is Rust taking the lead?

The choice of programming language will play a crucial role in the development of these systems. We lean towards Rust, and Reth, the Rust implementation of the Ethereum execution client, over the alternatives. This preference is not arbitrary: Rust offers many advantages over other languages, including memory safety without garbage collection, zero-cost abstractions, and a rich type system.

In our view, the competition between Rust and C++ is becoming an important competition in the new generation of blockchain development languages. Although this competition is often overlooked, it is not to be ignored. Language choice is crucial because it affects the efficiency, security, and versatility of the systems developers build.


Developers are the ones who will make these systems a reality, and their preferences and expertise will steer the industry's direction. We firmly believe Rust will ultimately win out. However, migrating from one implementation to another is no easy task: it requires significant resources, time, and expertise, which underscores the importance of choosing the right language from the start.

In the context of parallel execution, it would be remiss not to mention Move. While Rust and C++ often dominate the discussion, Move has several features that make it equally well suited:

  • Move introduces the concept of "resources," a type that can only be created, moved, or destroyed, but not copied. This ensures that resources are always uniquely owned, preventing common issues in parallel execution such as race conditions and data races.
  • Formal verification and static typing: Move is a statically typed language with an emphasis on safety. It includes features such as type inference, ownership tracking, and overflow checks, which help prevent common programming errors and vulnerabilities. These safety features are particularly important in a parallel execution environment, as errors may be more difficult to detect and reproduce. The language's semantics and type system are based on linear logic, similar to Rust and Haskell, making it easier to reason about the correctness of Move programs, so formal verification can ensure that concurrent operations are safe and correct.
  • Move advocates for a modular design approach, where smart contracts are composed of smaller, reusable modules. This modular structure makes it easier to understand the behavior of individual components and promotes parallel execution by allowing different modules to execute simultaneously.

Future Considerations: EVM Security Needs Improvement

While we have painted a very optimistic picture of the on-chain universe after parallel EVM, it all means nothing if the issues with EVM and smart contract security are not addressed.


While Ethereum's network economics and consensus security have held up, hackers exploited smart contract vulnerabilities in Ethereum DeFi protocols to steal over $1.3 billion in 2023. As a result, users prefer walled-garden CEXs or hybrid "decentralized" protocols with centralized validator sets, sacrificing decentralization for what is perceived as a more secure (and better-performing) on-chain experience.


The inherent lack of security features in EVM design is the root cause of these vulnerabilities.

Analogous to the aerospace industry, where strict safety standards have made air travel very safe, we see a stark contrast in the approach to security in the blockchain. Just as people value their lives above all else, the security of their financial assets is equally important. Key practices such as comprehensive testing, redundancy, fault tolerance, and strict development standards underpin the safety record of the aviation industry. These key features are currently widely lacking in EVM, and in most cases, in other VMs as well.

A potential solution is to adopt a dual VM setup, where a separate VM, such as CosmWasm, monitors the real-time execution of EVM smart contracts, similar to how antivirus software functions within an operating system. This structure allows for advanced checks, such as call stack checks, specifically designed to reduce hacking incidents. However, this approach requires significant upgrades to existing blockchain systems. We expect new, better-positioned solutions like Arbitrum Stylus and Artela to successfully implement this architecture from the start.

Existing security primitives in the market often react to upcoming or attempted threats by checking the memory pool or auditing/reviewing smart contract code. While these mechanisms are helpful, they fail to address potential vulnerabilities in VM design. A more effective and proactive approach must be taken to reshape and enhance the security of blockchain networks and their application layers.

We advocate for a thorough overhaul of blockchain VM architecture to embed real-time protection and other critical security features, possibly through a dual VM setup, to align with industries that have already successfully used this approach (such as the aerospace industry). Looking ahead, we are eager to support infrastructure enhancements that emphasize preventive approaches, ensuring progress in security matches industry progress in performance (i.e., parallel EVM).

Conclusion

The emergence of Parallel EVM marks a significant turning point in the evolution of blockchain technology. By implementing concurrent transaction execution and optimizing state access, Parallel EVM opens up new possibilities for decentralized applications. From the revival of programmable CLOB to the emergence of more complex and higher-performance applications, Parallel EVM lays the foundation for a more unified and user-friendly blockchain ecosystem. As the industry embraces this paradigm shift, we can expect to see a wave of innovation pushing the boundaries of decentralized technology. Ultimately, the success of this transition will depend on whether developers, infrastructure providers, and the broader community can adapt and align with the principles of parallel execution, seamlessly integrating the technology into our daily lives.

The emergence of Parallel EVM has the potential to reshape the landscape of decentralized applications and user experience. By addressing the scalability and performance limitations that have hindered the growth of critical verticals such as DeFi, it opens a door for complex, high-throughput applications to thrive without compromising on the trilemma.

Realizing this vision requires more than just infrastructure advancements. Developers must fundamentally rethink the architecture of their applications to align with the principles of parallel processing, minimizing state contention and maximizing performance predictability. Even so, in the bright future ahead, we must emphasize that beyond scalability, security prioritization is crucial.
