a16z: 17 Major Potential Trends in the Crypto Space for 2026

Covering agents and artificial intelligence, stablecoins, tokenization and finance, privacy and security, extending to prediction markets, SNARKs, and other applications.

Written by: Adeniyi Abiodun, Ali Yahya, Andrew Hall, Arianna Simpson, Christian Crowley, Daejun Park, Elizabeth Harkavy, Guy Wuollet, Jeremy Zhang, Justin Thaler, Maggie Hsu, Miles Jennings, Pyrs Carvolth, Robert Hackett, Sam Broner, Scott Duke Kominers, Sean Neville, Shane Mac, and Sonal Chokshi

Translated by: Saoirse, Foresight News

This week, a16z released its annual "Big Ideas" report, featuring insights from partners across its Apps, American Dynamism, Bio, Crypto, Growth, Infra, and Speedrun teams. Below are 17 observations on industry trends for 2026 from partners on the a16z crypto team (plus a few guest contributors)—covering agents and artificial intelligence, stablecoins, tokenization and finance, privacy and security, extending to prediction markets, SNARKs, and other applications, and closing with directions for industry development.

On Stablecoins, RWA Tokenization, Payments, and Finance

1. Higher Quality, More Flexible Stablecoin On/Off Ramps

Last year, stablecoin transaction volume was estimated at $46 trillion, repeatedly setting new all-time highs. For perspective, that is more than 20 times PayPal's transaction volume and nearly 3 times that of Visa, one of the world's largest payment networks—and it is rapidly approaching the volume of the U.S. Automated Clearing House (ACH), the electronic network that processes transactions like direct deposits in the U.S.

Today, sending a stablecoin takes less than 1 second, with transaction fees of less than 1 cent. However, the core unresolved issue is: how to connect these "digital dollars" with the financial systems people use daily—namely, the "on/off ramp" for stablecoins.

A new generation of startups is filling this gap, driving the integration of stablecoins with more widespread payment systems and local currencies: some companies are using cryptographic proof technology to allow users to privately convert local currency balances into digital dollars; others are integrating regional networks to facilitate interbank transfers using QR codes and real-time payment channels; and some are creating a truly interoperable global wallet layer and card issuance platform that allows users to spend stablecoins directly at everyday merchants. These solutions collectively expand the participation in the digital dollar economy and may accelerate the adoption of stablecoins as mainstream payment tools.

As on/off ramps mature and digital dollars are directly integrated into local payment systems and merchant tools, new application scenarios will emerge: cross-border workers can receive payments in real-time, merchants can accept global dollars without needing a bank account, and applications can instantly settle value with global users. At that point, stablecoins will completely transform from "niche financial tools" to "the foundational settlement layer of the internet."

—— Jeremy Zhang, a16z Crypto Engineering Team

2. Reconstructing RWA Tokenization and Stablecoins with "Crypto-Native Thinking"

Currently, banks, fintech companies, and asset managers are showing strong interest in bringing traditional assets on-chain—U.S. equities, commodities, indices, and more. However, as more traditional assets move on-chain, their tokenization often falls into a "skeuomorphism trap": replicating the existing forms of real-world assets while failing to leverage crypto-native advantages.

Synthetic derivatives like perpetual futures can provide deeper liquidity and are easier to implement. The leverage mechanism of perpetual contracts is also easy to understand, making them the crypto-native derivative with the clearest product-market fit. Emerging-market stocks are among the asset classes best suited to "perpetualization" (for some stocks, liquidity in zero-days-to-expiration options already exceeds that of the spot market, making perpetualization a worthwhile experiment).

This is essentially a choice between "fully on-chain vs. tokenization," but regardless, we will see more "crypto-native" RWA tokenization solutions in 2026.

Similarly, stablecoins entered the mainstream in 2025, with outstanding supply continuing to grow; in 2026, the stablecoin space will shift from "pure tokenization" to "innovative issuance models." Stablecoins that lack a robust credit infrastructure resemble "narrow banks"—holding only highly secure, liquid assets. While the narrow-bank model is reasonable, it is unlikely to become a core pillar of the on-chain economy in the long run.

Currently, several new asset managers and protocols are beginning to explore on-chain lending backed by off-chain collateral, but such loans are typically originated off-chain and then tokenized. I believe the value of tokenization in this model is quite limited, serving only users who are already in the on-chain ecosystem. Debt should instead be originated directly on-chain rather than originated off-chain and tokenized afterward—on-chain origination reduces loan servicing and back-office costs and improves accessibility. Compliance and standardization remain challenges, but developers are actively working to address them.
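To illustrate why on-chain origination can cut servicing costs, here is a toy loan whose terms live in code, so interest accrual and repayment are computed by the "contract" itself rather than a back office. The terms, class, and repayment logic are invented for illustration, not any real protocol's design.

```python
class OnchainLoan:
    """Toy loan originated on-chain: terms are code, servicing is computation."""

    def __init__(self, principal: float, annual_rate: float):
        self.principal = principal
        self.rate = annual_rate
        self.accrued = 0.0  # interest accrued but not yet paid

    def accrue(self, days: int) -> None:
        # Simple (non-compounding) interest, computed by code, not a back office.
        self.accrued += self.principal * self.rate * days / 365

    def repay(self, amount: float) -> float:
        # Toy waterfall: pay accrued interest first, then principal.
        self.accrued -= amount
        if self.accrued < 0:
            self.principal += self.accrued  # leftover payment reduces principal
            self.accrued = 0.0
        return self.principal + self.accrued  # remaining balance

loan = OnchainLoan(1_000.0, 0.10)
loan.accrue(365)            # one year of simple interest: 100.0
print(loan.repay(100.0))    # 1000.0 -- interest cleared, principal intact
```

In a real deployment these rules would live in a smart contract, so every accrual and repayment is verifiable on-chain rather than reconciled manually.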

—— Guy Wuollet, a16z Crypto General Partner

3. Stablecoins Drive Bank Ledger Upgrades, Unlocking New Payment Scenarios

Most of the software currently used by banks is nearly "unrecognizable" to modern developers: in the 1960s and 70s, banks were early adopters of large software systems; in the 80s and 90s, second-generation core banking software emerged (such as Temenos's GLOBUS and Infosys's Finacle). However, these systems have gradually aged, and their update speed is extremely slow—today, the banking industry (especially core ledger systems, which are key databases recording deposits, collateral, and other liabilities) still often relies on mainframes, using COBOL programming and batch file interfaces instead of APIs.

The vast majority of global assets are stored in these "decades-old core ledgers." Although these systems have been validated through long-term practice, received regulatory approval, and are deeply integrated into complex banking business scenarios, they severely hinder innovation: adding key features like real-time payments (RTP) can take months or even years, and must contend with layers of technical debt and regulatory complexity.

The value of stablecoins shows up here: over the past few years, stablecoins have not only achieved "product-market fit" and entered the mainstream, but in 2025, traditional finance (TradFi) institutions fully embraced them. Stablecoins, tokenized deposits, tokenized government bonds, and on-chain bonds let banks, fintech companies, and financial institutions develop new products and serve new customers—crucially, without forcing these institutions to rebuild their "aging but decades-stable" legacy systems. Stablecoins provide a "low-risk innovation path" for financial institutions.

—— Sam Broner

4. The Internet Will Become the "Next Generation Bank"

As AI agents become widely adopted, more business activities will be "automatically completed in the background" (rather than relying on user clicks), which means that the "flow of value (currency)" must change accordingly.

In a world where "systems act on intent" (rather than step-by-step instructions)—for example, AI agents automatically transfer funds after identifying needs, fulfilling obligations, or triggering outcomes—the flow of value must possess "the same speed and freedom as current information flows." Blockchain, smart contracts, and new protocols are key to achieving this goal.

Today, smart contracts can complete global dollar payments in seconds; by 2026, emerging foundational protocols like x402 will enable "settlements to be programmable and responsive": agents can pay for data, GPU computing power, or API calls instantly and without permission, without invoicing, reconciliation, or batch processing; software updates released by developers can embed payment rules, limits, and audit trails without needing fiat integration, merchant onboarding, or reliance on banks; prediction markets can "automatically settle in real-time" as events unfold—odds updates, agent trades, and global payouts can be completed in seconds without the need for custodians or exchanges.
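To make the pay-per-call flow above concrete, here is a minimal sketch of an x402-style loop: request, receive 402 Payment Required plus payment requirements, pay, retry. The `X-PAYMENT` header name, the HMAC "signature," and the shared-secret stub server are simplifying assumptions for illustration—a real implementation settles the payment on-chain rather than with a shared key.

```python
import hashlib
import hmac
import json

AGENT_KEY = b"agent-demo-key"  # stand-in for a wallet's signing key (toy)

def sign_payment(requirements: dict) -> str:
    """Agent side: 'sign' the server's payment requirements (toy HMAC in
    place of an on-chain payment payload)."""
    payload = json.dumps(requirements, sort_keys=True).encode()
    return hmac.new(AGENT_KEY, payload, hashlib.sha256).hexdigest()

def api_server(headers: dict) -> tuple[int, dict]:
    """Stub API charging $0.001 per call. A shared-secret check stands in
    for verifying an on-chain settlement."""
    requirements = {"amount": "0.001", "asset": "USDC", "pay_to": "0xMERCHANT"}
    expected = hmac.new(AGENT_KEY,
                        json.dumps(requirements, sort_keys=True).encode(),
                        hashlib.sha256).hexdigest()
    if headers.get("X-PAYMENT") == expected:
        return 200, {"data": "gpu-price-feed"}
    return 402, requirements  # 402 Payment Required, requirements attached

# Agent loop: call, get 402 + requirements, pay, retry -- no invoicing,
# no reconciliation, no batch processing.
status, body = api_server({})
if status == 402:
    status, body = api_server({"X-PAYMENT": sign_payment(body)})
print(status, body)  # 200 {'data': 'gpu-price-feed'}
```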

When value can flow in this way, the "payment process" will no longer be an independent operational layer but will become "network behavior": banks will integrate into internet infrastructure, and assets will become infrastructure. If currency can flow like "internet-routable data packets," the internet will no longer be "supporting the financial system," but will "itself become the financial system."

—— Christian Crowley, Pyrs Carvolth, a16z Crypto Market Expansion Team

5. Wealth Management Services Accessible to Everyone

Traditionally, personalized wealth management services have only been available to banks' "high-net-worth clients": customized advice and portfolio adjustments across asset classes are costly and operationally complex. However, as more asset classes are tokenized, crypto channels enable personalized strategies of "AI recommendations + assisted decision-making" to be "executed instantly and rebalanced at low cost."

This goes far beyond robo-advisory: everyone can access "active portfolio management" (rather than just passive management). In 2025, traditional financial institutions increased their allocations to crypto assets in portfolios (banks recommend a direct allocation of 2%–5%, or exposure through exchange-traded products (ETPs)), but this is just the beginning; in 2026, we will see the rise of platforms aimed at wealth accumulation (rather than just wealth preservation)—fintech companies like Revolut and Robinhood, as well as centralized exchanges like Coinbase, will use their tech stacks to seize this market.

At the same time, DeFi tools like Morpho Vaults can automatically allocate assets to "risk-adjusted return optimal" lending markets, providing "core yield allocation" for portfolios. Holding idle liquid funds in stablecoins (rather than fiat) and in tokenized money market funds (rather than traditional money funds) can further expand yield opportunities.
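A toy sketch of the allocation logic such vaults embody—routing deposits to the lending market with the best risk-adjusted return. The market names, yields, and risk discounts are invented for illustration and not drawn from Morpho or any real protocol.

```python
# Hypothetical lending markets: each offers a yield (APY) and carries an
# assumed risk discount (e.g. for collateral quality or utilization risk).
markets = [
    {"name": "market-A", "apy": 0.08, "risk_discount": 0.030},
    {"name": "market-B", "apy": 0.06, "risk_discount": 0.005},
    {"name": "market-C", "apy": 0.11, "risk_discount": 0.070},
]

def risk_adjusted(market: dict) -> float:
    """Net yield after subtracting the market's assumed risk discount."""
    return market["apy"] - market["risk_discount"]

# The vault allocates to the market with the highest risk-adjusted return --
# note this is NOT the market with the highest headline APY.
best = max(markets, key=risk_adjusted)
print(best["name"])  # market-B
```

The point of the example: a naive allocator would chase market-C's 11% headline rate, while the risk-adjusted rule picks the quieter market-B.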

Finally, tokenization, while meeting compliance and reporting requirements, also makes it easier for retail investors to access "illiquid private market assets" (such as private credit, pre-IPO company equity, and private equity). When various asset classes in a balanced portfolio (from bonds to stocks to private and alternative assets) are all tokenized, rebalancing can be automatically completed without wire transfers.

—— Maggie Hsu, a16z Crypto Market Expansion Team

On Agents and AI

6. From KYC to KYA

Currently, the bottleneck of the "agent economy" is shifting from "intelligence level" to "identity recognition."

In the financial services sector, the number of "non-human identities" (such as AI agents) has reached 96 times that of human employees, but these identities remain "ghosts that cannot access the banking system"—the core missing foundational capability is KYA (Know Your Agent).

Just as humans need credit scores to obtain loans, agents also require "cryptographic signature credentials" to complete transactions—credentials that must be associated with the agent's "principal," "constraints," and "liability." If this issue remains unresolved, merchants will continue to block agents at the firewall level. The industries that have built KYC infrastructure over the past few decades now need to tackle the KYA challenge within months.
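As an illustration of what such a credential might look like, here is a toy sketch: an issuer signs a credential binding an agent to its principal, a spend limit, and an expiry, and a merchant verifies the signature and constraints before accepting a transaction. The HMAC signature, field names, and shared issuer key are stand-ins for a real cryptographic credential scheme, invented for illustration.

```python
import hashlib
import hmac
import json

ISSUER_KEY = b"kya-issuer-demo-key"  # stand-in for an issuer's signing key

def issue_credential(principal: str, spend_limit: int, expires_at: int):
    """Issuer side: bind the agent to its principal, constraints, and expiry."""
    cred = {"principal": principal, "spend_limit": spend_limit,
            "expires_at": expires_at}
    payload = json.dumps(cred, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return cred, sig

def merchant_accepts(cred: dict, sig: str, amount: int, now: int) -> bool:
    """Merchant side: verify the signature, then enforce the constraints."""
    payload = json.dumps(cred, sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(sig, expected)
            and now < cred["expires_at"]
            and amount <= cred["spend_limit"])

cred, sig = issue_credential("alice", spend_limit=50, expires_at=2_000_000_000)
print(merchant_accepts(cred, sig, amount=20, now=1_700_000_000))   # True
print(merchant_accepts(cred, sig, amount=500, now=1_700_000_000))  # False
```

With a check like this at the edge, a merchant can admit agents selectively instead of blocking them all at the firewall.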

—— Sean Neville, Co-founder of Circle, Architect of USDC, CEO of Catena Labs

7. AI Will Empower "Substantive Research Tasks"

As a mathematical economist, in January 2025, I struggled to get consumer-grade AI models to understand my workflow; but by November, I was able to send abstract tasks to AI models as if I were giving instructions to PhD students—sometimes they even returned "innovative and correctly executed" results. Beyond my personal experience, the application of AI in research is gradually becoming widespread, especially in the "reasoning domain": AI not only assists directly in discovery but can also "autonomously solve Putnam problems" (considered the hardest university-level math exam in the world).

What still needs exploration is which fields these research-assistance capabilities are most valuable in, and how to apply them. I anticipate, however, that AI will give rise to and reward a new kind of "polymath research model"—one that rewards the ability to speculate about connections between ideas and to iterate quickly from highly speculative answers. Those answers may not be accurate, but they can point in the right direction (at least within a specific logical framework). Ironically, this amounts to harnessing the power of model hallucination: when a model is sufficiently intelligent, granting it room for abstract exploration may yield nonsense, but it may also yield key breakthroughs—just as humans are most creative in non-linear states not aimed at explicit goals.

To achieve this reasoning model, a "new type of AI workflow" needs to be constructed—not just "interactions between agents," but also "agents nested within agents": multi-layered models assist researchers in evaluating "the methods of predecessor models," gradually filtering effective information and eliminating ineffective content. I have written papers using this method, while others have used it for patent searches, creating new forms of art, and even (regrettably) discovering new types of attacks on smart contracts.

But it is important to note: to run a "nested reasoning agent cluster" to support research, two key issues must be resolved—"interoperability between models" and "identifying and reasonably compensating each model's contributions"—and cryptographic technology can provide solutions for this.

—— Scott Duke Kominers, Member of a16z Crypto Research Team, Professor at Harvard Business School

8. The "Invisible Tax" of Open Networks

The rise of AI agents is imposing an "invisible tax" on open networks, fundamentally undermining their economic foundation. This disruption stems from the increasing misalignment between the "context layer" and the "execution layer" of the internet: currently, AI agents extract data from "advertising-supported websites" (context layer), providing convenience to users while systematically bypassing the "revenue sources that support content creation" (such as advertising and subscriptions).

To avoid the decline of open networks (while protecting the diverse content that "fuels AI"), large-scale deployment of "technology + economic" solutions is needed, such as "next-generation sponsored content," "micro-attribution systems," or other new funding models. Existing AI licensing agreements are essentially "financially unsustainable stopgap measures"—compensation for content providers often amounts to only a small fraction of the revenue they lose due to AI diverting traffic.

Open networks require "new technological economic models for automatic value flow." A key shift in 2026 will be: moving from "static licensing" to "real-time, usage-based payments." This means testing and scaling "blockchain-based micropayments + precise attribution standards" systems—automatically providing rewards to "all parties contributing to agents completing tasks."
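A minimal sketch of the "micro-attribution" idea described above: a per-task payment split among contributing sources by attribution weight. The source names and weights are invented; a production system would settle these amounts as on-chain micropayments under an agreed attribution standard.

```python
def split_payment(total_cents: float, attributions: dict) -> dict:
    """Split one agent-task payment across content sources in proportion
    to their (assumed) attribution weights."""
    weight_sum = sum(attributions.values())
    return {source: round(total_cents * weight / weight_sum, 4)
            for source, weight in attributions.items()}

# Hypothetical: a 10-cent task drew half its context from a news site,
# 30% from a forum thread, and 20% from documentation.
payout = split_payment(10, {"news-site": 0.5, "forum-thread": 0.3, "docs": 0.2})
print(payout)  # {'news-site': 5.0, 'forum-thread': 3.0, 'docs': 2.0}
```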

—— Elizabeth Harkavy, Member of a16z Crypto Investment Team

On Privacy and Security

9. Privacy Will Become the "Most Important Moat" in the Crypto Space

Privacy is a key prerequisite for "global financial on-chain," but currently, almost all blockchains lack this feature—privacy is merely an "afterthought" for most chains.

Today, "privacy capabilities" are sufficient for a chain to stand out among numerous competitors; more importantly, privacy can "create a chain lock-in effect," which can be termed the "privacy network effect"—especially in a time when "competing solely on performance is no longer enough."

Thanks to cross-chain bridge protocols, as long as data is public, migrating between different chains is quite easy; but once privacy is involved, the situation changes completely: "transferring tokens across chains is easy, but transferring secrets across chains is difficult." When entering or exiting a "privacy zone," observers of the chain, memory pool, or network traffic may identify user identities; and transferring assets between "privacy chains and public chains" or "even between two privacy chains" can leak metadata such as transaction time and amount correlation, increasing the risk of user tracking.
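The metadata leak described above can be illustrated with a toy correlation attack: an observer matches exits from a privacy zone to earlier entries by amount and timing alone, without seeing anything inside the zone. The addresses, amounts, and timestamps are invented for illustration.

```python
# Publicly observable events at the boundary of a privacy zone.
entries = [("0xAlice", 100.0, 10), ("0xBob", 250.0, 12)]  # (addr, amount, t)
exits = [(300.0, 14), (100.0, 15), (250.0, 16)]           # (amount, t)

def link(entries: list, exits: list, window: int = 10) -> list:
    """Heuristic deanonymization: same amount, exit shortly after entry."""
    links = []
    for amount, t_out in exits:
        for addr, amount_in, t_in in entries:
            if amount_in == amount and 0 < t_out - t_in <= window:
                links.append((addr, amount))  # likely the same user
    return links

print(link(entries, exits))  # [('0xAlice', 100.0), ('0xBob', 250.0)]
```

Even this crude heuristic re-identifies both users, which is why crossing in and out of privacy zones (or between privacy chains) is where tracking risk concentrates.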

Currently, many "undifferentiated new chains" have driven transaction fees close to zero due to competition (on-chain space has essentially become homogeneous); however, blockchains with privacy capabilities can build stronger "network effects." The reality is: if a "general-purpose chain" lacks a thriving ecosystem, killer applications, or unique distribution advantages, users and developers have no reason to choose it or build on it, let alone exhibit loyalty.

On public chains, users can easily transact with users from other chains, making the choice of which chain to use irrelevant; but on privacy chains, "which chain to choose" is crucial—once a user joins a certain privacy chain, they may be reluctant to migrate due to concerns about identity exposure, creating a "winner-takes-all" scenario. Since privacy is a necessity in most real-world scenarios, a few privacy chains may dominate the crypto space.

—— Ali Yahya, a16z Crypto General Partner

10. The (Near) Future of Instant Messaging: Not Just Quantum-Resistant, but Decentralized

As the world prepares for the "quantum computing era," encrypted messaging providers—Apple (iMessage), Signal, and WhatsApp—have taken the lead, with significant results. But there is a problem: all mainstream communication tools rely on privately operated servers controlled by a single entity—servers that easily become targets for governments to shut down, backdoor, or compel into handing over private data.

If a country can shut down servers, a company holds the private server keys, or even the company itself owns the private servers, then what is the significance of "quantum-resistant encryption"? Private servers require users to "trust me," while "no private servers" means "you don't have to trust me." Communication should not require intermediaries (single entities) but should rely on "open protocols that require no trust in any party."

The path to achieving this goal is "network decentralization": no private servers, no single applications, fully open-source code, and employing "top-tier encryption technology" (including resistance to quantum threats). In an open network, no individual, company, nonprofit organization, or state can deprive people of their right to communicate—if a country or company shuts down an application, 500 new versions will appear the next day; even if a node is shut down, the economic incentives brought by technologies like blockchain will ensure that new nodes immediately take their place.

When people "control messages with keys" (just as they control funds), everything will change: applications may iterate, but users will always control their messages and identities—even if they stop using a certain application, the ownership of the messages will still belong to the users.

This is not just about "quantum resistance" and "encryption," but also about "ownership" and "decentralization." Without these two elements, what we build is merely "unbreakable encryption that can be shut down at any time."

—— Shane Mac, Co-founder and CEO of XMTP Labs

11. "Secrets as a Service"

Behind every model, agent, and automated system lies a simple foundation: data. However, most data pipelines today—whether feeding models or carrying their outputs—are opaque, tamperable, and unauditable. This may matter little for some consumer applications, but finance, healthcare, and many other industries require companies to protect sensitive data; it is also a major barrier for institutions currently advancing the tokenization of real-world assets.

So, how can we achieve innovation that is secure, compliant, autonomous, and globally interoperable while ensuring privacy? There are many solutions to this problem, but I will focus on "data access control": who controls sensitive data? How does data flow? Who (or what entity) has the right to access the data?

Without a data access control mechanism, any entity wishing to protect data confidentiality must either rely on centralized services or build customized systems—this approach is not only time-consuming and costly but also hinders traditional financial institutions and other entities from fully utilizing the functionalities and advantages of on-chain data management. Furthermore, as agent systems begin to autonomously browse information, complete transactions, and make decisions, users and institutions across industries will require "cryptographic-level guarantees," rather than "good faith trust commitments."

For this reason, I believe we need "Secrets as a Service": leveraging new technologies to achieve programmable native data access rules, client-side encryption, and decentralized key management—clearly specifying who can decrypt which data under what conditions and for how long, with all rules enforced on-chain. Combined with verifiable data systems, "data confidentiality protection" will become part of the public infrastructure of the internet, rather than a patch added at the application level, truly making privacy a core infrastructure.
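A toy sketch of this pattern: data is encrypted client-side, and a key-management layer releases the key only when an access policy (who may decrypt, for what purpose, until when) is satisfied. The SHA-256 counter-mode "cipher" is a stdlib stand-in used purely for illustration—not a vetted cipher—and the policy fields are assumptions; a real system would use audited encryption and enforce the policy on-chain with decentralized key management.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    """Toy SHA-256 counter-mode keystream. Illustration only."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, nonce, len(data))))

decrypt = encrypt  # XOR stream cipher: encryption and decryption coincide

# Access policy: who can decrypt, under what condition, for how long.
policy = {"reader": "auditor-agent", "purpose": "compliance",
          "expires": 2_000_000_000}

def release_key(requester: str, purpose: str, now: int, key: bytes):
    """Key-management side: hand out the key only if the policy allows."""
    if (requester == policy["reader"] and purpose == policy["purpose"]
            and now < policy["expires"]):
        return key
    return None  # policy violated: no key, no plaintext

key, nonce = secrets.token_bytes(32), secrets.token_bytes(16)
blob = encrypt(key, nonce, b"account balance: 42")  # encrypted client-side
granted = release_key("auditor-agent", "compliance", 1_700_000_000, key)
print(decrypt(granted, nonce, blob))  # b'account balance: 42'
```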

—— Adeniyi Abiodun, Chief Product Officer and Co-founder of Mysten Labs

12. From "Code is Law" to "Norms are Law"

Recent incidents of DeFi hacking have affected protocols that have been rigorously tested over many years, with strong teams and stringent auditing processes. These events reveal a disturbing reality: current mainstream security practices largely remain at the level of "experience-based judgment" and "case handling."

To mature, DeFi security needs two shifts: from patching vulnerabilities to guaranteeing design-level properties, and from best-effort protection to principled, systematic protection. This can be approached from two directions:

Static/pre-deployment phase (testing, auditing, formal verification): systematically prove global invariants (the core rules the entire system must always satisfy), rather than merely verifying manually selected local rules. Several teams are already developing AI-assisted proof tools that help write specifications, propose candidate invariants, and dramatically reduce the proof-engineering work that previously had to be done by hand—work that was extremely costly and hard to scale.

Dynamic/post-deployment phase (runtime monitoring, runtime enforcement, etc.): the invariants above can be turned into real-time guardrails that serve as the last line of defense. These guardrails are encoded directly as runtime assertions, and every transaction must satisfy them to execute.

In this way, we no longer need to assume that "all vulnerabilities have been patched," but instead enforce key security attributes through the code itself—any transaction that violates these attributes will be automatically rejected.
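A minimal sketch of runtime invariant enforcement in this spirit: a toy ledger rejects and rolls back any transaction that violates a supply-conservation invariant, regardless of how the transaction tries to break it. The ledger class and the invariant are illustrative, not any real protocol's design.

```python
class Ledger:
    """Toy ledger whose invariant -- total supply is conserved -- is
    enforced as a runtime assertion on every transaction."""

    def __init__(self, balances: dict):
        self.balances = dict(balances)
        self.total_supply = sum(balances.values())

    def _invariant(self) -> bool:
        return sum(self.balances.values()) == self.total_supply

    def execute(self, tx) -> None:
        snapshot = dict(self.balances)
        tx(self.balances)             # apply arbitrary transaction logic
        if not self._invariant():     # runtime assertion: last line of defense
            self.balances = snapshot  # revert the state change
            raise ValueError("invariant violated: supply not conserved")

ledger = Ledger({"alice": 100, "bob": 50})

def honest_transfer(balances):   # moves 30 from alice to bob
    balances["alice"] -= 30
    balances["bob"] += 30

def exploit(balances):           # mints tokens out of thin air
    balances["mallory"] = balances.get("mallory", 0) + 1_000_000

ledger.execute(honest_transfer)  # conserves supply: allowed
try:
    ledger.execute(exploit)      # violates the invariant: rejected, rolled back
except ValueError as err:
    print(err)
print(ledger.balances)  # {'alice': 70, 'bob': 80}
```

The design point: the check does not need to know *how* the exploit works, only which property every valid state must satisfy.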

This is not theoretical fantasy. In fact, almost all hacks to date would have tripped such security checks during execution, which could have stopped the attacks. The once-popular idea of "code is law" is therefore evolving into "norms are law": even when mounting novel attacks, attackers must still respect the core security properties that preserve system integrity, and the attack paths that remain are either low-impact or extremely hard to execute.

—— Daejun Park, a16z Crypto Engineering Team

About Other Industries and Applications

13. Prediction Markets: Larger Scale, Broader Coverage, Higher Intelligence

Prediction markets have entered the mainstream spotlight, and in 2026, with deeper integration of cryptographic technology and AI, they will grow larger, cover more topics, and become more intelligent—while also presenting significant new challenges for developers.

First, prediction markets will launch more contracts. This means we will not only be able to obtain real-time odds for "major elections and geopolitical events," but also for various segmented outcomes and complex cross-events. As these new contracts continuously release information and integrate into the news ecosystem (a trend that is already emerging), society will face important questions: how to balance the value of this information? How to enhance the transparency and auditability of prediction markets through optimized design (which can be achieved with cryptographic technology)?

To cope with the significant increase in the number of contracts, new "consensus mechanisms" need to be established for contract settlement. While centralized platform settlement (confirming whether an event has actually occurred and how to verify it) is important, controversial cases like the "Zelensky Litigation Market" and the "Venezuela Election Market" expose its limitations. To address these edge cases and promote the expansion of prediction markets into more practical scenarios, new decentralized governance mechanisms and large language model (LLM) oracles can assist in determining the authenticity of disputed outcomes.
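One way such decentralized settlement might work is a supermajority over independent verdicts—whether from LLM oracles or governance voters—with disputed cases escalated rather than forced. A toy sketch, with invented verdicts and an assumed two-thirds threshold:

```python
from collections import Counter

def settle(verdicts: list, threshold: float = 2 / 3) -> str:
    """Settle a disputed market outcome by supermajority of independent
    verdicts; escalate (e.g. to further governance) if no supermajority."""
    tally = Counter(verdicts)
    outcome, votes = tally.most_common(1)[0]
    return outcome if votes / len(verdicts) >= threshold else "escalate"

print(settle(["YES", "YES", "NO", "YES", "YES"]))  # YES  (4/5 >= 2/3)
print(settle(["YES", "NO", "NO", "YES"]))          # escalate  (2/4 < 2/3)
```

Requiring a supermajority rather than a bare majority is what keeps genuinely contested cases (like the disputed markets mentioned above) out of automatic settlement.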

In addition to LLM oracles, AI also brings more possibilities to prediction markets. For example, AI agents trading on prediction platforms can widely collect various signals to gain short-term trading advantages, thereby providing new insights for understanding the world and predicting future trends (projects like Prophet Arena have already demonstrated the potential in this field). These agents can not only serve as "high-level political analysts" for people to consult insights but also help us identify the core factors influencing complex social events through the analysis of their independently formed strategies.

Will prediction markets replace polls? The answer is no. On the contrary, they can enhance the quality of polls (polling information can also be integrated into prediction markets). As a political scientist, what I look forward to most is the collaborative development of prediction markets and a "rich and vibrant polling ecosystem"—but this requires reliance on new technologies: AI can optimize the survey experience; cryptographic technology can provide new ways to prove that poll respondents are real humans rather than bots.

—— Andrew Hall, a16z Crypto Research Advisor, Professor of Political Economy at Stanford University

14. The Rise of Staked Media

Traditional media models tout "objectivity," but their drawbacks have long been evident. The internet has given everyone a voice, and more and more practitioners, doers, and builders are directly conveying their views to the public—reflecting their own "interest affiliations" in the world. Ironically, audiences respect them not "despite their interest affiliations," but "because of their interest affiliations."

What is new here is not the rise of social media, but the emergence of cryptographic tools that enable people to make publicly verifiable commitments. As AI makes generating vast amounts of content cheap and easy (content can be generated from any perspective or persona—regardless of authenticity), mere statements by humans (or bots) are no longer convincing. Tokenized assets, programmable lock-ups, prediction markets, and on-chain histories provide a more solid basis for trust: commentators can prove they are putting their money where their mouth is; podcasters can lock tokens to prove they won't opportunistically flip their stance or "pump and dump"; analysts can bind predictions to publicly settled markets, building auditable track records.

This is what I refer to as the early form of "staked media": this type of media not only embraces the idea of "interest relevance" but also provides tangible evidence. In this model, credibility does not come from "pretending to be neutral," nor from "unfounded claims," but from "publicly transparent and verifiable interest commitments." Staked media will not replace other forms of media but will complement the existing media ecosystem. It sends a new signal: no longer "trust me, I am neutral," but "this is the risk I am willing to take, and this is how you can verify that what I say is true."

—— Robert Hackett, a16z Crypto Editorial Team

15. Cryptographic Technology Provides "New Building Blocks Beyond Blockchain"

For many years, SNARKs—a cryptographic proof technology that verifies computation results without re-executing the calculations—have been largely limited to blockchain applications. The main reason is "high costs": the workload required to generate a computation proof can be a million times that of directly executing the computation. This technology is only valuable in scenarios where "costs can be distributed across thousands of verification nodes" (like in blockchain); in other scenarios, it is impractical.

But this is about to change. By 2026, the overhead of zero-knowledge virtual machine (zkVM) provers will drop to roughly 10,000x (i.e., generating a proof will take about 10,000 times the work of running the computation directly), with memory usage of only a few hundred megabytes—fast enough to run on phones and cheap enough for broad application. The reason 10,000x may be the critical threshold is that a high-end GPU offers roughly 10,000 times the parallel throughput of a laptop CPU. By the end of 2026, a single GPU will be able to prove a CPU's execution in real time.
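The threshold argument can be checked with back-of-the-envelope arithmetic: if prover overhead falls to about 10,000x while a GPU offers about 10,000x a CPU's parallel throughput, the two factors cancel and GPU proving keeps pace with CPU execution. The numbers below are the article's round figures, not measurements.

```python
# Prover work per unit of direct CPU work (the projected 2026 overhead).
prover_overhead = 10_000

# Assumed parallel-throughput advantage of a high-end GPU over a laptop CPU.
gpu_parallel_advantage = 10_000

# Wall-clock slowdown of proving on a GPU relative to just running the
# program on the CPU. At 1.0, proof generation is "real time."
slowdown = prover_overhead / gpu_parallel_advantage
print(slowdown)  # 1.0
```

This is why 10,000x, rather than some other number, marks the point where verifiable computation stops being blockchain-only.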

This will realize a vision proposed in old research papers: "verifiable cloud computing." If you run CPU workloads in the cloud—because they don't map well onto GPUs, you lack the expertise to port them, or legacy systems constrain you—you will be able to obtain cryptographic proof that the computation was performed correctly for a reasonable added cost. The provers are already GPU-optimized, and your code needs no modification.

—— Justin Thaler, a16z Crypto Research Team Member, Associate Professor of Computer Science at Georgetown University

About Industry Development

16. Trading Business: A Waypoint for Crypto Companies, Not the Destination

Today, except for the stablecoin sector and some core infrastructure companies, almost all successful crypto companies have either turned to trading business or are transitioning towards it. But if "all crypto companies become trading platforms," where will that lead? A large number of companies clustering in the same lane will not only distract user attention but also result in "a few giants monopolizing while most companies are eliminated." This means that those companies that rush too quickly into trading will miss the opportunity to build "more competitive and sustainable business models."

I fully understand the founders' intention to achieve business profitability, but the pursuit of "short-term product-market fit" also comes at a cost. This issue is particularly prominent in the crypto space: the unique dynamics related to token characteristics and speculative attributes can easily lead founders to choose the path of "instant gratification" in the process of "seeking product-market fit"—this is essentially similar to the "marshmallow experiment" (testing the ability to delay gratification).

There is nothing wrong with the trading business itself; it is an important market function, but it should not become the "ultimate goal" of a company. Founders who focus on "the essence of the product in product-market fit" are ultimately more likely to become industry winners.

—— Arianna Simpson, a16z Crypto General Partner

17. Unlocking the Full Potential of Blockchain: When Legal Frameworks and Technical Architectures Finally Align

Over the past decade, one of the biggest obstacles to building blockchain networks in the United States has been "legal uncertainty." The scope of securities law has been expanded and enforcement standards have been inconsistent, forcing founders into a regulatory framework that is "designed for businesses rather than networks." For many years, "avoiding legal risks" has replaced "product strategy," and the importance of engineers has given way to lawyers.

This situation has led to many distortions: founders are advised to avoid transparency; token distribution has become arbitrary on a legal level; governance has become formalistic; organizational structures prioritize "avoiding legal risks"; and token designs deliberately "avoid carrying economic value" or "do not set business models." Worse still, those crypto projects that "disregard rules and operate in gray areas" often develop faster than those "honestly compliant" builders.

But now, the U.S. government is closer than ever to passing crypto market structure legislation—legislation expected to eliminate the distortions above in 2026. If passed, it will reward transparency, establish clear standards, and replace arbitrary enforcement with clear, structured paths for fundraising, token issuance, and decentralization. After the GENIUS Act passed, stablecoin issuance surged; market structure legislation will bring even more significant change—this time centered on blockchain networks.

In other words, such regulation will let blockchain networks truly operate as networks: open, autonomous, composable, credibly neutral, and decentralized.

—— Miles Jennings, a16z Crypto Policy Team Member, General Counsel
