a16z conversation with Solana co-founder: People should try to create greater ideas instead of repeating what already exists


Original Title: Debating Blockchain Architectures (with Solana)
Hosts: Ali Yahya, General Partner at a16z crypto; Guy Wuollet, Partner on the deal team at a16z crypto
Guest: Anatoly Yakovenko, CEO of Solana Labs and Co-founder of Solana
Translation: Qianwen, ChainCatcher


"But what I want to say is that people should try to create greater ideas rather than repeating what already exists. The best metaphor I've heard is that when people discovered cement, everyone focused on making bricks with it, and then someone thought, I can build a skyscraper. They figured out a way to combine steel, cement, and architecture in a way that no one had thought of. The new tool is cement. You just need to figure out what the skyscraper is and then go build it."

In this episode, a16z crypto talks to Anatoly Yakovenko, co-founder and CEO of Solana Labs. Anatoly Yakovenko previously worked at Qualcomm as a senior engineer and engineering manager.

Summary

  • The ultimate goal of decentralized computing
  • The philosophy behind Solana
  • Similarities and differences between Solana and Ethereum
  • The future development of blockchain
  • Web3 community and development
  • Talent recruitment for Web3 startups

The ultimate goal of decentralized computing

a16z crypto: First, I'd like to know how you view the ultimate goal of decentralized computing and your perspective on blockchain architecture.

Anatoly Yakovenko: My position is quite extreme. I believe that settlement will become increasingly less important, just like in traditional finance. You still need someone to provide guarantees, but these guarantees can be achieved in many different ways. I believe that what truly has value in the world is having a globally distributed, globally synchronized state, which is also the real challenge. You can think of it as Google Spanner's role for Google, or Nasdaq's role for the financial markets.

From a macro perspective, blockchain systems are permissionless, programmable, and highly open, but somewhere in the stack there is still some form of market. For all these markets, achieving global synchronization as close to the speed of light as possible is very valuable, because then everyone can use it as a reference. You can still operate local markets, but if there is fast, synchronized global pricing, then global finance becomes more efficient. I believe this is the ultimate goal of blockchain: to synchronize as much state as possible at the speed of light.

a16z crypto: If cryptocurrencies and blockchain gain mainstream adoption, what do you think will be the biggest driver of activity on the blockchain at that time?

Anatoly Yakovenko: I think the form will be very similar to Web2, but it will be more transparent, realizing the vision of the long tail distribution—there will be various small-scale companies on the internet that can control their own data, rather than a few dominant players (although what these large companies are doing is also great). I believe that in the long run, creators should have more control and more autonomy in publishing, and be able to achieve a truly meaningful internet with broad distribution and markets.

a16z crypto: Another way to think about or pose this question is how to balance it. You mentioned that you believe settlement will become less important in the future. I'm curious: as Solana is a place where a large amount of global business, especially financial activity, takes place, how can it accelerate or complement the ultimate goal you just mentioned?

Anatoly Yakovenko: The Solana system is not designed as a store of value, and it actually has a low tolerance for network failures. It aims to use as many of the internet's available resources as quickly as possible, and in practice it depends on most of the world remaining free for cross-border communication and finance. That makes it different from a bank coin (a token tied to banking and other financial systems), and of course I believe there is a need for a bank coin that can survive local geopolitical conflicts.

However, optimistically, the connections between things in the world are becoming increasingly tight. I think we will see terabit connections between us. In that world, you will have a fully interconnected world. I think this globally synchronized state machine can absorb a lot of execution aspects.

From experience, settlement can happen in many places, because settlement is easy to guarantee. Again, I emphasize that I take this position for the sake of discussion. Since 2017 we have seen hundreds of networks, each a different instance from a design perspective, and we basically see no quorum failures, because settlement is relatively easy to achieve. Other scaling issues have been solved as well. Empirically, Tendermint is very workable; although we saw the Luna meltdown early on, the problems that occurred were not in the voting mechanism.

I believe we spend too much on settlement in terms of security, resources, and engineering, and not enough on research and execution, which is where most of the profits in the financial industry come from. Personally, I think if these technologies are to truly have impact and reach global scale, they must beat traditional finance on price, fairness, and speed. That is where we need to focus our research and competition.

a16z crypto: Do you think settlement is one of the aspects people choose to optimize blockchains for? People may over-optimize blockchains for settlement and overlook other aspects, such as throughput, latency, and composability, which often conflict with settlement security. Can you talk about Solana's architecture?

Anatoly Yakovenko: The task of Solana's architecture is to transmit information from all over the world to all participants in the network at the fastest possible speed, simultaneously. So there is no sharding and no complex consensus protocol; we actually want to keep things simple. Or rather, we were fortunate to solve one computer science problem, clock synchronization (using verifiable delay functions as a time source in the network). You can think of two radio towers transmitting on the same frequency at the same time, which creates noise. The first protocol people came up with when building cellular networks was to give each tower a clock so they could alternate transmissions.

An analogy: the Federal Communications Commission is like a truck full of enforcers, and if your tower is not synchronized on the licensed spectrum, they will come and shut it down. Solana was inspired to use verifiable delay functions to schedule block producers so there are no collisions. In a network like Bitcoin, for example, if two block producers produce a block at the same time, a fork occurs, which is similar to the noise in a cellular network. If you can force all block producers to take turns, you get a good time-division protocol: each block producer produces on schedule, they never collide, forks never occur, and the network never enters a noisy state.
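
To make the scheduling idea concrete, here is a minimal sketch of a proof-of-history-style tick source plus a collision-free leader schedule. It is illustrative only: the `TICKS_PER_SLOT` value and the simple round-robin rotation are assumptions for readability, not Solana's actual parameters (Solana's real leader schedule is stake-weighted).

```python
# Minimal sketch (not Solana's implementation): a SHA-256 hash chain as a
# proof-of-history-style tick source, plus a deterministic leader schedule
# derived from the tick count so block producers never overlap.
import hashlib

def next_tick(state: bytes) -> bytes:
    """One PoH 'tick': hashing is sequential, so N ticks prove elapsed work/time."""
    return hashlib.sha256(state).digest()

def leader_for_slot(slot: int, validators: list[str]) -> str:
    """Round-robin rotation for illustration: every validator knows whose turn it is."""
    return validators[slot % len(validators)]

state = b"genesis"
TICKS_PER_SLOT = 64  # hypothetical value, chosen only for this example
validators = ["validator-a", "validator-b", "validator-c"]

for tick in range(256):
    state = next_tick(state)
    if tick % TICKS_PER_SLOT == 0:
        slot = tick // TICKS_PER_SLOT
        print(f"slot {slot}: leader = {leader_for_slot(slot, validators)}")
```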

After that, everything we do is optimizing operating-system and database operations. We transmit blocks of data globally like BitTorrent, sending erasure-coded pieces to different machines; the result ends up looking very similar to data availability sampling and has the same effect. The nodes then forward the pieces to each other, reconstruct blocks, vote, and so on. The main design idea of Solana is that every process in the network and the codebase is built so that it scales with nothing more than additional cores.

If within two years we can get twice as many cores for every dollar we spend, we can adjust things so that each block runs twice as many threads, or carries twice the computational load, as it does now. The network can therefore do twice as much, and all of this happens naturally without any change to the architecture.
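
As a rough illustration of why throughput can track core count, the sketch below groups transactions that touch disjoint accounts into batches that can run on separate cores. The greedy scheduler and the account sets are my own toy example, not Solana's runtime.

```python
# Toy scheduler (my illustration, not Solana's runtime): transactions declare
# the accounts they touch, so non-conflicting ones can be batched and executed
# on separate cores; more cores per dollar means more work per block.
from concurrent.futures import ThreadPoolExecutor

def schedule(txs: list[set[str]]) -> list[list[set[str]]]:
    """Greedy batching: a batch only holds transactions with disjoint account sets."""
    batches: list[list[set[str]]] = []
    for tx in txs:
        for batch in batches:
            if all(not (tx & other) for other in batch):
                batch.append(tx)
                break
        else:
            batches.append([tx])
    return batches

def execute(tx: set[str]) -> str:
    return f"executed tx touching {sorted(tx)}"

txs = [{"alice", "bob"}, {"carol", "dave"}, {"alice", "carol"}, {"erin", "frank"}]
for i, batch in enumerate(schedule(txs)):
    # Every transaction in a batch is conflict-free, so the batch can fan out
    # across however many cores the machine happens to have.
    with ThreadPoolExecutor() as pool:
        for result in pool.map(execute, batch):
            print(f"batch {i}: {result}")
```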

This is the main goal we really wanted to achieve, and it is based on my experience. From 2003 to 2014 I worked at Qualcomm, and we saw mobile hardware and architecture improve every year. If you did not design your software so it could scale without being rewritten the following year, you were a very poor engineer, because the devices were scaling rapidly and, to take advantage of that, you would otherwise have to rewrite your code.

So you really need to think ahead; everything you build will only get faster and faster. The biggest lesson of my engineering career is that you can choose a carefully designed algorithm and it can still be the wrong choice, because as the hardware scales, the benefit of that algorithm becomes marginal and implementing it turns out to be a waste of time. If you can do something very simple that only needs to scale with the number of cores, you may already get 95% of everything.

The philosophy behind Solana

a16z crypto: Using proof of history as a way to synchronize time across validators is a very innovative idea, which is why Solana is different from other consensus protocols.

Anatoly Yakovenko: This is where Amdahl's Law comes in, and it is why it is difficult for people to replicate Solana's latency and throughput: classic consensus implementations are built on step functions. An entire network, such as Tendermint, must agree on the content of the current block before moving on to the next one.

Cellular towers use a schedule, and you just transmit in your slot. Because there is no step function, the network can run fast. I think of it as a kind of synchronization, though I don't know if that word is quite right: they keep transmitting and never stop to wait for consensus to run. We can do this because we have a strict notion of time. Honestly, you could build clock synchronization protocols some other way for redundancy, but the process would be very difficult; getting reliable clock synchronization is a huge engineering project.

This is the philosophy behind Solana. Before I started building Solana, I enjoyed trading, being a broker, and so on, although I didn't make money. At that time, "flash boys" were prevalent in traditional finance. Whenever I thought my algorithm was good, my orders would be delayed, taking longer to enter the market, and the data would arrive more slowly.

I believe that if we want to disrupt the financial industry, the fundamental goal of these open business systems is to make this situation impossible. This system is open, and anyone can participate. Everyone knows how to gain access, how to gain rights, such as priority or fairness, etc.

Within the limits allowed by physics and within the range that engineers can achieve, the fundamental problem is to achieve all of this at the fastest speed. I think if blockchain can solve this problem, it will have a very big impact on the rest of the world, benefiting many people globally. This could become a cornerstone, and then you can use it to disrupt advertising transactions and monetization models on the internet.

a16z crypto: I think there is an important distinction between pure latency and malicious activity, especially within a single state machine. Perhaps you can elaborate on which you think is more important and why.

Anatoly Yakovenko: You cannot put the entire state behind a single global lock; that would mean one correct lock for all of the state and a very slow ordering system. But you still need atomic access to state, and you need to guarantee it. It is difficult to build software that operates on non-atomic remote state when you don't know what side effects it will have on your computation. So the idea is that you submit a transaction and it either executes completely or fails completely, with no side effects. This is one of the properties these computers must have; otherwise, I think it is impossible to write reliable software, or reliable financial logic, for them.
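
A minimal sketch of the all-or-nothing property described here (my own example, not Solana code): a transaction mutates a scratch copy of the state and is only committed if every instruction succeeds, so a failure leaves no side effects.

```python
# Toy model of atomic transaction execution: either every instruction applies,
# or none of them do.
import copy

def apply_transaction(state: dict[str, int], instructions) -> dict[str, int]:
    scratch = copy.deepcopy(state)      # work on a scratch copy
    for instruction in instructions:
        instruction(scratch)            # may raise on failure
    return scratch                      # commit: replace state atomically

def transfer(src: str, dst: str, amount: int):
    def run(s: dict[str, int]):
        if s[src] < amount:
            raise ValueError("insufficient balance")
        s[src] -= amount
        s[dst] += amount
    return run

balances = {"alice": 10, "bob": 0}
try:
    balances = apply_transaction(balances, [transfer("alice", "bob", 4),
                                            transfer("alice", "bob", 100)])
except ValueError:
    pass  # the second instruction failed, so neither transfer is visible
print(balances)  # {'alice': 10, 'bob': 0}
```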

You might be able to build a consistent system, but I think that's a different kind of software. So there is always a tension between maintaining the atomicity of the system and performance, because guaranteeing it ultimately means that at any given time you have to choose a single writer, globally, for a specific part of the state. To solve that, you need a single sequencer to linearize these events. This creates points where value can be extracted and where the fairness of the system is determined. These problems are really difficult to solve, and they are faced not only by Solana but also by Ethereum and the Lightning Network.

Solana vs. Ethereum

a16z crypto: One frequently debated issue, especially in the Ethereum community, is the verifiability of execution, which is very important for users because they don't have very powerful machines to verify activities on the network. What are your thoughts on this?

Anatoly Yakovenko: I think the ultimate goals of these two systems are very similar. If you look at the Ethereum roadmap, you'll find that its idea is that the overall network bandwidth is greater than any single node, and the network is already processing or computing more events than any single node. You have to consider the security factors of such a system. There are also protocols for publishing fraud proofs, sampling schemes, etc., all of which actually apply to Solana as well.

So, if you step back, there's actually not much difference. You have a system that operates like a black box, creating so much bandwidth that it's not very practical for a random user. Therefore, they need to rely on sampling techniques to ensure the authenticity of the data. It's like a very powerful gossip network that can spread fraud proofs to all clients. The things guaranteed between Solana and Ethereum are the same. I think the main difference between the two is that Ethereum is largely constrained by its narrative as a global currency, especially in competition with Bitcoin as a store of value.

I think it makes sense for users to have very small nodes, even if they only partially participate in the network, rather than having the network run entirely by professionals. Honestly, I think it's a fair optimization: if you don't care about execution and only care about settlement, why not minimize node requirements and let people participate partially in network activity? I just don't think doing so creates a trust-minimized or absolutely secure system for the vast majority of people in the world, because they still need to rely on data availability sampling and fraud proofs. And for a user to check whether the chain has done something wrong, they only need to verify the signatures of a majority of the stake on the chain.
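
A minimal sketch of that check (illustrative only, not a real light client): the user accepts a result once validators holding more than two thirds of the stake have signed off on it. The stake figures below are made up.

```python
# Toy supermajority check: accept a block if signers hold > 2/3 of total stake.
def supermajority_signed(signed_stake: float, total_stake: float) -> bool:
    return signed_stake > total_stake * 2 / 3

stakes = {"v1": 40.0, "v2": 35.0, "v3": 25.0}   # hypothetical stake weights
signers = {"v1", "v2"}
signed = sum(stakes[v] for v in signers)
print(supermajority_signed(signed, sum(stakes.values())))  # True: 75 of 100 staked
```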

On Solana, a single transaction declares the accounts and state it touches for everyone involved, so any device, such as a browser on a mobile phone, can easily re-execute a single transaction along with the majority's signatures, because everything on Solana is predetermined. Building this on Solana is therefore actually easier; in the EVM, a smart contract can access any state and jump between pieces of state arbitrarily during execution. In a way it's almost simpler. But I think that, at a high level, users still ultimately rely on DAS and fraud proofs, and in that regard all the designs are the same.
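
To illustrate the "predetermined" point, here is a toy model (my own, not Solana's runtime) in which a transaction lists every account it will touch before execution and the runtime rejects access to anything undeclared.

```python
# Toy model of declared account access: writes outside the declared set fail.
class DeclaredState:
    def __init__(self, state: dict[str, int], declared: set[str]):
        self._state, self._declared = state, declared

    def write(self, account: str, value: int):
        if account not in self._declared:
            raise PermissionError(f"{account} was not declared by the transaction")
        self._state[account] = value

state = {"alice": 5, "bob": 7, "carol": 9}
tx_accounts = {"alice", "bob"}                  # declared before execution
view = DeclaredState(state, tx_accounts)
view.write("alice", 4)                          # allowed
try:
    view.write("carol", 0)                      # undeclared -> rejected
except PermissionError as e:
    print(e)
```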

a16z crypto: It sounds like you believe that validity proofs and ZK proofs are excellent for settlement, but not very helpful for execution because of the long delay and the need for performance improvement.

Anatoly Yakovenko: So far, that's true. That's my intuition, and the reason is simple: the more that happens on chain, the more hot spots of state dependencies there are. Workloads are not perfectly parallel pieces that never talk to each other; it's just a lot of very messy code.

a16z crypto: Another counterargument might be that zero-knowledge proofs are making exponential progress, given how much investment is going into the area. Perhaps in 5 or 10 years the overhead may come down from roughly 1,000x today to a more feasible level. I'm curious whether you think it might be more efficient to have a single node perform the computation and generate a proof, then distribute the proof to everyone else, rather than having each node perform the computation on its own.

Anatoly Yakovenko: That trend does help optimize zero-knowledge systems. But the more that happens on chain, the more constraints there are, and that grows faster than the rate at which you can add hardware, so you just keep adding hardware. That's my intuition. My feeling is that as demand increases, meaning the amount of computation on chain, it will become increasingly difficult for zero-knowledge systems to keep up at low latency. I'm not even sure it's 100% feasible. You could probably build a system that handles very large recursive batches, but you still have to run classic execution and snapshot every second. Then you throw an hour of computation at a large parallel proving farm, verify between each snapshot, and start recomputing from there, but that takes time, and I think that's a challenge.

I'm not sure if ZK can keep up, unless demand stabilizes, but I think ultimately demand will flatten out. Assuming hardware continues to improve, at some point, the demand for cryptocurrencies will saturate, just like Google's search volume may have already saturated per second. Then, you will start to see this happening. I think we are still far from that goal.

a16z crypto: Another major difference between these two modes is Ethereum's Rollup-centric worldview, which is essentially a pattern of computing sharding, data availability sharding, bandwidth, and network activity sharding. Therefore, it can be imagined that greater throughput can eventually be achieved, as you can almost infinitely increase Rollups based on a single Rollup, but this means compromising on latency. So, what is more important? The overall throughput of the network or access latency? Perhaps both are important?

Anatoly Yakovenko: I think the main issue is that you have Rollups and sequencers, and people will extract value from the construction of sequencers and Rollups, and in this system, you more or less have some common sequencers. Their operations are no different from Citadel, Jump, brokers, traders, etc., all routing orders. These systems already exist. This design actually doesn't break the entire monopoly. I think the best way is to build a completely permissionless business system, where those intermediaries cannot truly participate, and start capturing the value of a globally synchronized state machine.

The actual cost of using it is very likely to be lower, whereas the rollup approach is like creating a bunch of separate small channels (pipes).

In general, any given pipe is priced on the remaining capacity of that pipe, not on the overall network capacity. It's difficult to build a system with completely shared network bandwidth out of rollups: you can try to design it so blocks go wherever capacity is available, but they will all compete and bid against one another. A single huge pipe is simpler; its price is based on the remaining capacity of that one pipe, and because it aggregates all the bandwidth, its pricing ends up lower while its ultimate speed and performance end up higher.
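
The pricing intuition can be shown with a toy congestion model (my own numbers and formula, not a protocol spec): fees grow as a pipe's remaining capacity shrinks, so a hot, isolated small pipe prices far above one big shared pipe carrying the same aggregate load.

```python
# Toy congestion pricing: fee rises as a pipe fills toward its capacity.
def fee(used: float, capacity: float, base: float = 1.0) -> float:
    remaining = max(capacity - used, 1e-9)
    return base * capacity / remaining   # fee blows up as the pipe fills

total_capacity, total_load = 100.0, 80.0

# One shared pipe: the load spreads over the full capacity.
big_pipe_fee = fee(total_load, total_capacity)

# Ten isolated rollup-like pipes: one hot pipe absorbs a burst while others idle.
hot_pipe_fee = fee(9.5, 10.0)   # hypothetical hot shard at 95% utilization

print(f"shared pipe fee: {big_pipe_fee:.1f}")   # 5.0
print(f"hot small pipe:  {hot_pipe_fee:.1f}")   # 20.0
```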

Block Space and the Future

a16z crypto: I've heard you say that you don't think the demand for block space is infinite. Do you think the demand for block space will reach a balance point when web3 becomes mainstream?

Anatoly Yakovenko: Just imagine, if engineers at Qualcomm were told that the demand for cellular bandwidth is infinite, and the code is designed for infinity, that would be ridiculous.

In reality, you would set a target and design for that level of demand: how much hardware is needed, what the simplest implementation is, what the deployment cost is, and so on. My intuition is that 99.999% of the most valuable transactions may need at most 100,000 TPS; that's my gut estimate. And a system that achieves 100,000 TPS is actually quite feasible: current hardware can do it, and Solana's hardware can do it. I think 100,000 TPS could cover block space demand for the next 20 years.

a16z crypto: Could it be that the demand for block space is soaring because it is so affordable, and people want to do all sorts of things with it?

Anatoly Yakovenko: But there is still a floor price. The price must cover the bandwidth cost of every validator, since egress costs dominate the cost of verification. If you have 10,000 nodes, you probably need to price the network's per-byte usage at roughly 10,000 times the normal egress cost, which sounds expensive.

a16z crypto: So I guess the question is, do you think Solana will reach its limit at some point, or do you think the single architecture is already sufficient?

Anatoly Yakovenko: So far, people have done sharding because they built systems with much lower bandwidth than Solana, so they hit capacity limits and bidding for bandwidth pushed prices far above egress costs. For 10,000 nodes, for example, the egress cost for Solana's validators should come to about $1 per megabyte. That is a floor price; you can't stream video over it. But it's still very low: you could use it for search, basically putting every search query on chain and getting the results back from your search engine.
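
The floor-price arithmetic can be checked back-of-the-envelope. The per-node egress rate below (about $0.10 per GB, a common cloud egress price) is my assumption rather than a figure from the conversation; the node count and the roughly $1 per MB result match the numbers cited above.

```python
# Back-of-the-envelope check of the floor-price argument (assumed egress rate).
per_node_egress_usd_per_gb = 0.10   # assumption: typical cloud egress pricing
nodes = 10_000

# Every byte a user pays for is re-transmitted by (roughly) every node,
# so the network's cost floor scales with the node count.
floor_usd_per_gb = per_node_egress_usd_per_gb * nodes      # $1,000 per GB
floor_usd_per_mb = floor_usd_per_gb / 1024                 # ~$0.98 per MB

print(f"floor price ~ ${floor_usd_per_mb:.2f} per MB")  # close to the $1/MB cited above
```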

a16z crypto: I think this is actually an interesting point, because we posed the question of the ultimate goal of blockchain scaling at the beginning of the podcast, as if the scalability of blockchains were the most important issue.

Chris has also used the analogy that much of the progress in AI over the past decade is largely due to better hardware, which is the real key. So when we talk about blockchain scalability it is for the same purpose: if we can achieve a significant increase in TPS, everything will work fine. But an interesting counterpoint is that Ethereum processes about 12 transactions per second, its own throughput is still larger than any single L2, and it charges relatively high fees. On Solana, many simple transfer transactions have low fees. When we discuss this, we usually conclude that if throughput reaches the next order of magnitude, there will be many new applications that we cannot currently reason about. Yet in a sense, over the past few years Solana has been the place to build applications, and many of the things built there are very similar to what is built on Ethereum.

Do you think higher throughput or lower latency will unleash many new applications? Or will most of what is built on the blockchain in the next 10 years be very similar to the designs we have proposed?

Anatoly Yakovenko: In fact, I think most applications will be very similar. The most difficult thing to crack is how to build a business model, such as how to apply these new tools? I think we have already found these tools.

The reason why Ethereum transactions are so expensive is because its state is very valuable. When you have this state, anyone can write to it, and they will create economic opportunity costs to become the first person to write to this state, which effectively increases the cost. This is why valuable transaction fees are generated on Ethereum. To achieve this, many applications need to create this valuable state, so that people are willing to continuously write to it, and people start competing to raise fees.

a16z crypto: Here's a counterargument. I think we easily underestimate the creativity of developers and entrepreneurs across the field. If you look back at history, such as the first wave of the web and the internet starting in the 1990s, it took a long time for the main drivers of interesting applications to really develop. For crypto, we have only had truly programmable blockchains since around 2014 with Ethereum; things like Solana have only existed for about four years, and people are still exploring the design space.

The fact is, the number of developers in this field is still very small. We probably have only tens of thousands of developers who know how to write smart contracts and truly understand the potential of the blockchain as a computer. So I think it's still early for interesting ideas to be developed on the blockchain. The design space it creates is so vast that I suspect we will be surprised by what people create in the future. It may not just be things related to trading, markets, or finance; it may appear in the form of shared data structures that are very valuable but fundamentally unrelated to finance.

A decentralized social network is a good example, where the social graph is placed on the chain as a public good, allowing various entrepreneurs and tech developers to build on it. Because the social graph is on the blockchain and open, all developers can access it, making the social graph a very valuable state maintained by the blockchain. You can imagine people wanting to publish a large number of transactions for various reasons, such as real-time updates to this data structure. If these transactions are cheap enough, I think developers will find ways to leverage them.

Historically, whenever computer speed increases, developers find ways to use the additional computing power to improve their applications. Our computing power has never been enough. People always want more computing power, and I think the same will happen with blockchain computers. And there will be no limit, maybe the limit is not infinite, but I think the demand for block space will be much higher than we imagine.

Anatoly Yakovenko: On the other hand, the internet's use cases were actually discovered very early: search, social graphs, and e-commerce were all discovered early, probably in the 1990s.

a16z crypto: Some things are hard to predict. Bike sharing, for example, was hard to predict. The form that search ultimately took was also hard to predict, and the widespread use of things like streaming video inside social networks was unimaginable at first.

I think that, likewise, we can imagine some of the applications people might build on the blockchain, but given the current limitations and infrastructure constraints, some applications are hard to even imagine. Once those limitations are removed and more people enter the field to build, there may be many heavyweight applications in the future. So if we let it develop freely, we may be surprised at how powerful it becomes.

Anatoly Yakovenko: There's an interesting card game called "dot bomb"; the goal of the game is to lose money as slowly as possible, since you can't actually win or make money. You manage a set of different startups built on 1990s internet ideas. Without exception, every so-called bad idea, such as online grocery delivery and online pet stores, became at least a $1 billion business at some point after 2010. So I think many ideas may look bad at first, or fail in their initial implementation, but they will eventually be adopted well in the future.

Future Adoption of Blockchain

a16z crypto: So the question is, what do you think is the key for blockchain to go from its current applications to becoming a key part of the mainstream internet? If it's not scalability, then what are the other obstacles, such as cultural acceptance of blockchain? Is it privacy issues? Is it user experience?

Anatoly Yakovenko: This reminds me of the history of the internet, and I remember how the whole experience changed. After I went to college I got an email address, everyone with a job had an email address, I started receiving links to all kinds of content, and the user experience of the internet improved with the emergence of Hotmail and the growth of Facebook.

Because of this, people's thinking changed and they understood what the internet was. Initially, people even had a hard time understanding what a URL was, what it meant to click on something, or what it meant to reach a server. We have the same problem with self-custody: we need to make people truly understand these concepts. What does a mnemonic phrase mean? What do a wallet and a transaction mean? People's thinking needs to change, and that change is slowly happening. I think every user who eventually buys cryptocurrency and deposits it into their self-custody wallet will understand this once they have that experience. But so far, not many people have had it.

a16z crypto: You guys made a phone. Maybe you can tell us where the inspiration for making a phone came from, and how you think the current promotion is going?

Anatoly Yakovenko: My experience at Qualcomm made me realize that this is a real problem, that we can solve it, and that doing so would not require turning the entire company into a mobile business. So it was a low-cost opportunity for us that might change the crypto or mobile industry.

This is something worth doing. We worked with a company to manufacture a device, and when we launched specific crypto features with them we received great feedback from people and developers, who saw it as something like an alternative to the app store. But everything is unknown, such as whether any crypto application is compelling enough at a macro level that people are willing to switch from iOS to Android. Some people are willing, but not many. Launching a device is very difficult: basically every device launched outside Samsung and Apple has ended in failure, mainly because Samsung's and Apple's production lines are extremely well optimized, and any new startup is far behind those giants on hardware.

So you need some almost "religious" reason for people to convert, and maybe crypto is that reason. We haven't proven this, but we haven't disproven it either. We just haven't yet seen a breakthrough use case where self-custody is the key feature people are willing to change their behavior for.

a16z crypto: You are one of the few founders who can build both hardware and decentralized networks. Decentralized protocols or networks are often compared to building hardware because it is very complex. Do you think this analogy holds?

Anatoly Yakovenko: It's like my previous work at Qualcomm. A hardware problem causes enormous trouble: if a tape-out fails, the company spends tens of millions of dollars a day until it's fixed, which can be catastrophic. In a software company you can still find problems quickly and patch the software around the clock, which makes it easier.

Community and Development

a16z crypto: Solana has done an excellent job in building its own community, with a very strong community. I'm curious, what methods did you take in building the company and ecosystem?

Anatoly Yakovenko: There is admittedly an element of luck in this. We started Solana Labs in 2018, at the end of the previous cycle, and many of our competitors actually raised several times more money than we did. Our team was very small at the time. We didn't have enough funding to build and optimize an EVM, so we built a runtime that we thought could demonstrate the key property: a scalable, unconstrained blockchain whose performance is not limited by the number of nodes and does not suffer severe latency. Those were the dimensions where we really wanted to break through.

At that time, we only focused on building this fast network, without paying too much attention to other aspects. In fact, when the network was launched, we only had very rudimentary resource managers and command-line wallets, but the network speed was very fast. This was also the key to attracting developers, because at that time there were no other fast, inexpensive networks as alternatives, and no programmable networks that could provide this speed, latency, and throughput.

That is actually how developers got going. At the time you couldn't just copy and paste Solidity code, so everything had to be built from scratch, and building from scratch is effectively the onboarding process for an engineer. If you have built your favorite primitive in stack A and then build it in stack B, you learn stack B from start to finish, and if you can accept its trade-offs, you may become its advocate.

If we had more funds, we might have made a mistake at that time, which is trying to build EVM compatibility. But in fact, our engineering time was limited, which forced us to prioritize the most important thing, which is the performance of this state machine.

My intuition was that if we removed the restrictions on developers and gave them a very large, very fast, low-cost network, they would remove their own constraints. And that has indeed happened, surprisingly and admirably. I'm not sure we would have succeeded if the timing had not been right, for example if the macro environment had been wrong at the time. We announced on March 12, and then on March 16 the stock market and the crypto market both crashed by 70%. I think the timing of those three days may have saved us.

a16z crypto: Another important factor here is how to win developers?

Anatoly Yakovenko: It's counterintuitive: you have to build your first program by chewing glass, meaning people have to really invest the time. We call it "chewing glass."

Not everyone will do this, but once enough people do, they build libraries and tools that make things easier for the next developer. For developers it is actually something to be proud of; libraries get built and the software naturally expands. This is what we really want the developer community to build and chew through, because it makes those people own it, and really makes them feel they own the ecosystem. We try to solve the problems they can't solve themselves, such as long-term protocol issues.

I think this is the source of this spirit, you are willing to chew glass because you get something in return, you get ownership of the ecosystem. We are able to focus on making the protocol a cheaper, faster, and more reliable network.

a16z crypto: What are your thoughts on the developer experience, and what role will programming languages play as they gain more mainstream use in this field? It's quite difficult to integrate into this field, learn how to use these tools, and learn how to think.

In the new model, programming languages may play an important role in this, as the security of smart contracts has become an important task for engineers in this field. The risks involved are very high. Ideally, we will eventually see a world where programming languages provide much more help through tools than they do now, such as formal verification, compilers, and automation tools, which can help you determine if your code is correct.

Anatoly Yakovenko: I think formal verification is necessary for all DeFi applications. That is where a lot of the innovation happens, such as building new markets, and those are the most exposed to hackers; they are the places that really need formal verification and similar tools.

I think many other applications are quickly converging on single, standard implementations that are effectively trusted. Once you can establish a single standard for a certain class of problem, it is much easier than being a startup building a new DeFi protocol: no one has written that code before, so you bear a lot of implementation risk and then have to convince people to trust it and put money at risk in the protocol. That is where you need all the tools: formal verification, compilers, the Move language, and so on.

a16z crypto: The programming world is changing in a very interesting way, because most programming in the past was traditional imperative programming, similar to JavaScript: you write some code, it is probably incorrect and will break, and then you fix it.

But more and more applications are mission-critical, and for those you need a completely different way of programming, a paradigm that gives much stronger assurance that the code you write is correct. On the other hand, another kind of programming is emerging, machine learning, which uses data to synthesize programs. Both of these are eating away at the old form of imperative programming. There will be less and less ordinary JavaScript-style code in the world, more code written by machine learning algorithms from data, and more code written with formal techniques that look more like mathematics and formal verification.

Anatoly Yakovenko: Yes, I can even imagine, at some point, a prover-optimized smart contract language, where you then tell an LLM to translate it into Solidity or into Anchor on Solana. Two years ago people might not have believed that, but with GPT-4 we have seen huge leaps.

a16z crypto: I like this idea. You can use an LLM to generate program specifications that meet the requirements of some formal verification tools. Then, you can ask the same LLM to generate the program itself. Then, you can run formal verification tools on the program to see if it really meets the specifications. If it doesn't, it will give you an error, and you can feed this error back to another LLM to try again. You can keep doing this until you generate a verifiable, formally verified program.
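
A minimal sketch of that generate-verify-retry loop, with the LLM call and the formal verification tool replaced by hypothetical stubs (`generate_program`, `formally_verify`), since no specific tools are named here.

```python
# Toy synthesis loop: draft a program, check it against the spec, feed errors back.
def generate_program(spec: str, feedback: str | None) -> str:
    """Placeholder for an LLM call that drafts code from a spec (and prior errors)."""
    return f"// candidate program for: {spec} (feedback: {feedback})"

def formally_verify(program: str, spec: str) -> tuple[bool, str]:
    """Placeholder for a verifier: returns (ok, counterexample_or_error)."""
    ok = spec in program  # stand-in check so the example terminates
    return ok, "" if ok else "property not proven"

def synthesize(spec: str, max_rounds: int = 5) -> str | None:
    feedback = None
    for _ in range(max_rounds):
        program = generate_program(spec, feedback)
        ok, feedback = formally_verify(program, spec)
        if ok:
            return program
    return None  # give up after max_rounds; a human reviews the last feedback

print(synthesize("transfer preserves total supply"))
```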

Ecosystem and Talent Recruitment

a16z crypto: We are discussing how to build a strong ecosystem. Many blockchains decentralize almost immediately after launch, to the point where the core team no longer participates in forum discussions or tries to help other partners get involved. You, by contrast, have been very visible from the moment the network launched and went to market. I think this may be a major advantage in building the Solana ecosystem.

Anatoly Yakovenko: To quote a saying, decentralization is not without leadership, but with diverse leadership. I still remember how difficult it was to take Linux seriously in a big company like Qualcomm, even the idea of running Linux on mobile devices seemed ridiculous. When I first joined, the whole community was trying to convince everyone that open source was meaningful, and I think this is what we need to do, the network needs to be decentralized.

But that doesn't mean there is no leadership. In fact, you need a lot of experts to constantly tell people the benefits of using this specific network and its architecture, to constantly bring more people in, and to cultivate more leaders who can teach and guide others around the world. But that doesn't mean everything happens under one roof. If the network and code are open, anyone can contribute and run it. Naturally, it is actually decentralized. You naturally see leadership emerging from unexpected places.

Our goal is to develop everything around us, to make our voice one of many voices, rather than silencing others. We pay a lot of attention to hackathons, fans, and so on, trying to connect them to each other and involve them in this cycle. It's like a flywheel. We try to connect people with developers from around the world, have as many one-on-one interactions with them as possible, and then get them all involved in hackathons, competing, and encouraging them to build their first or second product.

In the cryptocurrency user base, only a few products can enter the market, get venture capital, and have a scalable user base. In my opinion, this means we don't have enough creativity. We don't have enough founders to aim at targets and find truly scalable business models that can reach millions of users. So, we need a lot of companies to compete and see if they can come up with brilliant ideas, which is the biggest challenge.

a16z crypto: Another related question is how to involve the community in developing parts of the core protocol itself. This is one of the hardest balances for any blockchain ecosystem. On the one hand, you can actively involve the community, but your flexibility may suffer, and a governance process that involves more people makes coordination difficult. Alternatively, you can control things in a more top-down way and therefore move faster, but community involvement will be affected to some degree. How do you strike a balance?

Anatoly Yakovenko: Generally, at the foundation we see people actively contributing to whatever they want to work on. Then they go through a proposal process, and a grant or something similar follows. It is very similar to an interview process: when I hire people at the lab, sometimes the culture doesn't match the person, or it doesn't work out for some other reason, but that doesn't mean the person isn't good; it just means something didn't fit. Similarly, you find engineers who have been submitting code and contributing to the codebase. They already know the culture of merging code and how to deal with open-source issues. When you find people who can solve problems on their own, you can fund them, which is very important for finding truly outstanding people who can land code and are willing to work on it long term.

a16z crypto: What do you think is the best way to run decentralized governance protocols today?

Anatoly Yakovenko: For the L1, the approach we take seems very effective; like Linux, we keep moving forward and try as much as possible to avoid a veto from any participant, taking the path of least veto. To be honest, there are many participants who can veto any change, because they feel the change is not good or simply don't want it. But when we make the system faster, more reliable, and less memory-hungry, no one objects to those changes.

Ideally, we have a process where you publish a design and everyone discusses it for three months. So before merging, everyone has plenty of opportunity to look at the code and decide whether it's good or bad. That process may seem long, but it isn't. If you've worked at a large company such as Google or Qualcomm, you know you have to talk to a lot of people, push the change, make sure all the key partners, the key people who touch the codebase, can accept it, and then slowly get it done. Radical changes are harder to make, because many smart people are looking at the same thing, they may genuinely find mistakes, and then the final decision gets made.

a16z crypto: How do you approach talent recruitment?

Anatoly Yakovenko: On the engineering side our bar is very high, and we hire fairly senior people at a minimum. My approach to recruiting is that, early on, I put effort into something myself so I know how to do it, and then I tell the new hire that this is how I did it. I don't expect them to finish it in 90 days, or to surpass me. I can evaluate them in the interview and tell them this is the problem I'm solving; I need someone to take it over so I can go work on the unknown things. In a startup, if you are the CEO, it's best not to hand someone else an unknown problem, because you don't know whether they can solve it.

When the ecosystem reaches a certain level of development, you need a PM. At that point I was spending too much time answering questions, still answering them at 2 am. I thought, I should let someone else do this, and now I know exactly what this job is.

a16z crypto: How important do you think privacy is for blockchain in the future?

Anatoly Yakovenko: I think the whole industry will go through a transformation. First some visionary people will focus on privacy, and then suddenly some large payment company or another will adopt the technology and it will become the standard. It needs to become a required feature: if you don't have it, you can't compete. We haven't reached that level of market maturity, but I think we will. Once many people are using blockchains, every merchant in the world will need privacy. That's just the minimum requirement.

a16z crypto: What impact does the Solana architecture have on MEV? Do leaders have too much authority to reorder transactions?

Anatoly Yakovenko: Our initial idea was to have more than one leader assigned to each slot. If we can get as close to the speed of light as possible, about 120 milliseconds, then you can run a discrete batch-time auction every 120 milliseconds globally. Users can choose among all available block producers: the nearest one, or the one offering the highest rebate. In theory this may be the most efficient way to run finance; either I optimize for latency and send my order to the nearest block producer, or I accept the delay and take the highest rebate. This is a theory, and we haven't tested multiple leaders per slot, but we are getting close to that goal, and I think it may be feasible, maybe next year.
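
A toy model of that choice (the multi-leader design is described above as untested, and the producer names and numbers below are invented): in each roughly 120 ms window, a user routes to the concurrent producer that best fits their preference for latency or rebate.

```python
# Toy routing choice between concurrent block producers in one auction window.
from dataclasses import dataclass

@dataclass
class Producer:
    name: str
    latency_ms: float   # network distance from the user
    rebate_bps: float   # rebate offered for routing the order here

producers = [Producer("nyc", 12, 1.0),
             Producer("tokyo", 95, 3.5),
             Producer("frankfurt", 40, 2.0)]

fastest = min(producers, key=lambda p: p.latency_ms)       # latency-sensitive order
best_rebate = max(producers, key=lambda p: p.rebate_bps)   # rebate-sensitive order

print(f"latency-first routing -> {fastest.name}")
print(f"rebate-first routing  -> {best_rebate.name}")
```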

I think once we achieve this solution, we will have a very powerful system that basically forces competition and minimizes MEV.

a16z crypto: What is your favorite system optimization in the Solana architecture?

Anatoly Yakovenko: I like the way we propagate blocks. This was one of our early ideas and one of the things we really had to get right. We can grow the number of nodes in the network a great deal and transmit a lot of data, but the egress load each node has to bear stays fixed and bounded.

At a high level, each leader, when creating a block, chops it into shreds and generates erasure coding for those shreds. It then transmits each shred to a node, and that node sends it on to other nodes in the network. Because the data is mixed with coding, once someone has received it the data is highly reliable: the number of nodes propagating it is so large that you would need something like 50% of nodes to fail, which is highly unlikely. So this is a very cool optimization, and it is very low cost and high performance.
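
A simplified sketch of the shred-plus-coding idea. Solana's Turbine uses Reed-Solomon erasure coding and a tree-shaped fanout; the toy below uses a single XOR parity shred, which can only rebuild one lost shred, purely to show why a receiver does not need to hear every piece.

```python
# Toy erasure coding: split a block into shreds, add one XOR parity shred,
# and recover the block even if one shred is lost in transit.
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def make_shreds(block: bytes, shred_size: int = 8) -> list[bytes]:
    shreds = [block[i:i + shred_size].ljust(shred_size, b"\0")
              for i in range(0, len(block), shred_size)]
    parity = reduce(xor_bytes, shreds)          # XOR of all data shreds
    return shreds + [parity]

def recover(shreds: list[bytes | None]) -> list[bytes]:
    missing = [i for i, s in enumerate(shreds) if s is None]
    assert len(missing) <= 1, "XOR parity can only rebuild one missing shred"
    if missing:
        rest = [s for s in shreds if s is not None]
        shreds[missing[0]] = reduce(xor_bytes, rest)   # rebuild the lost shred
    return shreds[:-1]                                 # drop the parity shred

block = b"a block propagated like BitTorrent across the validator set"
shreds = make_shreds(block)
shreds[2] = None  # pretend one shred was lost in transit
print(b"".join(recover(shreds)).rstrip(b"\0") == block)  # True
```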

a16z crypto: How do you see the future development of applications in cryptocurrency? How will users who are not familiar with blockchain adopt blockchain in the future?

Anatoly Yakovenko: I think we have some breakthrough applications in payments, because paying with cryptocurrency has clear advantages over traditional systems. Once regulations are in place and Congress passes a few bills, I think payments will become a breakthrough use case. Once we have a means of payment, I think the other side will develop as well, such as social applications: messaging apps, social graph apps. These apps are currently growing slowly, but I think they are on the verge of taking off and will reach truly significant numbers.

Once mainstream adoption is reached, it is possible to iterate, understand what people really want, and provide them with those products. People should use products for their utility, not for tokens.

a16z crypto: What advice do you have for builders in this field or outside of this field? Or for those who are curious about cryptocurrency and Web3?

Anatoly Yakovenko: What I want to say is that now is the best time. The market is at a macro low, there is not much noise, and you can focus on product-market fit. When the market turns around, those discoveries will greatly accelerate your growth. If you want to work in artificial intelligence, you shouldn't be afraid to start an AI company, or a crypto company, or any other company right now; you should go try and build those ideas.

But what I want to say is that people should try to create greater ideas rather than repeat what already exists. The best analogy I've heard is that when people discovered cement, everyone focused on using it to make bricks, and then someone thought: I can build a skyscraper. They figured out a way to combine steel, cement, and architecture that no one had thought of. The new tool is cement; you just need to figure out what the skyscraper is, and then go build it.
