Long article: The Future of Blockchain 3.0 and Web3 from the Perspective of ICP


Author: 0xkookoo, former Bybit Tech Lead, now a consultant at Geek Web3

Author's Twitter: @0xkookoo

Editor: Faust, Geek Web3

Introduction

BTC proposed electronic cash, opening up the blockchain industry from 0 to 1

ETH proposed smart contracts, leading the blockchain industry from 1 to 100

ICP proposed Chain-key technology, driving the blockchain industry from 100 to 100,000,000

On January 3, 2009, the first BTC block was mined, and the blockchain has now been developing for 14 years. Looking back over those 14 years: the brilliance and greatness of BTC, the emergence of Ethereum, the passion of EOS's crowdfunding, the fateful struggle between PoS and PoW, Polkadot's interconnection of a thousand chains: one amazing technology after another, and countless extraordinary stories, have left countless insiders in awe.

Now, in 2023, what is the overall situation of the blockchain industry? Below are my thoughts, detailed in the section "Interpretation of the Public Chain Pattern" in this article.

BTC, with the legitimacy of introducing electronic cash, stands tall and is the industry's cornerstone.

ETH, with the programmability of introducing smart contracts and the composability of the L2 ecosystem, is the industry leader.

Cosmos, Polkadot, and others are attempting to dominate the world with cross-chain interoperability.

Various Ethereum killers emerge endlessly, leading in their respective fields.

But how will the entire blockchain industry develop in the next 10 years? Below are my thoughts.

Sovereignty is the only problem that blockchain needs to solve, including asset sovereignty, data sovereignty, speech sovereignty, etc. Otherwise, there is no need for blockchain;

Immutability is a sufficient condition, but not a necessary condition. As long as you can ensure that my sovereignty is not compromised, I don't care if you tamper with it. If everyone's assets in the world are tampered with and doubled in proportion, what's the difference?

Complete decentralization is impossible to achieve. No matter how it is designed, there will always be "innate" advantages or vested interests with a greater say, and there will always be people who choose not to participate. "Decentralized multi-centralization" is the ultimate pattern.

Transparency is a must. Isn't this whole social experiment for all of humanity to have a say and the right to protect their sovereignty? Although some people are lazy, some are willing to trust more professional individuals, and some choose to give up voting for maximum efficiency, this is also their active choice. As long as everything is transparent and there is no behind-the-scenes manipulation, I am willing to accept it. If I lose, it's because I'm not as skilled as others, survival of the fittest, which also conforms to the market economy.

Control over decentralized code execution is the core. Otherwise governance is pointless theater (as the Chinese idiom goes, pulling down your pants just to fart): after a week of public voting, the project team can still deploy a malicious version of the code, and even if it is not malicious, they are still toying with everyone. It can be said that half of the world now runs on code. If decentralized entities do not include control over code execution, how can people, including governments, dare to let the blockchain industry grow?

Infinite scalability at linear cost. As blockchain becomes increasingly integrated with real life, more people participate and demand grows. Infrastructure that cannot scale without bound, or that can do so only at prohibitive cost, is unacceptable.

Why ICP


Let's start with a story. In 2009, Alibaba proposed the "go IOE" (de-IOE) strategy, a major milestone that later underpinned the success of Alibaba's "Double 11".

Go IOE

The core content of the "go IOE" strategy is to remove IBM minicomputers, Oracle databases, and EMC storage devices, and implant the essence of "cloud computing" into Alibaba's IT genes. Among them,

I refers to IBM p-series minicomputers, with the AIX operating system (IBM's proprietary Unix system);

O refers to Oracle databases (RDBMS);

E refers to high-end EMC SAN storage.

The reasons for going IOE boil down to the following three points, of which the first is fundamental and the latter two are more indirect:

Inability to meet demand: Traditional IOE systems are difficult to adapt to the high-concurrency demands of Internet companies and cannot support large-scale distributed computing architectures;

High cost: Maintaining IOE was too expensive, with IBM minicomputers costing 500,000 yuan each and annual Oracle maintenance costing tens of thousands more;

Strong dependence: IOE systems create strong vendor lock-in; companies are "held hostage" by vendors such as IBM and Oracle and find it difficult to configure flexibly for their own needs.

So why was the "go IOE" strategy proposed in 2009 and not earlier?

Before this,

Alibaba's business scale and data volume had not reached a level where traditional IOE systems were difficult to adapt, so the need to go IOE was not urgent;

Domestic database products were not mature enough in terms of technology and quality to replace the role of IOE;

Internet thinking and the concept of cloud computing were not yet popular in China, and decentralized architecture was not a popular direction;

It may have taken some time for management and technical personnel to accumulate practical experience before realizing the problems and measures that needed to be taken.

In 2009,

Alibaba's business was expanding rapidly, the IOE systems could no longer support its scale, and cost issues became more pronounced;

Some open-source database products, such as MySQL, had a high degree of maturity and could serve as substitutes;

Internet thinking and cloud computing began to be widely circulated and applied in China, making it easier to promote the concept of "go IOE";

Wang Jian, a former Microsoft technology expert, joined Alibaba in 2008 with a global technical perspective and was deeply trusted by Jack Ma, proposing "go IOE."

But going IOE was not simply a matter of swapping old software and hardware for new; it meant replacing old ways with new ones, using cloud computing to completely transform the IT infrastructure. In other words, it was an industry-level change, not just a technical upgrade.


Three Stages of Enterprise Development

The development of an enterprise can be divided into 3 stages:

Start-up, from 0 to 1: shaping genes and organizational culture

Scale-up, from 1 to 100: rapid growth in small steps

Scale-out, from 100 to 100,000,000: unlimited expansion, broadening boundaries

Below, let's analyze the entire blockchain industry as if it were an enterprise.

Start-up / Blockchain 1.0 / BTC

The innovation of BTC lies in solving a problem that has plagued computer scientists for decades, namely how to create a digital payment system that can operate without the need for trust in any central authority.

However, BTC does have some limitations in its design and development, which provided market opportunities for subsequent blockchain projects such as Ethereum (ETH). Here are some of the main limitations:

Transaction throughput and speed: BTC's block time is about 10 minutes, and the size limit of each block restricts its transaction processing capacity (see the back-of-the-envelope estimate after this list). This means that during busy network periods, confirmations may take a long time and may require higher transaction fees.

Limited smart contract functionality: BTC's design is primarily as a digital currency, and its supported transaction types and script language functionality are relatively limited. This restricts BTC's application in complex financial transactions and decentralized applications (DApps).

Difficult to upgrade and improve: Due to BTC's decentralization and conservative design principles, major upgrades and improvements usually require widespread consensus in the community, which is difficult to achieve in practice, resulting in relatively slow progress for BTC.

Energy consumption issue: BTC's consensus mechanism is based on Proof of Work (PoW), which means that a large amount of computing resources are used for competition among miners, resulting in a significant amount of energy consumption. This has been criticized by people for its environmental impact and sustainability. Regarding this, everyone can also pay attention to EcoPoW, which partially alleviates this limitation.
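To make the throughput ceiling concrete, here is a back-of-the-envelope estimate in Python; the 1 MB block budget and 250-byte average transaction size are rough illustrative assumptions, not exact protocol constants:

```python
# Back-of-the-envelope estimate of BTC throughput.
# Assumptions (illustrative, not exact): ~1 MB of transaction data per block,
# ~250 bytes per average transaction, one block every ~600 seconds.
BLOCK_BYTES = 1_000_000
AVG_TX_BYTES = 250
BLOCK_INTERVAL_S = 600

txs_per_block = BLOCK_BYTES // AVG_TX_BYTES    # ~4000 transactions per block
tps = txs_per_block / BLOCK_INTERVAL_S         # ~6.7 transactions per second

print(f"~{txs_per_block} txs/block, ~{tps:.1f} tx/s")
```

A few transactions per second is orders of magnitude below what internet-scale applications need, which is exactly the gap the later chapters of this story try to close.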


Scale-up / Blockchain 2.0 / ETH

Ethereum's current Layer2 scaling approach can be seen as "vertical scaling," relying on the underlying Layer1 for security and data availability. Although it looks like a 2-layer structure, it is ultimately bounded by Layer1's processing capacity. Adding more layers, say Layer3 or Layer4, only adds complexity to the whole system and buys a little time. Moreover, by the law of diminishing marginal returns, each additional layer's overhead sharply reduces the scaling benefit. Vertical layering is like upgrading the hardware of a single machine, except that the "single machine" here is the entire ETH ecosystem.

As usage increases, so does users' demand for low cost and high performance. As an application on Layer1, Layer2 can reduce costs only so far; it is still ultimately limited by Layer1's base cost and throughput. This mirrors the demand curve in economics: as price falls, aggregate demand rises. Vertical scaling cannot fundamentally solve the scalability problem.

Ethereum is a towering tree, and everyone depends on its single root. Once the root can no longer absorb nutrients fast enough, people's needs go unmet.


Therefore, only horizontal scaling offers a realistic path to the infinite.

Some people believe that multi-chain cross-chain is also a form of horizontal expansion.

Take Polkadot as an example: it is a heterogeneous kingdom, where every country looks different, but every time something is to be done, a whole kingdom must be built;

Cosmos is a homogeneous kingdom, where every country is built from the same blueprint, but again, every time something is to be done, a whole kingdom must be established;

However, from an infrastructure perspective, both models are somewhat strange. For every new application, a whole kingdom must be built? Let's take an example to see just how strange this is:

Three months ago, I bought a Mac and developed a Gmail application on it;

Now I want to develop a YouTube application, but I have to buy a new Mac to develop it on. That is very strange.

And both of these approaches face the problem of high complexity in cross-chain communication when adding new chains, so they are not my first choice.


Scale-out / Blockchain 3.0 / ICP

To achieve scale-out, a complete set of underlying infrastructure is needed to support rapid horizontal expansion without reinventing the wheel.

A typical example that supports scale-out is cloud computing. The underlying templates such as VPC, subnet, network ACL, and security groups are all the same, and all machines are numbered and typed. Core components such as RDS and MQ support unlimited expansion, and if more resources are needed, they can be quickly launched with the click of a button.

A leader once shared with me that if you want to understand what infrastructure and components internet companies need, you just need to look at all the services provided by AWS. It is the most comprehensive and powerful combination.

Similarly, let's take a high-level look at ICP and see why it meets the requirements for scale-out.


Here are a few concepts to start with:

Dfinity Foundation: A non-profit organization dedicated to advancing the development and application of decentralized computing technology. It is the developer and maintainer of the Internet Computer protocol, aiming to achieve comprehensive development of decentralized applications through innovative technology and an open ecosystem.

Internet Computer (IC): A high-speed blockchain network developed by the Dfinity Foundation, designed specifically for decentralized applications. It uses a new consensus algorithm to achieve high throughput and low-latency transaction processing, while also supporting the development and deployment of smart contracts and decentralized applications.

ICP: The native token of the Internet Computer Protocol, a digital currency used to pay network usage fees and reward nodes. (The protocol and the token share the name "ICP"; in this article, ICP usually refers to the network and protocol unless the context is clearly about the token.)

What's ICP

The following content may be a bit hardcore, but I have described it in layman's terms, and I hope everyone can keep up. If you want to discuss the details further, you can find my contact information at the top of the article.

Architecture Overview

From a layered structure perspective, from bottom to top:

P2P layer: Collects and forwards messages from users, from other replicas in the subnet, and from other subnets, ensuring that messages are delivered to all nodes in the subnet for security, reliability, and resilience.

Consensus layer: The main task is to sort the input to ensure that all nodes within the same subnet process tasks in the same order. To achieve this goal, the consensus layer uses a new consensus protocol designed to ensure security, liveness, and resistance to DOS/SPAM attacks. After consensus is reached within the subnet on the order of processing various messages, these blocks are passed to the message routing layer.

Message routing layer: Prepares input queues for each Canister based on the tasks from the consensus layer. After execution, it is also responsible for receiving the output generated by the Canister and forwarding it to local or other subnet Canisters as needed. It also records and verifies user-requested responses.

Execution layer: Provides the runtime environment for Canisters, reads inputs in order according to the scheduling mechanism, calls the corresponding Canister to complete the task, and returns the updated state and generated outputs to the message routing layer. It uses randomness supplied by the protocol to ensure the fairness and auditability of computation, because in some cases Canister behavior needs to be unpredictable: for example, random numbers strengthen cryptographic operations, and unpredictable execution results make it harder for attackers to analyze Canister outputs to find vulnerabilities or anticipate behavior. (A toy sketch of this four-layer pipeline follows.)
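To make the division of labor concrete, here is a minimal, purely illustrative Python sketch of the four-layer pipeline. All class and field names are invented, and a simple sort stands in for the real consensus ordering; the actual protocol is far more involved:

```python
# A toy model of the IC's four-layer pipeline: gossip -> order -> route -> execute.
from collections import deque

class P2PLayer:
    """Gossips user/replica messages into a local pool."""
    def __init__(self):
        self.pool = []
    def receive(self, msg):
        self.pool.append(msg)

class ConsensusLayer:
    """Orders pooled messages into a block the whole subnet agrees on."""
    def make_block(self, pool):
        block = sorted(pool)   # stand-in for the real agreement on ordering
        pool.clear()
        return block

class MessageRouting:
    """Prepares per-canister input queues from the ordered block."""
    def route(self, block):
        queues = {}
        for target, payload in block:
            queues.setdefault(target, deque()).append(payload)
        return queues

class ExecutionLayer:
    """Runs each canister over its input queue, updating its state."""
    def __init__(self):
        self.state = {}
    def run(self, queues):
        for canister, q in queues.items():
            while q:
                self.state[canister] = self.state.get(canister, 0) + q.popleft()
        return self.state

p2p, consensus = P2PLayer(), ConsensusLayer()
routing, execution = MessageRouting(), ExecutionLayer()
for m in [("counter", 2), ("counter", 3), ("ledger", 10)]:
    p2p.receive(m)
block = consensus.make_block(p2p.pool)
print(execution.run(routing.route(block)))   # {'counter': 5, 'ledger': 10}
```

The key property the real layers guarantee is the same one this toy shows: every replica feeds the same ordered inputs into the same deterministic execution, so all replicas compute the same state.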


Key Components


From a compositional perspective:

Subnet: Supports unlimited expansion, with each subnet being a small blockchain. Subnets communicate with each other using Chain Key technology, as consensus has already been reached within the subnet, so verification through Chain Key is sufficient.

Replica: Each subnet can have many nodes, and each node runs a replica. The IC's consensus mechanism ensures that every replica in the same subnet processes the same inputs in the same order, so that every replica ends up in the same final state. This mechanism is called a Replicated State Machine (see the toy sketch after this list).

Canister: A smart contract running on the ICP network, a computing unit that stores data and code and communicates with other Canisters or external users. ICP provides a runtime environment that executes the Wasm program inside a Canister and exchanges messages with other Canisters and external users. Think of it as a Docker-like container into which you inject a Wasm code image to run.

Node: An independent server. Canisters still need physical machines to run on, and these physical machines are the actual servers in data centers.

Data Center: Nodes in the data center are virtualized into a replica using the node software IC-OS, and a few replicas are randomly selected from multiple data centers to form a subnet. This ensures that even if a data center is compromised or experiences a natural disaster, the entire ICP network continues to operate normally, somewhat like an upgraded version of Alibaba's "two-site three-center" disaster recovery and high availability solution. Data centers can be distributed worldwide, and even in the future, a data center could be built on Mars.

Boundary Nodes: Provide entry and exit points between the external network and IC subnets, and verify responses.

Principal: Identifiers for external users derived from public keys, used for access control.

Network Nervous System (NNS): A governance algorithm DAO that uses staked ICP for governance of the IC.

Registry: A database maintained by the NNS, containing mappings between entities (such as replicas, canisters, subnets), somewhat similar to how DNS works.

Cycles: The local token representing a CPU quota, used to pay for the runtime resources a Canister consumes. If I had to express it in Chinese, I would call it "computation cycles," since Cycles are mainly the unit in which computational resources are paid for.
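Here is a toy sketch of the replicated-state-machine idea from the Replica entry above; the operations and the replica count are made up for illustration:

```python
# Toy replicated state machine: identical inputs in identical order
# yield identical state on every replica. Purely illustrative.
class Replica:
    def __init__(self):
        self.state = 0
    def apply(self, op):
        kind, value = op
        if kind == "add":
            self.state += value
        elif kind == "mul":
            self.state *= value

ordered_inputs = [("add", 5), ("mul", 3), ("add", 2)]  # order fixed by consensus

replicas = [Replica() for _ in range(4)]   # e.g. 4 replicas in one subnet
for op in ordered_inputs:
    for r in replicas:
        r.apply(op)

assert len({r.state for r in replicas}) == 1   # all replicas agree
print([r.state for r in replicas])             # [17, 17, 17, 17]
```

Note that this only works because execution is deterministic; that is why the execution layer treats randomness carefully, drawing it from the protocol rather than from each machine independently.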


Key Innovative Technologies of ICP


From a foundational perspective, Chain-key technology is used, including:

Publicly Verifiable Secret Sharing scheme (PVSS Scheme): Used in the Internet Computer protocol's whitepaper to achieve decentralized key generation (DKG) to ensure that node private keys are not leaked during the generation process.

Forward-secure public-key encryption scheme: Ensures that even if a private key is leaked, previous messages cannot be decrypted, thereby enhancing system security.

Key resharing protocol: A key sharing scheme based on threshold signatures used for key management in the Internet Computer protocol. The main advantage of this protocol is that it can share existing keys with new nodes without creating new keys, reducing the complexity of key management. Additionally, the protocol uses threshold signatures to protect the security of key sharing, thereby enhancing system security and fault tolerance.

Threshold BLS signatures: ICP implements a threshold signature scheme in which each subnet has one publicly verifiable public key, while the corresponding private key is split into multiple shares, each held by one replica in the subnet. A message is valid only once a sufficient number of replicas in the same subnet have signed it. This way, messages passed between subnets and replicas are signed and quickly verifiable, ensuring integrity and authenticity (see the toy sketch after this list). BLS is a well-known threshold signature algorithm, and the only one that yields such a simple and efficient threshold signature protocol; its signatures are also unique, meaning that for a given public key and message there is only one valid signature.

Non-interactive Distributed Key Generation (NIDKG): To securely deploy the threshold signature scheme, Dfinity designed, analyzed, and implemented a new DKG protocol that operates on an asynchronous network with high robustness (even if up to one-third of the nodes in the subnet crash or become compromised, it can still succeed) while still providing acceptable performance. In addition to generating new keys, this protocol can also be used for resharing existing keys. This feature is crucial for the autonomous evolution of the IC topology over time as subnets change their membership.

PoUW: PoUW, with the additional "U" for "Useful," improves performance and reduces unnecessary work for nodes compared to PoW. It does not artificially create difficult hash calculations and focuses computational power on serving users. Most resources (CPU, memory) are used for executing code in actual canisters.

Chain-evolution technology: This technology maintains the blockchain state machine and ensures its security and reliability. In the Internet Computer protocol, Chain-evolution technology mainly includes the following two core technologies:

  1. Summary blocks: The first block of each epoch is a summary block containing special data for managing different threshold signature schemes. A low-threshold scheme is used to generate random numbers, while a high-threshold scheme is used to authenticate the replicated state of subnets.

  2. Catch-up packages (CUPs): CUPs are used to quickly synchronize node states, allowing newly added nodes to quickly obtain the current state without needing to rerun the consensus protocol.
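To build intuition for the threshold idea, here is a toy t-of-n Shamir secret sharing in Python. The real IC uses threshold BLS over elliptic-curve groups with non-interactive DKG, so nothing below is the actual protocol; the sketch only shows why no single replica ever holds the subnet key, and why any t shares suffice:

```python
# Toy t-of-n Shamir secret sharing over a prime field. Intuition only.
import random

P = 2**127 - 1   # a Mersenne prime; real schemes use curve group orders

def split(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

key = random.randrange(P)
shares = split(key, t=3, n=5)              # e.g. 5 replicas, threshold 3
assert reconstruct(shares[:3]) == key      # any 3 shares suffice
assert reconstruct(shares[1:4]) == key
print("threshold reconstruction OK")
```

Fewer than t shares reveal nothing about the key, which is why compromising a minority of replicas does not compromise the subnet.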

I deduce the logic of the entire underlying technology of IC as follows:

In traditional public-key cryptography, each node has its own public-private key pair, meaning that if a node's private key is compromised or attacked, the security of the entire system is threatened. The threshold signature scheme divides a key into multiple parts, distributed to different nodes, and only when a sufficient number of nodes cooperate can a signature be generated. This means that even if some nodes are compromised or leaked, it will not have a significant impact on the overall system security. Additionally, the threshold signature scheme can increase the decentralization of the system, as it does not require a centralized entity to manage keys, but rather disperses keys to multiple nodes, avoiding single points of failure and centralization risks. Therefore, IC uses the threshold signature scheme to enhance system security and decentralization, aiming to achieve a secure, scalable, and quickly verifiable general-purpose blockchain using threshold signatures.

BLS is a well-known threshold signature algorithm, uniquely capable of producing a very simple and efficient threshold signature protocol. Additionally, BLS signatures have the advantage of not requiring signature state storage, meaning that for a given public key and message, there is only one valid signature. This ensures high scalability, which is why ICP chose the BLS scheme.

Because a threshold signature scheme is used, a distributor is needed to hand key fragments to the different participants, but a single distributor is a single point of failure. Dfinity therefore designed a distributed key generation technology, NIDKG. During subnet creation, all participating replicas jointly and non-interactively generate a public key A; for the corresponding private key B, each participant mathematically derives and holds one secret share.

To implement NIDKG, it is necessary to ensure that each distributed participant does not cheat. Therefore, each participant can not only obtain their own secret share but also publicly verify the correctness of their secret share, which is a crucial point in achieving distributed key generation.

What if a subnet's key at some historical moment is leaked? How can the integrity of historical data be guaranteed? Dfinity adopts a forward-secure signature scheme, which ensures that even if the subnet key of some historical moment leaks, an attacker cannot alter the data of historical blocks; this also prevents later corruption attacks on the chain's history. If this guarantee is strengthened further, it can even protect information in transit from successful eavesdropping, since the timestamps will not match: even a key cracked within a short window cannot be used to decipher past communications.

With NIDKG in place, if a given secret share is held by one node for a long time while nodes are gradually compromised by hackers, the whole network can run into problems, so keys must be refreshed continuously. But the refresh cannot require all participating replicas to gather for interactive communication; it too must be non-interactive. And because public key A is already registered in the NNS, and other subnets use this public key A for verification, it is best not to change the subnet public key. So if the subnet public key stays fixed, how can the secret shares held by the nodes be refreshed? Dfinity designed a Key resharing protocol for this: without creating a new public key, all replicas holding the current version of the secret shares non-interactively generate a new round of derived secret shares for the new share holders. This ensures that the new shares are authenticated by all currently legitimate share holders, that the old shares become invalid, and that even if a future version of the shares leaks, the old versions cannot be derived from it, because the polynomials of the two versions are completely unrelated and cannot be reverse-engineered. This is the forward security mentioned earlier. (A toy refresh sketch follows below.)
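Here is a toy "refresh" sketch in the spirit of the resharing idea, using the same Shamir construction as the earlier sketch: the secret (and hence the public key) stays fixed, every share changes, and old shares cannot be combined with new ones. This is intuition only, not Dfinity's actual protocol:

```python
# Toy proactive refresh: add a fresh sharing of zero to every share.
# The secret is unchanged, but shares from different epochs don't mix.
import random

P = 2**127 - 1

def split(secret, t, n):
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    s = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        s = (s + yi * num * pow(den, P - 2, P)) % P
    return s

key = random.randrange(P)
old = split(key, t=3, n=5)
zero = split(0, t=3, n=5)                       # fresh sharing of zero
new = [(x, (y + z) % P) for (x, y), (_, z) in zip(old, zero)]

assert reconstruct(new[:3]) == key              # same key, new shares
mixed = [old[0], new[1], new[2]]
assert reconstruct(mixed) != key                # epochs don't mix
print("refresh OK: key unchanged, old shares obsolete")
```

Because the zero-sharing polynomial is fresh randomness, old and new shares lie on unrelated polynomials, which is the toy analogue of the "cannot be reverse-engineered" property described above.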

It also provides efficient re-randomized redistribution: when trusted nodes or access control change, access policies and controllers can be modified at any time without restarting the system. This greatly simplifies key management in many scenarios. For example, it is very useful when subnet membership changes, since resharing guarantees that every new member receives an appropriate secret share while replicas that are no longer members lose theirs. Furthermore, if a small number of secret shares leak to an attacker at any time, or even at every refresh, those shares are of no use to the attacker.

Because traditional blockchain protocols require storing all block information from the genesis block, as the blockchain grows, this can lead to scalability issues, making it very difficult for many public chains to develop a light client. Therefore, IC aims to solve this problem and has developed Chain-evolution Technology. At the end of each epoch, all processed input and required consensus information can be safely cleared from the memory of each replica, greatly reducing the storage requirements for each replica. This allows IC to scale to support a large number of users and applications. Additionally, Chain-evolution technology includes CUPs, which allow newly added nodes to quickly obtain the current state without needing to rerun the consensus protocol, greatly reducing the barriers and synchronization time for new nodes joining the IC network.

In conclusion, all the underlying technologies of IC are interconnected, based on cryptography (from theory), and fully consider industry challenges such as fast node synchronization (from practice). Truly a comprehensive solution!

Key Features of ICP

Reverse Gas Model: Traditional blockchains usually require users to first hold the native token, such as ETH or BTC, and then spend it on transaction fees. This raises the barrier to entry for new users and does not match ordinary user habits. ICP adopts a reverse gas model: users can use the ICP network directly, with the project side paying the fees. This lowers the barrier to use, matches internet service habits better, and supports a larger network effect, enabling more users to join.

Stable Gas: On other public chains, to secure the chain and to transact, people buy the native token and miners mine hard, or people hoard the token, thereby contributing hash power to the chain, as with Bitcoin, or staking security, as with Ethereum. Our demand for btc/eth really stems from the computing power/staking requirements of the Bitcoin/Ethereum chains, which are in essence the chain's security requirements. So as long as a chain pays for gas directly in its native token, gas will remain expensive: the token may be cheap today, but as the chain's ecosystem grows it will become dearer. ICP is different. Gas on the ICP blockchain is called Cycles, obtained by consuming ICP, and under algorithmic adjustment it is stable, anchored to 1 SDR (the SDR can be viewed as a stable unit computed from a basket of national fiat currencies). So no matter how much the price of ICP rises, the money you spend to do anything on ICP stays the same as today (excluding inflation). A sketch of the conversion follows below.
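A small sketch of the stable-gas arithmetic: the trillion-cycles-per-SDR peg is the commonly documented rate, while the ICP prices below are made-up inputs, not market data:

```python
# Stable gas: cycles are pegged to the SDR, so the fiat cost of compute
# stays flat even when the ICP market price moves.
CYCLES_PER_SDR = 1_000_000_000_000   # 1 trillion cycles per SDR (documented peg)

def cycles_for_icp(icp_amount, icp_price_in_sdr):
    """Cycles minted by burning `icp_amount` ICP at the given ICP/SDR price."""
    return int(icp_amount * icp_price_in_sdr * CYCLES_PER_SDR)

# The same 5 SDR of compute always costs 5 trillion cycles,
# but fewer ICP when ICP is dearer (hypothetical prices):
for icp_price in (3.0, 30.0):
    icp_needed = 5 / icp_price
    print(icp_price, round(icp_needed, 4), cycles_for_icp(icp_needed, icp_price))
```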

Wasm: By using WebAssembly (Wasm) as the standard for code execution, developers can use a variety of popular programming languages (such as Rust, Java, C++, Motoko, etc.) to write code, thereby supporting the participation of more developers.

Support for Running AI Models: Python can also be compiled to Wasm. Python has a huge user base worldwide and is the primary language of AI, for example for matrix and large-integer computation. Someone has already run the Llama2 model on IC; if the AI + Web3 narrative plays out on ICP in the future, I would not be surprised at all.

Web2 User Experience: Many applications on ICP already achieve astonishing results, with millisecond-level queries and second-level updates. If you don't believe it, try OpenChat, a purely on-chain decentralized chat application, for yourself.

Running the Frontend on Chain: You may have only heard of the backend being written into simple smart contracts and run on chain, so that core logic such as data assets cannot be tampered with. But the frontend also needs to run entirely on chain to be safe, because frontend attacks are a typical and frequent problem. Imagine: everyone might think Uniswap's code is very secure; the smart contract has been verified by so many people over so many years and the code is simple, so surely nothing can go wrong. But if one day Uniswap's frontend is hijacked, and the contract you are actually interacting with is a malicious one deployed by hackers, you could lose everything in an instant. If, however, you store and deploy all of the frontend code in an IC Canister, then at minimum IC's consensus guarantees that the frontend code cannot be tampered with by hackers, which is far more complete protection. In addition, IC can run and render the frontend directly without affecting the application's normal operation. On IC, developers can build applications without traditional cloud services, databases, or payment interfaces; there is no need to buy a frontend server or worry about databases, load balancing, content distribution, or firewalls. Users can access frontend pages deployed on ICP directly from a browser or mobile app, for example a personal blog I deployed earlier.

DAO-Controlled Code Upgrades: Many DeFi protocols today leave full control with the project team, which can make major decisions such as suspending operations or selling funds without any community vote or deliberation. Dapps in ICP's ecosystem run in canisters controlled by a DAO: even if a project team holds a large share of the vote, a public voting process is still followed, satisfying the necessary condition of transparent blockchain governance described at the start of this article. This better reflects the community's will and achieves a higher standard of governance than other current public chain projects.

Automatic Protocol Upgrades: When a protocol upgrade is needed, a new threshold signature scheme can be added in the summary block, enabling automatic protocol upgrades. This maintains network security and reliability while avoiding the inconvenience and risk of hard forks. Concretely, ICP's Chain-key technology maintains the blockchain state machine through a dedicated signature scheme: at the start of each epoch, the network uses a low-threshold signature scheme to generate randomness and a high-threshold scheme to authenticate the replicated state of the subnet.

Fast Forwarding: Fast Forwarding is a technology in the Internet Computer protocol that allows newly added nodes to quickly obtain the current state without needing to rerun the consensus protocol. Specifically, the process of Fast Forwarding is as follows:

  1. The newly added node obtains the Catch-up package (CUP) for the current epoch, which includes the Merkle tree root, summary block, and random number for the current epoch.

  2. The newly added node uses the state sync subprotocol to obtain the complete state for the current epoch from other nodes, while using the Merkle tree root in the CUP to verify the correctness of the state.

  3. The newly added node uses the random number in the CUP and the protocol messages of other nodes to resume the consensus protocol from that point, quickly synchronizing to the current state.

The advantage of Fast Forwarding is that it allows newly added nodes to quickly obtain the current state without having to start from scratch, as is required by some other public chains. This can accelerate network synchronization and expansion, reduce communication between nodes, and improve network efficiency and reliability.
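Here is an illustrative sketch of the catch-up idea: the new node verifies a downloaded state against the digest carried in the CUP instead of replaying history. A single SHA-256 hash stands in for the real Merkle-tree commitment and its threshold signature:

```python
# Toy catch-up: verify fetched state against the CUP's state digest.
import hashlib, json

def state_digest(state: dict) -> str:
    return hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()

# Produced at the end of an epoch and (on the real IC) threshold-signed:
current_state = {"counter": 17, "ledger": 10}
cup = {"epoch": 42, "state_digest": state_digest(current_state)}

# A new node fetches the state from an arbitrary peer, then checks it locally:
fetched = {"counter": 17, "ledger": 10}
assert state_digest(fetched) == cup["state_digest"]   # peer was honest

tampered = {"counter": 999, "ledger": 10}
assert state_digest(tampered) != cup["state_digest"]  # tampering is caught

print("state verified against CUP for epoch", cup["epoch"])
```

Because the digest itself is authenticated by the subnet's threshold signature, the new node does not need to trust the peer it downloaded from, only the subnet key.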

Decentralized Internet Identity: The identity system on IC makes me feel that the DID problem can be completely solved, and thoroughly solved, in terms of both scalability and privacy. The current implementation version of the identity system on IC is called Internet Identity, and there is a more powerful NFID developed based on it.

Its principles are as follows:

  1. During registration, it generates a pair of public and private keys for the user. The private key is stored in the user's device's TPM security chip and will never be leaked, while the public key is shared with services on the network.

  2. When a user wants to log in to a dapp, the dapp creates a temporary session key for the user. The user authorizes this session key with a signature from the device key (e.g. unlocked via biometrics), allowing the dapp to verify the user's identity. (A structural sketch follows this list.)

  3. After the session key is signed, the dapp can use the key on behalf of the user to access network services without requiring the user to sign electronically each time. This is similar to authorized login in Web2.

  4. The session key has a short expiration period. After it expires, the user needs to reauthorize with biometric authentication to obtain a new session key.

  5. The user's private key is always stored in the local TPM security chip and does not leave the device. This ensures the security of the private key and the user's anonymity.

  6. By using the temporary session key, different dapps cannot track the user's identity. This achieves true anonymous and private access.

  7. Users can easily sync and manage their Internet Identity across multiple devices, but the devices themselves also require corresponding biometric or hardware keys for authorization.
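Here is a structural sketch of this delegated-session-key flow. Python's standard library has no asymmetric signing, so HMAC stands in for real public-key signatures; the flow, not the cryptography, is the point, and all field names are invented for illustration:

```python
# Toy delegation chain: device key vouches for a short-lived session key,
# the session key signs individual requests.
import hmac, hashlib, time, os

def sign(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

def verify(key: bytes, msg: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(key, msg), sig)

device_key = os.urandom(32)   # stays in the device's TPM, never leaves

# 1. dapp requests a session; a temporary session key is created
session_key = os.urandom(32)
delegation = {"session_pub": session_key.hex(),
              "expires": time.time() + 30 * 60}   # e.g. 30-minute lifetime
delegation_sig = sign(device_key, repr(delegation).encode())  # one-time approval

# 2. later requests are signed with the session key only
request = b"post_message('hello OpenChat')"
request_sig = sign(session_key, request)

# 3. the dapp checks the chain: device key vouches for the session key,
#    the session key vouches for the request, and the delegation is unexpired
assert verify(device_key, repr(delegation).encode(), delegation_sig)
assert verify(session_key, request, request_sig)
assert time.time() < delegation["expires"]
print("request accepted under delegated session key")
```

The design choice to sign a delegation once, then let the session key do the work, is what gives the Web2-like "log in once, then click around" experience without the device key ever leaving the TPM.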

Internet Identity has several advantages:

  1. No need to remember passwords. Users can directly log in using biometrics such as fingerprint recognition, without needing to set and remember complex passwords.

  2. Private keys do not leave the device, providing higher security. The private key is stored in the TPM security chip and cannot be stolen, addressing the issue of usernames and passwords being stolen in Web2.

  3. Anonymous login, cannot be tracked. Unlike Web2, where usernames correspond to email service providers and can be tracked across platforms, Internet Identity eliminates this tracking.

  4. Convenient management across multiple devices. Users can log in to the same account on any device that supports biometrics, rather than being limited to a single device.

  5. Not reliant on centralized identity providers, achieving true decentralization.

  6. Uses a delegate authentication process, eliminating the need for repeated signing in with each login, providing a better user experience.

  7. Supports using dedicated security devices such as Ledger or Yubikey for login, enhancing security.

  8. Hides the user's actual public key, ensuring user privacy by preventing transaction records from being queried using the public key.

  9. Seamlessly compatible with Web3 blockchains, allowing secure and efficient login and signing of blockchain DApps or transactions.

The architecture is more advanced, representing an organic fusion of the advantages of Web2 and Web3, and is the standard for future network accounts and logins.

In addition to providing a new user experience, it also adopts the following technical measures to ensure its security:

  1. Uses TPM security chips to store private keys, designed so that even developers cannot access or extract the private keys, preventing theft of private keys.

  2. Biometric authentication such as fingerprint or facial recognition serves as a secondary authentication mechanism, requiring verification with the device being used, ensuring that only the user holding the device can use the identity.

  3. The session key has a short expiration period, limiting the window for misuse, and the related key material is destroyed at the end of the session, reducing risk.

  4. Public key encryption technology ensures that data in transit is encrypted, preventing external eavesdroppers from obtaining the user's private information.

  5. Not reliant on third-party identity providers; the private key is generated and controlled by the user, with no trust placed in third parties.

  6. Combining the immutability brought by the IC blockchain consensus mechanism ensures the reliability of the entire system's operation.

  7. Continuously updating and upgrading related cryptographic algorithms and security processes, such as adding more secure mechanisms like multi-signature.

  8. Open-source code and decentralized design optimize transparency, facilitating community collaboration to enhance security.


Core Team

Looking at the team, there are a total of 200+ employees, all of whom are highly talented individuals. The employees have collectively published over 1600 papers, with over 100,000 citations, and hold a total of 250+ patents.

From an academic perspective, recent mathematical theories include Threshold Relay and PSC chains, Validation Towers and Trees, and USCID.

From a technical background perspective, they have a deep background in technical research and development, having been involved in research in the fields of big data and distributed computing in the early years, laying the technical foundation for building the complex ICP network.

From an entrepreneurial perspective, founder Dominic Williams is a cryptographer and serial entrepreneur. He previously ran an MMO game on his own distributed system that hosted millions of users; in 2015 he launched Dfinity, and he was also president and CTO of String Labs.

From a visionary perspective, he proposed the concept of a decentralized internet more than 10 years ago. Driving this grand project forward over the long term is no easy feat, and his design thinking remains highly forward-looking.

In terms of the technical team, Dfinity is very strong. The Dfinity Foundation has assembled a large number of top cryptography and distributed systems experts, such as Jan Camenisch, Timothy Roscoe, Andreas Rossberg, Maria D., and Victor Shoup; even Ben Lynn, the "L" in the BLS signature algorithm, works at Dfinity. This provides strong support for ICP's technical innovation. The success of a blockchain project depends on technology, and a gathering of top talent can produce technological breakthroughs, which is a key advantage of ICP.


Dfinity Foundation Fundraising & Tokenomics

If I were to cover this section as well, this article would become too long. Therefore, I have decided to write a separate article to provide a detailed analysis for everyone. This article focuses more on the development direction of the blockchain industry and why ICP has great potential.


Applications

All types of applications can be developed on ICP, including social platforms, creator platforms, chat tools, games, and even metaverse games.

Many people say that global state consistency is hard to achieve on IC, so it is naturally unsuited to DeFi. I think the framing itself is wrong: the hard part is not global state consistency, it is global state consistency under low latency. If you can accept a 1-minute delay, even 10,000 machines around the world can reach global consistency. With so many nodes, Ethereum and BTC are effectively forced into global state consistency under high latency, and as a result they cannot scale out horizontally without limit. IC solves horizontal, unbounded scalability through subnet sharding. As for global consistency under low latency, it can be approached with a strongly consistent distributed consensus algorithm, a well-designed network topology, high-performance distributed data synchronization, valid timestamp verification, and mature fault-tolerance mechanisms. That said, honestly, building a trading platform at the application layer on IC is harder than building the high-performance trading platforms Wall Street runs; it is not just about reaching consensus across multiple data centers. Difficult, however, does not mean impossible: many technical problems must be solved first, and eventually a middle ground will be found that preserves security while delivering an acceptable user experience. Take ICLightHouse below, for example.

ICLightHouse is a fully on-chain order-book DEX. What does fully on-chain mean? How many technical challenges does it entail? On other public chains this is not even thinkable; on IC, at the very least, it is doable, which gives us hope.


OpenChat is an excellent decentralized chat application. I have not seen a second product like it in the entire blockchain industry. Many other teams have tried this direction but ultimately failed on various technical problems, chiefly that users found the experience unsatisfactory, for example taking 10 seconds to send a message and another 10 to receive one. Yet on IC, a small team of three has built such a product, and the smoothness is something you have to experience for yourself. Join the community, where you can enjoy the collision of ideas and, to some extent, the pleasure of free speech.


Mora is a platform for super creators, where everyone can create their own planet and build a personal brand. The content you produce will always belong to you, and you can even support paid reading. It could be described as a decentralized knowledge planet; I now find myself refreshing its articles every day.



OpenChat and Mora are applications that I use almost every day, giving me a feeling of being unable to do without them, and the two words that describe them are freedom and fulfillment.

Some teams are already developing game applications on IC, and I think the narrative of fully on-chain games will eventually be taken over by IC. As I said in the GameFi section of an earlier article, playability and fun are things project teams must deliver, and playability is easier to achieve on IC. I look forward to the masterpiece from Dragginz.


Summary

ICP is like the Earth, and Chain-key technology is like the Earth's core; its relationship to ICP resembles that of TCP/IP to the entire internet industry. Each subnet is like a continent: Asia, Africa, Latin America; a subnet can equally be the Pacific or the Atlantic. On the continents and oceans there are different buildings and districts (Replicas and Nodes), and on each district and building plants (Canisters) can grow while different animals live happily. ICP supports horizontal expansion: each subnet operates autonomously while communicating with other subnets. No matter the application, social media, finance, or even the metaverse, final consensus can be reached over this distributed network. A global ledger is easy to achieve under synchronous conditions, but achieving "global state consistency" under asynchronous conditions is a great challenge; currently, only ICP has a real chance of doing it.

It is important to note that what is meant here is "eventual global state consistency," not "strong global state consistency." Strong global state consistency requires all participating nodes to [agree on the order of all operations], [agree on the final results], [be objectively consistent regardless of node failures], [have consistent clocks], and [be immediately consistent, with all operations processed synchronously]. This can be guaranteed within a single IC subnet. But to guarantee it globally, all subnets as a whole would have to achieve all of the above for the same data and state, which is impossible under low latency; this is exactly the bottleneck that keeps public chains like ETH from scaling horizontally. IC therefore chooses to reach consensus within each subnet and lets other subnets quickly verify the results through communication, achieving "eventual global state consistency." This effectively combines the decentralization of large public chains with the high throughput and low latency of consortium chains, while horizontal, unbounded scalability rests on mathematical and cryptographic proofs.

In conclusion, as per my initial thoughts on the ultimate development direction of blockchain, [sovereignty] + [decentralized multi-centralization] + [transparency] + [control over code execution] + [linear cost infinite scalability],

Sovereignty is the only problem that blockchain needs to solve, including asset sovereignty, data sovereignty, and speech sovereignty, otherwise there is no need for blockchain.

IC has completely achieved this.

Immutability is a sufficient condition, but not a necessary condition. As long as you can ensure that my sovereignty is not compromised, I don't care if you tamper with it. If everyone's assets are tampered with and doubled in the same proportion, what's the difference?

IC has also achieved this.

Complete decentralization is not possible. No matter how it is designed, there will always be "naturally" gifted individuals/established interests who have a greater say, and there will always be people who choose not to participate. [Decentralized multi-centralization] is the ultimate pattern.

IC is currently the best among all public chains, able to maintain a certain degree of decentralization while fully leveraging the advantages of centralized entities, thereby better achieving network governance and operation.

Transparency is a must. Isn't this whole social experiment about allowing everyone to have a say and the right to protect their sovereignty? Although some people are lazy, some are willing to trust more professional individuals, and some choose to abstain from voting for the sake of maximizing efficiency, these are all choices they make voluntarily. As long as everything is transparent, with no behind-the-scenes operations, I am willing to accept the outcome, whether it's a clear win or loss. If I lose, it's because I'm not as skilled as others, and this is in line with market economics.

IC has completely achieved this.

Control over code execution is the core. Otherwise it is pointless theater: even after a week of public voting, the project team can still deploy a malicious version of the code in the end, and even if it is not malicious, they are still toying with everyone.

Currently, only IC has achieved this.

Linear cost infinite scalability. As blockchain becomes increasingly integrated with real life and more people participate, the demand grows. Infrastructure that cannot support infinite scalability or is too expensive to scale is unacceptable.

Currently, only IC has achieved this.

Based on these facts and my analysis, I believe that ICP = Blockchain 3.0.

This article is just a discussion of why ICP is likely to be the innovation driver of Blockchain 3.0 from the perspective of the future development direction of the blockchain industry. However, it cannot be denied that there are indeed some issues in ICP's tokenomics design, and the ecosystem has not yet exploded. Currently, ICP still needs to continue to strive towards my ultimate vision of Blockchain 3.0. But there's no need to worry, this was always going to be difficult. Even the Dfinity Foundation has a 20-year roadmap ready, and they have already achieved so much in just 2 years since the mainnet launch. They have also used cryptographic methods to integrate with the BTC and ETH ecosystems, and I believe they will reach even greater heights in 3 years.

Long article: Looking at the future of blockchain 3.0 and web3 from ICP

Future

IC has now completed the bottom-up infrastructure construction, and the top-down applications are also beginning to take shape. My recent direct impression is that IC is becoming more and more capable, preparing for the next bull market.

IC is a paradigm shift, not just a simple technological upgrade: a shift from single-machine computing to distributed computing, and even more, from single-machine systems to distributed systems. The concept of decentralized cloud computing lets many small companies enjoy a one-stop development experience from the very start.

According to Yu Jun's product value formula: Product value = (New experience - Old experience) - Migration cost (a tiny worked example follows). In the future, as long as some people find that the experience gain of joining the IC ecosystem outweighs the migration cost, more people, project teams and users alike, will join. The scale effect of "cloud computing" will then show itself more easily, and once the "chicken and egg" problem is solved, IC's positive flywheel will be established.
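A trivial rendering of the formula with made-up scores, just to show how the trade-off reads:

```python
# Yu Jun's product value formula, as quoted above, with invented inputs.
def product_value(new_experience, old_experience, migration_cost):
    return (new_experience - old_experience) - migration_cost

# A developer who rates the IC one-stop experience 9, their current stack 6,
# and the porting effort 2 perceives positive value and migrates:
print(product_value(9, 6, 2))   # 1 > 0 -> worth switching
```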

Of course, everyone's definition of experience is subjective, so some people will choose to join first, while others will choose to join later. Those who join first bear greater risks, but usually also receive greater average returns.


