Layer 2 scaling and its misconceptions: redefining scalability

Many proposed solutions to blockchain scalability mistakenly focus on increasing throughput. This ignores the effect throughput has on nodes: to keep processing blocks and storing network history, node operators face ever-stronger pressure to upgrade their hardware, which in turn erodes the decentralization of the network.

Blockchain scalability has always been a hot topic. Almost every blockchain network advertises its TPS (transactions per second) as a selling point. But TPS is not an effective yardstick for comparing blockchain networks, and it makes evaluating their relative performance harder, not easier. Moreover, high TPS usually comes at a high cost, which raises a question: are these networks actually scalable, or have they merely increased their throughput?

To answer this question, let's look at how scalability should be defined, the trade-offs required to achieve it, and why Validity Rollups are the ultimate scaling solution.

Not all transactions are the same

First, let us substantiate the claim that a raw TPS figure is not an accurate measure of scalability.

To compensate nodes for executing transactions, and to prevent users from attacking the network with spam transactions, blockchains charge a fee based on the computational burden a transaction imposes on the chain. On Ethereum, this computational burden is measured in gas. Because gas is such a convenient metric for transaction complexity, this article will use the term for non-Ethereum blockchains as well, even though it is strictly an Ethereum concept.

Transactions vary enormously in complexity, so the gas they consume varies too. Bitcoin was the first network to enable trustless peer-to-peer transactions, but it supports only basic Bitcoin Script, and a simple transfer between addresses uses very little gas. By contrast, smart-contract chains such as Ethereum and Solana support virtual machines and Turing-complete programming languages, enabling far more complex transactions. A dApp like Uniswap therefore requires considerably more gas.
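
As a rough illustration (not from the original article), here is a minimal Python sketch of how a fee follows gas consumption. The 21,000 figure is Ethereum's fixed base cost for a plain ETH transfer; the swap figure is only an assumed ballpark:

```python
# Minimal sketch: fee = gas consumed x price per unit of gas.
# 21,000 gas is Ethereum's fixed cost for a plain ETH transfer;
# the swap figure below is an assumed ballpark, not a measured value.

GWEI_IN_ETH = 1e-9

def tx_fee_eth(gas_used: int, gas_price_gwei: float) -> float:
    """Fee in ETH for a transaction."""
    return gas_used * gas_price_gwei * GWEI_IN_ETH

for name, gas in [("simple transfer", 21_000), ("DEX swap (assumed)", 150_000)]:
    print(f"{name}: {gas:,} gas -> {tx_fee_eth(gas, 30):.6f} ETH at 30 gwei")
```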

This is why comparing the TPS of different blockchains is meaningless. What should be compared is computational throughput.

Every blockchain has two parameters, block size and block time, which determine how many units of computation fit into each block and how quickly new blocks are added. Together, these two variables determine the blockchain's throughput.
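
To make that concrete, a small sketch (with assumed, not current, parameter values) shows why the same throughput yields very different TPS numbers depending on the transaction mix:

```python
# Throughput = computation per block / time per block.
# Parameter values below are assumptions for illustration; real chains
# change their block gas limits and block times over time.

gas_per_block = 30_000_000   # assumed block gas limit
block_time_s = 12            # assumed average block time in seconds

throughput = gas_per_block / block_time_s
print(f"throughput: {throughput:,.0f} gas/s")

# Identical throughput, very different TPS depending on what the txs do:
for name, gas_per_tx in [("simple transfers", 21_000), ("DEX swaps", 150_000)]:
    print(f"max TPS if all txs are {name}: {throughput / gas_per_tx:.0f}")
```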

What limits scalability

Blockchains aim to be maximally decentralized and inclusive networks. In pursuing that goal, they are constrained by two fundamental factors: hardware requirements and state growth.

Hardware requirements

How decentralized a blockchain network is comes down to the ability of the weakest node in the network to verify the chain and maintain its state. The cost of running a node (hardware, bandwidth, storage) should therefore be kept as low as possible, so that as many people as possible can participate in the trustless network without permission.

State growth

State growth refers to the rate at which the blockchain's state expands. The more throughput a blockchain allows per unit of time, the faster it grows. Full nodes store the network's history, and they must also be able to verify the network's state. Ethereum stores and references its state using efficient tree structures. As the state grows, new leaves and branches are added, making certain operations more complex and time-consuming. Worst-case execution therefore degrades as the chain grows, steadily increasing the time needed to verify new blocks, and with it the total time required to synchronize a full node.
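
The trend can be illustrated with a toy binary Merkle tree. Ethereum actually uses a Merkle Patricia trie, which is more involved, so treat this as a sketch of the principle rather than Ethereum's real structure: the work to update or prove a single leaf grows with the depth of the tree, i.e. with the logarithm of the state size.

```python
# Toy binary Merkle tree: proving or updating one leaf touches one hash
# per tree level, so the cost grows with log2(state size). Ethereum's
# real structure is a Merkle Patricia trie; this only shows the trend.

import hashlib
import math

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                  # duplicate the last node if odd
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

print("root:", merkle_root([f"account-{i}".encode() for i in range(8)]).hex()[:16])

# As state grows, every update/proof path gets longer:
for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"{n:>13,} accounts -> ~{math.ceil(math.log2(n))} hashes per path")
```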

Adverse effects of increased throughput

Number of nodes

Minimum requirements for running a node, and approximate node counts:

Bitcoin: 350 GB of disk space, 5 Mbit/s connection, 1 GB RAM, CPU > 1 GHz. Number of nodes: ~10,000

Ethereum²: 500 GB+ SSD, 25 Mbit/s connection, 4-8 GB RAM, 2-4 CPU cores. Number of nodes: ~6,000

Solana³: 1.5 TB+ SSD, 300 Mbit/s connection, 128 GB RAM, 12+ CPU cores. Number of nodes: ~1,200

The pattern is clear: the higher the CPU, bandwidth, and storage requirements a blockchain's throughput imposes on its nodes, the fewer nodes can participate in the network, and the weaker its decentralization and inclusiveness become.

Time required to synchronize full nodes

When a full node is run for the first time, it must synchronize the entire chain, downloading and verifying the network's state from the genesis block up to the tip of the chain. This process should be as fast and efficient as possible so that anyone can become a participant in the permissionless protocol.

Taking Jameson Lopp's 2020 and 2021 node synchronization tests as a reference, Table 1 compares the time required to synchronize a full node for Bitcoin, Ethereum, and Solana on an ordinary consumer machine.

Table 1: Blockchain throughput and node synchronization comparison

Table 1 shows that increasing throughput leads to longer synchronization times, simply because more data must be processed and stored.
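
A back-of-the-envelope model (all figures assumed) captures why: a fresh full node must re-verify the whole history, so sync time scales with the chain's cumulative throughput.

```python
# Back-of-the-envelope: time to sync = total historical work / the speed
# at which one node can re-verify it. All figures are assumptions.

def sync_days(chain_age_days: float,
              chain_gas_per_s: float,
              node_verify_gas_per_s: float) -> float:
    total_gas = chain_age_days * 86_400 * chain_gas_per_s
    return total_gas / node_verify_gas_per_s / 86_400

# Same node hardware (assumed 50M gas/s verify speed), chains of the same
# age but different throughput: sync time grows in direct proportion.
for chain_gas_s in (1e6, 5e6, 25e6):
    days = sync_days(chain_age_days=5 * 365,
                     chain_gas_per_s=chain_gas_s,
                     node_verify_gas_per_s=50e6)
    print(f"chain at {chain_gas_s:>12,.0f} gas/s -> {days:6.1f} days to sync")
```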

Node software can mitigate the challenges of a growing chain through continuous improvement, such as reduced disk footprint, faster synchronization, more robust crash recovery, and modularization of components. But such improvements clearly cannot keep pace with ever-increasing throughput.

How should scalability be defined

Scalability is the most misunderstood term in the blockchain field. Increasing throughput is desirable, but it is only one piece of the puzzle.

Scalability means processing more transactions on the same hardware. It can accordingly be divided into two categories: sequencing scalability and verification scalability.

Sequencing scalability

Sequencing is the act of ordering and processing transactions in the network. As mentioned earlier, any blockchain can modestly raise its throughput by enlarging its blocks and shortening its block time, up to the point where doing so significantly harms decentralization. But the gains from adjusting these simple parameters are limited. In theory, the Ethereum Virtual Machine could reach up to 2,000 TPS, yet in the long run even that cannot satisfy the demand for block space.

To scale sequencing, Solana has made some impressive innovations: a parallelizable execution environment and a clever consensus mechanism deliver more efficient throughput. Yet despite these improvements, the approach is neither sufficient nor truly scalable: as Solana raises its throughput, the hardware cost of running a node and processing transactions rises with it.

Verification scalability

Verification scalability describes ways of increasing throughput without burdening nodes with ever-rising hardware costs. Specifically, it refers to cryptographic innovations such as validity proofs, which are what allow Validity Rollups to scale a blockchain sustainably.

What is a Validity Rollup?

Validity Rollups (also known as ZK-Rollups) move computation and state storage off-chain while keeping a small amount of data on-chain. A smart contract on the underlying blockchain maintains the rollup's state root. On the rollup, a batch of highly compressed transactions is sent, together with the current state root, to an off-chain prover. The prover executes the transactions, generates a validity proof of the result and the new state root, and sends them to a verifier on-chain. The verifier checks the validity proof, and the smart contract that maintains the rollup's state updates it to the new state root supplied by the prover.
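
The flow can be sketched as follows. Note that the "proof" here is a stand-in hash commitment for brevity; real systems use succinct cryptographic proofs (e.g. STARKs or SNARKs), and none of the names below come from any actual rollup codebase.

```python
# Toy sketch of the Validity Rollup flow: heavy execution happens
# off-chain in the prover; the on-chain contract stores only a state
# root and updates it after checking a proof. The "proof" here is a
# stand-in hash, NOT a real validity proof.

import hashlib

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"".join(parts)).digest()

def prover(old_root: bytes, batch: list[bytes]) -> tuple[bytes, bytes]:
    """Off-chain: execute the batch, emit (new state root, proof)."""
    new_root = h(old_root, *batch)            # stand-in for state transition
    proof = h(b"proof", old_root, new_root)   # stand-in for a validity proof
    return new_root, proof

class RollupContract:
    """On-chain contract: holds the rollup's state root."""
    def __init__(self, genesis_root: bytes):
        self.state_root = genesis_root

    def update_state(self, new_root: bytes, proof: bytes) -> None:
        # A real verifier checks a succinct proof; this checks the stand-in.
        assert proof == h(b"proof", self.state_root, new_root), "invalid proof"
        self.state_root = new_root

contract = RollupContract(h(b"genesis"))
root, proof = prover(contract.state_root, [b"tx1", b"tx2", b"tx3"])
contract.update_state(root, proof)
print("new rollup state root:", contract.state_root.hex()[:16])
```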

How do Validity Rollups achieve scalability on the same hardware?

Even though the prover does require high-end hardware, it does not affect the decentralization of the blockchain, because the validity of the transactions is guaranteed by mathematically verifiable proofs.

What matters is the hardware required to verify the proofs. Because the data involved is highly compressed and the computation is largely abstracted away by the proof, the impact on the underlying blockchain's nodes is minimal.

Verifiers (Ethereum nodes) do not need high-end hardware, and the size of a batch (the set of transactions bundled together) does not increase the hardware requirement. Only the state transition and a small amount of data need to be processed and stored by the node. This allows every Ethereum node to verify Validity Rollup batches on its existing hardware.

The more transactions, the cheaper they get

On a blockchain, more transactions normally means higher costs for everyone: as block space fills up, users must outbid one another in a fee market to get their transactions included in a block.

With a Validity Rollup, the situation is reversed. Verifying a batch of transactions on Ethereum has a certain cost, but that cost grows far more slowly than the number of transactions in the batch. In other words, adding more transactions to a batch makes each transaction cheaper, even as the total verification fee rises slightly. A Validity Rollup therefore wants as many transactions in a batch as possible, so that the verification fee is shared among all of its senders. As the batch size grows without bound, the fee per transaction approaches zero. The more transactions on a Validity Rollup, the cheaper it gets for everyone.
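
A small sketch of the amortization argument, with assumed gas figures:

```python
# Amortization sketch: a batch costs a large, nearly fixed amount of gas
# to verify on L1, plus a small per-transaction cost for compressed data.
# Both figures below are assumptions for illustration.

FIXED_VERIFY_GAS = 1_000_000   # assumed on-chain proof verification cost
PER_TX_DATA_GAS = 300          # assumed cost of one tx's compressed data

def gas_per_tx(batch_size: int) -> float:
    return FIXED_VERIFY_GAS / batch_size + PER_TX_DATA_GAS

for n in (10, 100, 1_000, 12_000):
    print(f"batch of {n:>6,} txs -> {gas_per_tx(n):>10,.1f} gas per tx")
# The per-tx share of the fixed cost vanishes as the batch grows;
# the cost per transaction approaches the small data cost.
```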

dYdX is a dApp powered by a Validity Rollup, and batches of more than 12,000 transactions are common there. Comparing a Validity Rollup's gas consumption with the same volume of transactions on mainnet makes the cost savings obvious:

Processing one dYdX trade on the Ethereum mainnet: 200,000 gas

Processing one dYdX trade on StarkEx: <500 gas

Put another way: the dominant cost of a Validity Rollup scales linearly with the number of users in a batch, not the number of transactions.

Why Optimistic Rollups don't deliver the scalability people think

In theory, Optimistic Rollups offer almost the same scalability benefits as Validity Rollups. But there is an important difference: Optimistic Rollups optimize for the average case, while Validity Rollups optimize for the worst case. Because blockchain systems operate in highly adversarial environments, optimizing for the worst case is the only way to achieve security.

In the worst case for an Optimistic Rollup, malicious validators do not check users' transactions. To protect themselves against fraud, users must therefore synchronize both an Ethereum full node and an L2 full node, and recompute suspicious transactions themselves.

In the worst case for a Validity Rollup, a user only needs to synchronize an Ethereum full node to verify the validity proof, sparing themselves the computational burden.

Unlike Validity Rollups, the cost of an Optimistic Rollup scales linearly with transaction volume rather than with the number of users, which makes it more expensive.
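
Under stated assumptions, the contrast between the two cost models looks roughly like this (all figures are illustrative): an Optimistic Rollup posts every transaction's data on-chain, while a Validity Rollup can post just final state differences, so many trades by the same users share one state update plus one proof.

```python
# Contrast of the two cost models, with assumed illustrative figures:
# an Optimistic Rollup pays calldata for every transaction, while a
# Validity Rollup pays one proof verification plus per-user state diffs.

OR_GAS_PER_TX = 2_000         # assumed on-chain data cost per optimistic tx
VR_FIXED_VERIFY = 1_000_000   # assumed proof verification cost per batch
VR_GAS_PER_USER = 1_500       # assumed state-diff cost per active user

def optimistic_batch_gas(n_txs: int) -> int:
    return OR_GAS_PER_TX * n_txs                        # linear in transactions

def validity_batch_gas(n_users: int) -> int:
    return VR_FIXED_VERIFY + VR_GAS_PER_USER * n_users  # linear in users

# 12,000 trades placed by 1,000 active users:
print("optimistic rollup:", f"{optimistic_batch_gas(12_000):,}", "gas")
print("validity rollup:  ", f"{validity_batch_gas(1_000):,}", "gas")
```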

The last piece of the puzzle: permissionless access to rollup state

To be sure of transaction validity, a user only needs to run an Ethereum node. But users and developers also want to view and query the rollup's state and activity for all sorts of purposes. An indexing L2 node meets this need perfectly: it not only lets users see the transactions in the network, it is also critical infrastructure for the ecosystem. Indexers such as The Graph, Alchemy, and Infura, oracles such as Chainlink, and block explorers can all be served by a permissionless, indexing L2 node.

Conclusion

Many proposed solutions to blockchain scalability mistakenly focus on increasing throughput. This ignores the effect throughput has on nodes: to keep processing blocks and storing network history, node operators face ever-stronger pressure to upgrade their hardware, which in turn erodes the decentralization of the network.

With the advent of cryptographic validity proofs, a blockchain can achieve genuine scalability without forcing nodes into an endless cycle of costly upgrades, allowing broader decentralization. On the same hardware, more powerful and more complex computation can now serve far more transactions, inverting the usual fee-market dynamic in the process: the more activity on a Validity Rollup, the cheaper it gets.
