Multicoin Partners: Why Predictability Matters for Blockchain Scalability

Predictability means that developers can write code today knowing that it will keep working, and that executing it will cost less in the future than it does now.

This article assumes that readers are familiar with Nick Szabo’s “Money, Blockchains, and Social Scalability,” Vitalik Buterin’s “Proof of Stake: How I Learned to Love Weak Subjectivity,” and Haseeb Qureshi’s “Why Decentralization Isn’t as Important as You Think,” among other articles.

That said, this article is not intended to refute Szabo’s article or his position.

Szabo concludes his article by describing how blockchains achieve social scalability: by “sacrificing computational efficiency and scalability – consuming more cheap computational resources – in order to reduce and better leverage the huge human resource expenditures involved in modern institutions (e.g., markets, large corporations, and governments).”

The success of Bitcoin (BTC) relative to its forks, Bitcoin Cash (BCH) and Bitcoin SV (BSV), certainly supports this theory.

However, the crypto ecosystem has come a long way since Szabo published that eloquent article in February 2017. Although he proposed the idea of smart contracts some 20 years ago, it is only in the last two years that the industry has started to recognize the most useful application of smart contracts: DeFi.

The explosion of DeFi has changed everything.

This article assumes that the most important function of the blockchain is not the non-sovereign currency itself, but DeFi.

The shift in the system’s primary function from strictly censorship-resistant, non-sovereign currency to a highly functional and programmable development environment for financial applications implicates every layer of the technology stack, from the network layer (e.g., gossip vs. Turbine) to the execution environment (e.g., EVM vs. SeaLevel). (See the appendix of this article for more on how Solana differs from Ethereum.)

Therefore, the blockchain should be designed and managed as a DeFi development platform first and foremost.

Predictability is Power
People often ask me, “How do you think the world will change in the next 10 years?” It’s a very common question. But I’m almost never asked, “What’s going to stay the same over the next 10 years?” I actually think the second question is the more important one, because knowing its answer lets you build a business strategy around the things that remain stable… In retail, we know customers want lower prices, and I know that will still be true in 10 years; they want faster delivery; they want more selection. It’s impossible to imagine a customer coming up to me 10 years from now and saying, “Jeff, I love Amazon; I just wish your prices were higher,” or “I love Amazon; I just wish your delivery were slower.” That’s just not going to happen. So we put our energy into these constants, driving them forward, knowing that the energy we put in today will still be paying off for Amazon’s customers 10 years from now. When you know something is true, even over the long term, you can afford to put a lot of energy into it.

— Amazon founder and CEO Jeff Bezos

Here are some numbers: Coinbase currently has about 50 million registered users, Robinhood has roughly the same number of users, and most large U.S. banks are in the same order of magnitude.

Suppose Coinbase’s strategic focus were to migrate all of its users to DeFi as quickly as possible, and that the regulatory environment supported it. Could Coinbase do that on Ethereum?

This question cannot be answered today. Technically it is not impossible – in fact it is quite feasible – but no person or institution can answer it with confidence. Why?

Because no one really knows how Ethereum will scale. As a simple example, Vitalik has said that optimistic rollups are probably the most desirable scaling solution in the near to medium term, while zk-rollups will dominate in the long term. But when and how will that tipping point happen? Will infrastructure need to be developed or refactored? How will funds flow between these different rollups, and how does that affect smart contract developers, wallets, users, liquidity providers, fiat on-ramps, and so on?

Moreover, it seems that no single scaling solution (e.g., a single Optimism rollup) can serve the whole ecosystem on its own. The end state of Ethereum scaling is heterogeneous.

In the long run, this may be a good thing for Ethereum. Every current scaling solution comes with its own trade-offs, and it is not clear which trade-off is best, or whether a combination of solutions would be better. So the best option for the Ethereum ecosystem is to try multiple scaling solutions, figure out which one suits which application, and then build bridges and other interoperability solutions and address the latency issues.

Moreover, all of the teams developing these scaling solutions are well funded, and their systems are already live and user-facing, so development will continue on all of them.

So, how should Coinbase bring 50 million+ users to DeFi?

With such a large user base, one of the most important considerations is certainty. Any business of that size has a strong need for certainty now and in the future.

Large enterprises can never afford to bet on the wrong technology stack, and the opportunity cost of mistakes and subsequent migrations/bridges is huge.

I think the only blockchain protocol that can respond to this problem now – or in the next two years – is Solana.

All rollup-based scaling solutions (and sharding as well) suffer from the problems described above. Despite billions of dollars raised by strong teams (e.g., Cosmos, Polkadot, Avalanche, etc.) for research and development, virtually none of the sharded systems is live at scale (and most don’t even work yet). Even once they launch as proofs of concept, there are many emerging issues to deal with (e.g., cross-shard transaction failures, exchange integration, etc.).

To be clear, I am not suggesting that sharding and rollups cannot scale. I am actually optimistic about both. However, they don’t really work today, and they create many unavoidable second- and third-order problems. With Ethereum scaling via multiple intertwined components, a large organization that needs scalability certainty will have a hard time getting it in the next two years.

The Social Scalability Costs of Fragmentation
In addition to the uncertainty described above, applications scattered across shards and rollups incur significant new social coordination costs that do not exist in single-shard systems. Here are a few examples.

Currently, block times and computational throughput vary across Layer 1s and across individual Layer 2s. This matters directly to every DeFi protocol that manages risk (which is almost all of the major ones, Uniswap/Sushiswap excepted). Some DeFi teams have committed to deploying contracts on multiple Layer 1s and Layer 2s, but each execution environment requires its own unique risk parameters. This increases the amount of social coordination required of each protocol’s community and slows the industry down.
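As a toy illustration of why each environment needs its own parameters, consider a hypothetical lending protocol whose liquidation buffer depends on block time and oracle latency. All names and numbers below are invented for illustration:

```python
# Hypothetical illustration: a lending protocol deployed across several
# execution environments cannot reuse one set of risk parameters, because
# block times and oracle latencies differ. All figures are invented.

DEPLOYMENTS = {
    # environment name: (block_time_seconds, oracle_update_seconds)
    "ethereum_l1": (13.0, 60.0),
    "optimistic_rollup": (2.0, 15.0),
    "zk_rollup": (1.0, 10.0),
}

def liquidation_buffer(block_time_s: float, oracle_lag_s: float) -> float:
    """Toy rule: the slower blocks and oracle updates arrive, the larger
    the collateral buffer needed to absorb price moves before liquidation."""
    worst_case_delay = block_time_s + oracle_lag_s   # seconds of stale risk
    return round(0.05 + 0.001 * worst_case_delay, 4) # 5% base + delay premium

# Each deployment ends up with its own governance-approved parameter set,
# and each one must be debated and ratified by the protocol's community.
params = {env: liquidation_buffer(bt, ol) for env, (bt, ol) in DEPLOYMENTS.items()}
```

The point is not the specific formula, but that every new execution environment multiplies the parameter sets a community must govern.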

It takes longer to exit optimistic rollups (ORUs). The market generally assumes that market makers will provide liquidity bridges between rollups and Layer 1, but the implementation details are tricky. Should protocol front-ends support this natively? If so, should they contract with a specific market maker (see, for example, Citadel Securities’ PFOF arrangement with Robinhood)? Or should they leave it to the user to configure? And what if a user wants to move from one ORU to another? How does the user signal the application to route through Connext or Thorchain rather than exiting to Layer 1?

For Metamask users (who are mostly advanced users), it might make sense to manage such complex operations themselves. But for wallets that try to abstract away complexity (like Exodus or Argent), how much extra development time will those teams spend solving these problems? How many new features will they be forced to forgo? What happens if a market maker stops providing liquidity in a bridge for some reason? What backup options exist?

Developer tools will have to be updated to handle new data structures (e.g., pending transactions for ORUs, zk proofs for ZKRs). The indexing and query layers will require significant upgrades, and application developers may need to rewrite their subgraphs to handle the new data structures (for example, an EVM subgraph cannot be mapped onto Starkware’s Cairo). Developers will be forced to rewrite large parts of their applications across these heterogeneous scaling environments.

As the number of shards and rollups proliferates, exchange integrations will also become more challenging. None of these problems is intractable, but they will slow down development and overwhelm the many developers who don’t want to deal with them.

Predictable, but boring scalability
Solana currently supports 50,000 transactions per second, and the global network has surpassed 600 nodes. Most importantly, Solana offers a predictable, effectively open-ended scaling path: because it executes transactions in parallel on the GPU, it benefits directly from the enormous gains of GPU parallelism.

Moore’s Law is probably the most important economic force of the last 50 years. But today, in its popular form, it is more illusion than reality.

About 10 to 15 years ago, Moore’s Law stopped applying to single-threaded computation, because heat generation increases superlinearly with clock speed. That’s why fan-cooled devices (desktops and laptops) plateaued at about 3.5-4 GHz, and fanless devices (phones and tablets) at 1.5-2.0 GHz. Although various optimizations have improved single-threaded performance over the past decade, it has not come close to doubling every 18-24 months.

In the last decade, almost all computing gains have come from chip specialization (FPGAs and ASICs) and parallel computing. Modern desktop graphics cards typically have more than 4,000 cores. The number of cores per chip has kept growing at a Moore’s Law pace over the last decade, and this trend will continue, because adding cores, unlike raising clock speed, does not run into the same heat-dissipation wall.
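The contrast can be made concrete with some rough arithmetic. The numbers below are order-of-magnitude assumptions, not benchmarks:

```python
# Rough illustration: single-threaded capacity is flat, while parallel
# capacity compounds with core count. All numbers are assumptions.

clock_ghz = 4.0        # fan-cooled clock speeds have plateaued around here
cores_today = 4_000    # typical core count for a modern desktop GPU
years = 10
doubling_period = 2    # assume core counts double roughly every 2 years

cores_future = cores_today * 2 ** (years // doubling_period)

serial_capacity_gain = 1.0                      # clock speed stays put
parallel_capacity_gain = cores_future / cores_today

print(f"serial gain over {years} years: {serial_capacity_gain:.0f}x")
print(f"parallel gain over {years} years: {parallel_capacity_gain:.0f}x")
```

A system whose throughput scales with core count inherits the right-hand trajectory; a single-threaded one inherits the left.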

Solana is the only blockchain that performs intra-chip parallel computation, via the SeaLevel runtime. SeaLevel executes transactions natively on the GPU. If Nvidia releases a new GPU with 8,000 cores in the next year or two, the computational bandwidth of the Solana network will roughly double.

In the process, developers don’t have to know, or care, or change a single line of code.

This is the definition of predictability: developers write code today knowing that it will keep working, and that executing it will cost less in the future than it does today.

At this point, the main physical limit on scaling computation is heat dissipation. In a very literal sense, Solana’s scalability is bounded only by physics.

Quantifying Decentralization and Weak Subjectivity

Many people assert that Solana is not decentralized enough. Of course, they rarely bother to quantify this claim; they just keep repeating it. So what are the facts? Let’s do some math to quantify how decentralized each network is.

First let’s look at hardware costs.

Bitcoin can run on a $25 Raspberry Pi and requires only a weak Internet connection.

Ethereum runs on a $500 laptop (though given current gas prices this is debatable) and requires a broadband connection.

Solana needs to run on a server costing around $3,500 and requires a gigabit Internet connection.

The next major consideration is state size. At 50,000 TPS with billions of users, Solana’s state will grow quickly. This is manageable. Why? Because 1) Solana is assumed to run on servers with upgradable storage (not laptops that can’t be upgraded), 2) NVMe SSDs scale read and write performance linearly via RAID 0, and 3) multi-terabyte NVMe SSDs cost less than $300 – so the cost and maintenance burden of scaling state storage will be negligible.
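A quick back-of-envelope sketch of the storage argument. The bytes-per-transaction figure is an assumption for illustration; only the 50,000 TPS and sub-$300 multi-terabyte SSD figures come from the text above:

```python
# Back-of-envelope: how fast does state grow at high throughput, and what
# does the storage cost? The state-growth-per-transaction is an assumption.

tps = 50_000                      # throughput cited in this article
state_bytes_per_tx = 32           # assumed net state growth per transaction
seconds_per_day = 86_400

growth_gb_per_day = tps * state_bytes_per_tx * seconds_per_day / 1e9
ssd_cost_per_tb_usd = 300         # multi-terabyte NVMe SSD price from above
storage_cost_per_day = growth_gb_per_day / 1_000 * ssd_cost_per_tb_usd

print(f"~{growth_gb_per_day:.0f} GB of new state per day")
print(f"~${storage_cost_per_day:.0f}/day in commodity NVMe storage")
```

Even at full throughput, the daily storage bill is trivial for a business running a $3,500 server, though it is clearly beyond a laptop or Raspberry Pi.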

If you’ve read this far and followed everything, odds are you’re reading on a MacBook Pro that costs more than $2,000 – the kind of high-end machine favored by the 50 to 100 million developers worldwide. I question whether optimizing for $500 to $1,000 hardware is the right approach. What is so special about the $500 to $1,000 price point?

Let’s consider a reasonable upper bound for hardware requirements. $25,000 is clearly too high, because developers don’t own that kind of hardware. Now flip the question around: instead of starting from an arbitrary hardware cost, start from how many nodes are needed for sufficient censorship resistance. Obviously, ‘sufficient’ is inherently subjective, but if you assume 1 million nodes are needed, the question becomes: “Are there enough $3,500 servers and gigabit network connections in the world to make 1 million nodes plausible?”

Considering the number of gamers, developers, and businesses in the world with high-end hardware, it is hard to answer no.

So hardware costs cannot be considered in isolation, but must be considered in the context of the system’s design goals.

I argued earlier that blockchains should cater to DeFi rather than aim for maximal censorship resistance (i.e., supporting 100 million or even 1 billion nodes). There is simply no need to optimize for $25 or $500 hardware, because the vast majority of people will never run a node. So why constrain the protocol by optimizing for cheap hardware?

The world is weakly subjective
This brings us to weak subjectivity, and to acknowledging that decentralization is not as important as you might think.

The world is weakly subjective. What does that mean? Let’s look at the following example.

Recall the last time you walked into a tall building. Did you first inspect its structure, or question its builders, to make sure it wouldn’t collapse on top of you?

Have you ever done that before flying, driving, or entering your own house?

Everything in this world runs on some level of trust. If everyone had to independently verify the structural integrity of everything they relied on, the world would grind to a halt.

Rather, the world works because everyone knows that many other people have already verified these systems, and that the probability they are safe is therefore very high.

This is the basic premise of weak subjectivity. Applied to node counts, it becomes this key question: without running a node themselves, can users reasonably assume that enough other participants and institutions are running nodes that the system can be trusted?

Solana currently has about 600 nodes, up from only about 100 a year ago. Like other mainstream blockchains, the Solana network will naturally become more decentralized as time passes, more users adopt it, and more value flows through it.

This is where Qureshi is right: decentralization is not as important as you might think. Decentralization is clearly essential for censorship resistance, but there is no agreed threshold in the industry (nor enough counterexamples to establish one), and the exact number doesn’t matter. What really matters is that 1) the risk-versus-decentralization curve is an inverse S-curve, and 2) blockchains become more decentralized over time, so censorship resistance increases with them. As long as a blockchain decentralizes fast enough, users will retain the censorship resistance they need.


Image credit: Haseeb Qureshi, “Why Decentralization Isn’t as Important as You Think”

Conclusion
So the basic question becomes: what is primary, and what is secondary?

In the first decade after Bitcoin’s birth in 2009, it was clear that censorship-resistant, non-sovereign currency was primary and everything else was secondary.

But this is changing. Today’s cryptocurrency users would argue that DeFi is going to completely reshape finance. So I think the order has reversed: today DeFi is primary and non-sovereign currency is secondary.

Both require some degree of censorship resistance, but given current technology constraints, there is a fundamental trade-off between maximizing the usefulness of DeFi and maximizing the censorship resistance of the system.

And when a system’s fundamental design goals change, the technology stack needs to change with them. To scale DeFi to billions of users, every layer of the stack must be reconsidered from first principles.

Acknowledgements: Thanks to Hasu for reviewing a draft of this article.

Appendix: A Brief Comparison of Ethereum and Solana

Both Bitcoin and Ethereum make many assumptions in their respective designs, perhaps most visibly at the network and execution layers.

The Network Layer

At the network layer, Bitcoin and Ethereum use gossip protocols, meaning that each node broadcasts data indiscriminately to every other node. This maximizes censorship resistance at the expense of performance: rebroadcasting data in a highly redundant manner is by definition inefficient, and therefore cannot be optimized for high-throughput DeFi applications.

Solana, on the other hand, invented a new network protocol called Turbine, inspired by BitTorrent. Turbine is heavily optimized for efficiency, and it works as follows. Consider a 1MB block. Instead of transmitting the entire block to its peers, node #1 transmits a 10KB packet (1% of the block) to nodes #2 and #3, which rebroadcast that packet to nodes #4 and #5, and so on. Meanwhile, node #1 broadcasts a different 10KB packet to nodes #6 and #7, which rebroadcast it to nodes #8 and #9, and so on. The advantage of this model is that the absolute bandwidth required of each node remains constant as the number of nodes increases. The only degradation is that latency grows with log(n) – very sublinearly – whereas in most other systems it grows linearly or superlinearly.
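A simplified sketch of the fan-out pattern described above can show why propagation depth grows only logarithmically. Real Turbine adds erasure coding, stake-weighted tree construction, and larger fan-outs, all omitted here:

```python
def rounds_to_reach(num_nodes: int, fanout: int = 2) -> int:
    """Forwarding rounds until a packet reaches every node, assuming each
    node holding the packet forwards it to `fanout` new nodes per round
    (mirroring the nodes #2/#3 -> #4/#5 example above)."""
    reached, rounds = 1, 0
    while reached < num_nodes:
        reached += reached * fanout   # every current holder forwards onward
        rounds += 1
    return rounds

# Propagation depth grows logarithmically with network size: a 100x larger
# network adds only a handful of forwarding hops, while each node's
# outbound bandwidth stays fixed at `fanout` small packets per round.
for n in (600, 6_000, 60_000):
    print(n, rounds_to_reach(n))
```

Contrast this with naive gossip, where every node rebroadcasts full data to every peer and total bandwidth consumed grows quadratically with node count.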

Execution Layer

At the execution layer, the EVM is a single-threaded computer. Since any transaction can modify any part of the global state, supporting parallelism requires some way of ensuring that two transactions do not try to write to the same state at the same time. The EVM simply does not attempt to solve this problem: it executes all transactions serially.

Solana is the only protocol that attempts intra-shard parallelism. How? Solana’s runtime, SeaLevel, requires each transaction header to specify all of the state the transaction will touch. With this information, SeaLevel can determine which transactions might conflict and serialize those; all non-conflicting transactions can run in parallel across thousands of GPU cores.
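The scheduling idea can be sketched as follows. This illustrates the principle (declared state access enabling conflict detection), not SeaLevel’s actual algorithm; all transaction names and accounts are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Tx:
    """A transaction that declares up front which accounts it touches."""
    name: str
    reads: set = field(default_factory=set)
    writes: set = field(default_factory=set)

def conflicts(a: Tx, b: Tx) -> bool:
    # Two transactions conflict if either writes state the other touches.
    return bool(a.writes & (b.reads | b.writes) or b.writes & a.reads)

def schedule(txs: list[Tx]) -> list[list[Tx]]:
    """Greedily pack transactions into batches of mutually non-conflicting
    txs; batches run sequentially, txs within a batch run in parallel."""
    batches: list[list[Tx]] = []
    for tx in txs:
        for batch in batches:
            if not any(conflicts(tx, other) for other in batch):
                batch.append(tx)
                break
        else:
            batches.append([tx])
    return batches

txs = [
    Tx("transfer_1", reads={"alice"}, writes={"alice", "bob"}),
    Tx("transfer_2", reads={"carol"}, writes={"carol", "dan"}),  # disjoint: parallel
    Tx("transfer_3", reads={"bob"},   writes={"bob", "erin"}),   # touches bob: serialized
]
batches = schedule(txs)
```

Because the read/write sets are known before execution, conflict detection is cheap bookkeeping rather than speculative execution and rollback, which is what makes GPU-scale parallelism tractable.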

Disclosure: Multicoin has developed, maintained, and implemented written policies and procedures reasonably designed to identify and effectively manage conflicts of interest related to its investment activities. Multicoin Capital abides by a “No Trading Policy” for the assets listed in this report for three days (“No Trading Period”) after its public release. Multicoin Capital holds positions in SOL and ETH.

Posted by: CoinYuppie. Reprinted with attribution to: https://coinyuppie.com/multicoin-partners-why-predictability-matters-for-blockchain-scalability/
