Understanding the technology direction behind the Web 3.0 concept

While Web 3.0 is still in its infancy, many believe it is the next evolution of the Internet and will be the framework that defines the next software era.

In this article, we will examine Web 3.0 as it relates to decentralized peer-to-peer (p2p) file sharing. Decentralized architectures are relatively new and still under development. The same words may mean different things to different groups, so I will begin by outlining some general concepts and definitions so that we can agree on semantics.

Web 2.0 is the current iteration of the Internet – it is the framework used by the vast majority of today’s Web-based applications and services, and it is what most users think of as the “Internet” or the “Web”.

The term Web 2.0 was popularized by Tim O’Reilly in the wake of the dot-com era, and loosely describes the innovations that followed Web 1.0 in mobile, social, and cloud computing. The idea is subtler than that, but it can be summed up by two main concepts – “services, not packaged software” and “software above the level of a single device”.

While Web 2.0 still supports a great deal of innovation, especially in the enterprise, successful projects are already emerging from what is potentially the next Internet paradigm, known today as Web 3.0.

Like Web 2.0, the definition of Web 3.0 is fuzzy – it is still evolving and is often defined differently by different people. For the purposes of this article, I will simply define Web 3.0 as a shift to decentralized architectures that rely on peer-to-peer networks, rather than client-server architectures that rely on centralized infrastructure. We will discuss the details of p2p networks further on, but let us first discuss why decentralization is valuable.

Decentralized networks typically have higher performance. Consider a file-sharing use case where user A wants to send a video file to user B. In a centralized client-server network, user A uploads the video file to the server, and then user B downloads the file from the server (that is, the server uploads it to user B).

Performance is limited by the upload speed of user A, the download speed of user B, the upload and download speeds of the server, and the distance between each user and the server. In a pure peer-to-peer system, files are transferred directly from one peer to another, limited only by the upload speed of user A, the download speed of user B, and the distance between the two users.

Therefore, if both users are close to the server and the server’s upload/download capacity is higher than that of user A and user B, performance may be equivalent to that of a p2p network.

However, as the distance between user and server increases and/or the performance of the server decreases (e.g., due to high demand), the client-server network will perform worse than the p2p network.
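To make the comparison concrete, here is a minimal back-of-the-envelope model in Python. All link speeds and the file size are invented for illustration, and the client-server path is modeled as two sequential hops (real systems can pipeline the relay):

```python
# Back-of-the-envelope transfer-time model (all numbers are assumptions).
# Client-server: the file makes two hops (A -> server, then server -> B),
# modeled sequentially here. Pure p2p: the file makes one hop (A -> B).

FILE_SIZE_MB = 1000  # a 1 GB video file (assumed)

def hop_time_s(size_mb, sender_up_mbps, receiver_down_mbps):
    """Seconds for one hop, limited by the slower endpoint."""
    bottleneck_mbps = min(sender_up_mbps, receiver_down_mbps)
    return size_mb * 8 / bottleneck_mbps

# Assumed link speeds in Mbit/s
a_up, b_down = 20, 100
server_up, server_down = 50, 50

client_server = (hop_time_s(FILE_SIZE_MB, a_up, server_down)
                 + hop_time_s(FILE_SIZE_MB, server_up, b_down))
p2p = hop_time_s(FILE_SIZE_MB, a_up, b_down)

print(f"client-server: {client_server:.0f} s, p2p: {p2p:.0f} s")
# With these numbers: 400 + 160 = 560 s via the server vs. 400 s p2p.
```

With a fast, nearby server the two times converge, but any server slowdown or extra distance only hurts the client-server path.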

Decentralized networks are anti-fragile – they become more powerful as additional “pressure” is added. As more users (peers) are added to the network, network performance and availability actually improve. In Web 2.0 social applications, this is reflected in the “network effect”: the value of the network/product increases as the size of the network increases.

Unfortunately, this is not the case with the underlying infrastructure. As applications based on the client-server model become more popular, application providers must increase server capacity/centralized computing to maintain performance and availability.
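A rough sketch of why the two architectures scale differently, again with invented numbers: a fixed server’s capacity is divided among its users, while each new peer adds upload capacity to the swarm.

```python
# Illustrative scaling comparison (assumed numbers, not benchmarks).
# Client-server: serving capacity is fixed until the operator buys more.
# P2p: every new peer brings its own upload capacity to the swarm.

SERVER_UP_MBPS = 10_000   # fixed server capacity (assumption)
PEER_UP_MBPS = 20         # average peer upload speed (assumption)

for peers in (100, 1_000, 10_000):
    per_user_share = SERVER_UP_MBPS / peers    # client-server, per user
    swarm_capacity = peers * PEER_UP_MBPS      # aggregate p2p upload
    print(f"{peers:>6} peers | server share/user: {per_user_share:7.1f} Mbps"
          f" | swarm total: {swarm_capacity:>7} Mbps")

# The per-user share of a fixed server shrinks as users are added,
# while the swarm's aggregate capacity grows linearly with its size.
```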


A decentralized network is trustless – participants do not need to know or trust each other, or any third party, for the system to work. Because of the client-server relationship, Web 2.0 applications rely on a centralized server/service, which means that users must inherently trust a central authority (the person or entity that owns and operates the server).

This central authority has the ability to unilaterally define the rules. In practice, this usually manifests itself as conflicts around data ownership. For example, the popular storage and file-sharing service Dropbox uses a client-server model: users upload data to centralized servers owned and operated by Dropbox, and then download data from these servers.

As a result, all data passes through Dropbox, and Dropbox has the technical ability to read and replicate this data. Therefore, the only thing that protects Dropbox user data is the user agreement that users trust Dropbox to adhere to (and not change). In a decentralized p2p network, there is no central authority. Users are the owners and operators of their data, and trust lies in the software itself (not in an operator).

A decentralized network is more secure. The name says it all: peer-to-peer. Data is transferred directly from one peer to another, without a middleman (central server).

This means that there is no central authority or custodian for your data (unless you choose to use one). In addition, when there is a central authority, there is a greater incentive to attack it, because the data/value is consolidated (i.e., the attacker gains a large amount of value from a single success).

Conversely, when the data/value is highly distributed, the attacker must succeed many more times to capture the same amount of value. Decentralization reduces the attacker’s motivation and reward.
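A toy calculation makes the economics concrete. The user count, per-user data value, and attack cost below are arbitrary assumptions:

```python
# Toy model of attacker economics (every number here is an assumption).
# Centralized: one successful breach exposes all users' data at once.
# Decentralized: each breach exposes only a single peer's data.

users = 1_000_000
value_per_user = 10        # value of one user's data, arbitrary units
cost_per_breach = 50_000   # assumed cost of one successful attack

centralized_payoff = users * value_per_user - cost_per_breach
# Capturing the same value peer-by-peer requires `users` separate
# breaches, each of which must pay for itself:
decentralized_payoff = users * (value_per_user - cost_per_breach)

print(centralized_payoff)    # 9,950,000: one breach is hugely profitable
print(decentralized_payoff)  # deeply negative: per-peer attacks don't pay
```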


Now that we have some context, let’s discuss decentralized p2p file sharing specifically. At this point, you may be thinking, “Decentralization sounds great, but what’s new about it? Haven’t services like p2p file sharing been around for over a decade?”

To some extent, you are correct, but there are some fundamental differences between the p2p services of Web 2.0 and the p2p services we can build on Web 3.0 today. Let’s take the example of BitTorrent, which was originally released in 2001. BitTorrent had three major problems (all of which can now be solved): it was not fully decentralized, it lacked an incentive model, and its software was not reusable or extensible.

BitTorrent (and other similar services) is not decentralized in two important ways: it uses centralized servers to keep track of peers and to store content metadata. While the file transfers themselves are peer-to-peer (and performance improves as the number of “seeders” grows), instantiating these connections requires a special server called a tracker, which helps peers find and communicate with each other.

The tracker also keeps track of which copies of a file reside on which peer machines and which copies are available when a client makes a request, and it helps coordinate the efficient transfer and reassembly of the file. Using a tracker increases the cost to the service provider and limits the privacy and security of users.
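For the curious, here is a sketch of what a classic tracker announce looks like over HTTP, following the original BitTorrent tracker convention. The tracker URL and torrent hash are placeholders, and real clients also parse the bencoded response, handle compact peer lists, and retry:

```python
# Sketch of a classic (Web 2.0-era) BitTorrent tracker announce over HTTP.
# The tracker URL is a placeholder and the info_hash is a dummy value.
import hashlib
import requests

TRACKER_URL = "http://tracker.example.com/announce"  # hypothetical tracker

params = {
    "info_hash": hashlib.sha1(b"dummy torrent metadata").digest(),  # 20 bytes
    "peer_id": b"-XX0001-123456789012",   # 20-byte client identifier
    "port": 6881,        # port this peer listens on
    "uploaded": 0,
    "downloaded": 0,
    "left": 1_000_000,   # bytes still needed, i.e. we are a leecher
    "event": "started",
}

# The tracker replies with a re-announce interval and a list of peers --
# exactly the central directory this article says Web 3.0 designs remove.
resp = requests.get(TRACKER_URL, params=params, timeout=10)
print(resp.content[:200])
```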


BitTorrent also lacks a proper incentive model. Users who are downloading a file are called “leechers”; once they hold a piece of the file (or have stored files that other users want), they can become “seeders” and upload to other users in turn.

BitTorrent gives download priority to seeders (and users who seed at faster upload speeds get further priority). However, beyond that reciprocity, there is no real incentive to seed or store files. Since BitTorrent relies on seeders for file availability, the lack of an incentive model leaves it vulnerable.
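BitTorrent’s built-in reciprocity is essentially a “tit-for-tat” rule: serve the peers who upload to you fastest. A minimal sketch (with made-up rates; real clients also rotate in an “optimistic unchoke”) shows both the mechanism and its limit:

```python
# Minimal sketch of BitTorrent-style "tit-for-tat" prioritization:
# a peer grants its scarce upload slots to whoever uploads to *it* fastest.

peer_upload_rates = {   # upload rate we observe from each peer (KB/s)
    "peer_a": 850,
    "peer_b": 40,
    "peer_c": 510,
    "peer_d": 0,        # a free-rider who never uploads
}

UPLOAD_SLOTS = 2        # how many peers we serve at once (assumption)

# Fastest uploaders get unchoked (served); everyone else waits.
unchoked = sorted(peer_upload_rates, key=peer_upload_rates.get,
                  reverse=True)[:UPLOAD_SLOTS]
print(unchoked)         # ['peer_a', 'peer_c']

# The limit of the scheme: once a user has the whole file, there is
# nothing left to trade, so continued seeding earns no further reward.
```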

BitTorrent (and similar services) also suffers from typical software reusability problems, which make it difficult to build on previous work. Some examples include: poor documentation (or none at all), restrictive licensing (or no license), no way to contact maintainers, closed source code (or source code that no longer exists), no public-friendly API, and tight coupling to specific use cases.

Ultimately, this means the service is severely limited in its ability to adapt to changing user needs, and in many cases it cannot handle new use cases at all. To be fair, BitTorrent has made some improvements (for example, distributed hash tables have reduced its reliance on trackers), but it is essentially the same service it was 20 years ago.

Now let’s return to file sharing as an example and examine what Web 3.0 offers. Today, you can build completely decentralized p2p applications. Protocol Labs, which developed Libp2p (for decentralized transport) and IPFS (for decentralized storage), is a leader in this area.

Specifically, Libp2p’s transport protocols (via circuit relay and NAT traversal) solve the problem of connecting two peers without the need for a special tracking server. In addition, IPFS (a decentralized storage service built on content addressing) allows for fully decentralized storage and does not rely on content servers or on a very large number of active/available peers for file availability.
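The core idea behind content addressing can be shown in a few lines. This is a simplification: real IPFS chunks files into a Merkle DAG and encodes CIDs with multihash, but the principle is that the address is derived from the content itself:

```python
# Simplified illustration of content addressing, the idea behind IPFS CIDs.
import hashlib

def content_address(data: bytes) -> str:
    """Return a hash-based identifier for `data` (a stand-in for a CID)."""
    return hashlib.sha256(data).hexdigest()

file_v1 = b"hello web3"
file_v2 = b"hello web3!"

print(content_address(file_v1))  # the same bytes hash to the same address
print(content_address(file_v2))  # change one byte -> a brand-new address

# Because the address is the content's fingerprint, any peer holding the
# bytes can serve them and the requester can verify integrity -- no
# trusted content server is required.
```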


The Web 3.0 concept of crypto/tokens solves the incentive problem. Specifically, Filecoin is a decentralized storage network that turns cloud storage into an algorithmic marketplace.

The marketplace runs on a blockchain with a native protocol token (FIL), which miners earn by providing storage to clients. In turn, clients spend FIL to hire miners to store or distribute data. Applied to a service like BitTorrent, this would mean that seeders earn FIL by storing files and offering them for download, while leechers spend FIL to access those files.
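As a rough illustration of the “algorithmic marketplace” idea, here is a toy order-book model. All miners, prices, sizes, and durations are invented; real Filecoin deals are negotiated on-chain in FIL and enforced by proofs of storage:

```python
# Toy order-book model in the spirit of Filecoin's storage market.
# Every value below is hypothetical.

asks = [  # miners offering storage (hypothetical)
    {"miner": "m1", "price_fil_per_gib_epoch": 0.00004},
    {"miner": "m2", "price_fil_per_gib_epoch": 0.00002},
    {"miner": "m3", "price_fil_per_gib_epoch": 0.00007},
]

file_gib = 2.0
epochs = 100_000   # how long the client wants the file stored

# The client takes the cheapest ask, as in any order-book market.
best = min(asks, key=lambda a: a["price_fil_per_gib_epoch"])
cost_fil = best["price_fil_per_gib_epoch"] * file_gib * epochs

print(f"deal with {best['miner']}: {cost_fil:.2f} FIL "
      f"for {file_gib} GiB over {epochs} epochs")
# Miners earn FIL for storing files; clients spend FIL to store them --
# the direct seeding/storage incentive BitTorrent never had.
```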

This incentive structure would increase the motivation to seed and store files, improving availability and performance for users. The monetary incentive also creates a way to instantiate a legitimate business model: if storing and serving files can earn money, then the legitimate owners of the files (e.g., media companies) may be willing to participate in the network. As Spotify demonstrated, users don’t actually care about the “free” aspect of seeded files; they care about ease of discovery and distribution.

Most importantly, entities like Protocol Labs are addressing the software reusability problem. They are the creators of Libp2p, IPFS, and Filecoin, and all three projects are open source, have up-to-date documentation, and are supported by a team you can actually contact.

In addition, protocols like Libp2p are highly modular and built for general-purpose use. This means that products built with Libp2p are highly extensible and can change quickly to meet evolving user needs and new use cases.

Understanding the technology direction behind the Web 3.0 concept

In summary, decentralized p2p applications offer many native advantages (especially in terms of performance and security), and the tools and infrastructure to build robust p2p applications are now available.
