Messari: Current challenges facing the Ocean Protocol and plans to address them

Key points of this article

  • Ocean Protocol is a set of decentralized data sharing technologies designed to lower barriers to accessing high-quality data. The Ocean ecosystem includes a collection of data marketplaces and data-orchestration smart contracts.
  • Ocean V3 and V4 employ automated market makers (AMMs) and data pools that facilitate transactions between buyers and sellers of data. The protocol is introducing Ocean V4X to address design flaws and product-demand challenges in V3 and V4.
  • The protocol offers new capabilities such as Compute-to-Data, which enables data consumers to train models on data while keeping the underlying dataset private.
  • Current industry norms prevent Ocean from realizing the full benefits of potential network effects. The main challenge for the protocol is therefore to ensure a continuous supply of data and maintain a reliable pipeline of demand. Ocean is addressing these challenges with new mechanisms in its recent V4X update.
  • With an agile team, Ocean Protocol is positioned to gain a larger foothold as industry norms continue to evolve rapidly.

Introduction

Data is an invaluable asset in artificial intelligence (AI), biotechnology, financial technology (fintech), consumer retail, and more. In 2017, The Economist declared that “the world’s most valuable resource is no longer oil, but data” and reported that the five companies with the greatest access to data earned a combined $25 billion in profit in the first quarter of 2017. Data is valuable for many reasons, chief among them its importance in advancing emerging technologies. AI companies need large, high-quality datasets to develop machine learning algorithms such as neural networks. Biotechnology research and development (R&D) teams consume large amounts of data when replicating results to confirm research findings. Consumer retail increasingly relies on vast amounts of consumer data to curate targeted advertising. Web2 giants such as Google and Meta, which have built their businesses around collecting big data, dominate many of the industries that require high-quality data. These data oligarchs can also act as gatekeepers to data-intensive industries and play an outsized role in determining the success of small businesses.

Ocean is a set of decentralized data sharing technologies. The protocol is interoperable and is currently deployed on Ethereum, Polygon, Polkadot, Binance Smart Chain (BSC), Moonriver, and Energy Web Chain. Its open-source code is readily available on GitHub, allowing other developers to fork Ocean and build their own marketplaces. To make high-quality data more accessible, Ocean specifically aims to attract AI startups that need high-quality datasets. The protocol’s mission is outlined in its whitepaper. This report explores Ocean V3 and V4, the current challenges facing the protocol, and Ocean’s plans to address them.

Ocean’s product evolution

The core goal of Ocean Protocol is to provide the infrastructure and services required for decentralized data sharing. First released in 2018, Ocean Protocol allows data providers to share data with consumers. It has since iterated through several versions, adding new features along the way.

Ocean V2 introduced the Compute-to-Data feature, which lets data consumers run models on private or sensitive data they would otherwise be unable to use. In this architecture, private data remains on the data owner’s or vendor’s servers while being used to train models for data consumers, helping data owners comply with the EU General Data Protection Regulation (GDPR).
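The flow below is a minimal, self-contained sketch of the Compute-to-Data idea rather than Ocean’s actual API: the consumer’s algorithm travels to the data, and only derived results travel back. All class and function names are illustrative.

```python
# Minimal sketch of the Compute-to-Data idea (not Ocean's actual API):
# the raw dataset never leaves the provider's environment; only the
# consumer's algorithm travels in, and only aggregate results travel out.

import statistics

class DataProvider:
    def __init__(self, private_dataset):
        # The sensitive records stay on the provider's own infrastructure.
        self._private_dataset = private_dataset

    def run_compute_job(self, algorithm):
        # The consumer's algorithm runs next to the data; only its output
        # (e.g. model weights or statistics) is returned to the consumer.
        return algorithm(self._private_dataset)

def train_simple_model(records):
    # Stand-in for a real training routine: returns aggregates, not raw rows.
    return {"mean": statistics.mean(records), "n": len(records)}

provider = DataProvider(private_dataset=[4.2, 3.9, 5.1, 4.7])
result = provider.run_compute_job(train_simple_model)
print(result)  # the consumer sees derived results, never the raw dataset
```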

Ocean V3 introduced “datatokens,” ERC-20 tokens (or their equivalents on other blockchains) that facilitate transactions between data providers and consumers. It also introduced Ocean Market, an open-source data marketplace. Ocean Market is structured around an automated market maker (AMM) that enables price discovery for data. By pairing OCEAN (the protocol’s native token) with datatokens, users can purchase access to datasets as needed. Ocean V3 also introduced the Data Farming program, which lets OCEAN stakers signal the quality of datasets. The program’s reward function calculates staking rewards based on dataset popularity/perceived quality, community feedback, and other variables.
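As a rough illustration of how such a pool discovers prices, the sketch below uses a plain constant-product rule. Ocean V3’s pools were Balancer-style weighted pools, so the real math differs, and every figure here is made up.

```python
# Simplified illustration of AMM price discovery for an OCEAN/datatoken pool.
# Uses the plain constant-product rule (x * y = k) purely to show how a
# purchase moves the price; Ocean's actual pools used weighted-pool math.

def buy_datatokens(ocean_reserve, datatoken_reserve, ocean_in, fee=0.001):
    """Return (datatokens_out, new_ocean_reserve, new_datatoken_reserve)."""
    k = ocean_reserve * datatoken_reserve          # invariant before the trade
    ocean_in_after_fee = ocean_in * (1 - fee)      # swap fee stays in the pool
    new_ocean_reserve = ocean_reserve + ocean_in_after_fee
    new_datatoken_reserve = k / new_ocean_reserve  # keep x * y = k
    datatokens_out = datatoken_reserve - new_datatoken_reserve
    return datatokens_out, new_ocean_reserve, new_datatoken_reserve

# Example: a pool seeded with 10,000 OCEAN and 1,000 datatokens.
out, ocean_r, dt_r = buy_datatokens(10_000, 1_000, ocean_in=500)
print(f"received {out:.2f} datatokens; spot price rises to "
      f"{ocean_r / dt_r:.2f} OCEAN per datatoken")
```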

Ocean V4 added features designed to increase demand for the platform. Launched in Q2 2022, Ocean V4 introduced the Data NFT, a tradable ERC-721 non-fungible token that represents the owner’s data as on-chain intellectual property (though without legal contract enforcement). The protocol also added protections against malicious data publishers who exit their datatoken positions and leave stakers in the data pool exposed. The V4 AMM price-discovery model was adjusted so that data publishers do not receive an initial allocation of datatokens, and the pricing algorithm can independently mint and burn datatokens in response to supply and demand. The Ocean marketplace also gained new fees, including a consume-market consumption fee, which the marketplace serving the consumer can set at its own rate and in its preferred currency, and a publisher-market consumption fee collected by the marketplace where the dataset was published. Initial traction was impressive: V4 grew from 20 to 38 data pools in a single week in late June and early July.
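The sketch below illustrates how such layered fees might be deducted from a single dataset purchase. The fee names mirror those described above, but the percentages and the split logic are placeholders rather than Ocean’s published parameters.

```python
# Hypothetical illustration of layered marketplace fees on a dataset purchase.
# The fee names mirror the ones described above; the percentages are
# placeholders, not Ocean's published rates.

def split_payment(price_ocean, consume_market_fee=0.01,
                  publisher_market_fee=0.01, community_fee=0.001):
    fees = {
        "consume_market": price_ocean * consume_market_fee,
        "publisher_market": price_ocean * publisher_market_fee,
        "ocean_community": price_ocean * community_fee,
    }
    to_publisher = price_ocean - sum(fees.values())
    return to_publisher, fees

to_publisher, fees = split_payment(100.0)
print(f"publisher receives {to_publisher:.2f} OCEAN; fees: {fees}")
```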


OCEAN Token Economics

OCEAN aims to accrue value through growth in network engagement. As users interact with Ocean’s tools and marketplaces, the fee income earned by the protocol flows to OceanDAO and is used to burn OCEAN tokens. Burning OCEAN reduces the outstanding token supply, so as network-driven demand increases, upward pressure on the price of OCEAN increases as well. The OCEAN token distribution is shown in the figure below.

[Figure: OCEAN token distribution]
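As a toy illustration of the deflationary pressure the burn mechanism described above creates, the arithmetic below uses made-up figures rather than actual protocol data.

```python
# Toy calculation of the burn mechanism described above; every number here
# is a placeholder, not an actual protocol figure.

circulating_supply = 613_000_000      # hypothetical OCEAN in circulation
quarterly_fee_revenue = 250_000       # hypothetical fees routed to burns (OCEAN)

supply_after_burn = circulating_supply - quarterly_fee_revenue
burn_rate = quarterly_fee_revenue / circulating_supply
print(f"supply shrinks to {supply_after_burn:,} OCEAN "
      f"({burn_rate:.4%} burned this quarter)")
```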

Ocean’s Challenges

Traction stalls among data providers

The protocol has struggled to attract enough data providers. When Ocean initially launched datatokens, hype around the product and the issuance of network rewards generated strong initial traction. Using the number of datatoken mint transactions and datatoken creations as a measure of traction, demand from data providers peaked in November 2020 (413 datatoken mint transactions and 340 datatokens created that month). Since December 2020, Ocean has averaged only 10-15 mint transactions and roughly 10 new datatokens per month.


Traction stalls among data consumers

Ocean has been working to improve its product-market fit for data consumers. Using the number of datatoken transfers as an indicator of data consumption demand, traction among consumers peaked in November 2020, when datatoken transfers (excluding mint transactions) exceeded 3,000. Transfers varied between 20 and 200 per month for most of 2021 before dropping to around 10-20 per month in 2022.


Unstable network effects

Realizing network effects is critical to Ocean’s success. Data providers see little value in publishing data on a platform with few data consumers, and data consumers won’t bother browsing a platform with a limited supply of data. As a result, Ocean faces what a16z partner Andrew Chen described as the “cold start problem.” Without ongoing participation, the incentives for participants become less compelling, and the protocol’s network-effect challenges may persist.

Data quality and accountability

Some data consumers, such as professional AI researchers, may see the decentralized nature of the protocol as a liability. While the Data Farming program incentivizes high-quality datasets, it does not actively discourage low-quality ones. Ocean identifies data vendors only by wallet address, which limits the accountability of vendors who publish low-quality datasets. To achieve its stated goal of serving AI companies, Ocean could consider better tailoring its products to meet common standards for enterprise data.

Vulnerability in Ocean V4 in July 2022

In its first month after launch, Ocean V4 faced a well-orchestrated attack carried out through a series of transactions exploiting the AMM data pools. The exploit involved staking OCEAN in a data pool, buying datatokens, removing the staked OCEAN, and then selling the datatokens back, slowly draining OCEAN from the pool. The attack was made possible by the AMM mechanism, which allows data publishers and stakers to withdraw their liquidity from the data pool at will before selling their datatokens back into it.

The Ocean team immediately responded by raising the swap fee (the Ocean community fee) to 15% so that attackers would lose money on such transactions. The team also quickly alerted community members, advising them to withdraw liquidity from Ocean’s data pools, and advised data publishers to set fixed prices for their datasets. Setting a fixed price bypasses the dynamic pricing mechanism inherent in AMM-based data pools.

Ultimately, Ocean’s team set the swap fee to 100% to fully defuse the attack. Recognizing the need for a long-term solution to protect its community, Ocean Market is migrating from an AMM-based price-discovery model to a fixed-price data asset model, Ocean V4X. Markets built on Ocean can still opt for dynamic pricing, but at their own risk.
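The sketch below shows, in simplified form, why raising the swap fee blunts this class of attack: in a constant-product pool, a buy followed immediately by a sell returns less OCEAN than was put in, and the loss grows quickly with the fee. Ocean’s pools were Balancer-style and the real exploit also abused one-sided liquidity withdrawals, so this is an illustration of the fee defense, not a reproduction of the attack; all figures are illustrative.

```python
# Simplified round-trip illustration of how the swap fee erodes attack profits.
# Uses a plain constant-product pool; Ocean's actual pools used weighted math.

def swap(reserve_in, reserve_out, amount_in, fee):
    """Constant-product swap: returns (amount_out, new_reserve_in, new_reserve_out)."""
    amount_in_after_fee = amount_in * (1 - fee)
    new_reserve_in = reserve_in + amount_in_after_fee
    new_reserve_out = reserve_in * reserve_out / new_reserve_in
    return reserve_out - new_reserve_out, new_reserve_in, new_reserve_out

def round_trip(ocean_spent, fee):
    ocean_r, dt_r = 10_000.0, 1_000.0                        # illustrative pool
    dt_bought, ocean_r, dt_r = swap(ocean_r, dt_r, ocean_spent, fee)   # buy
    ocean_back, dt_r, ocean_r = swap(dt_r, ocean_r, dt_bought, fee)    # sell back
    return ocean_back - ocean_spent

for fee in (0.001, 0.15, 1.0):
    print(f"fee {fee:>5.1%}: round-trip P&L on 500 OCEAN = "
          f"{round_trip(500, fee):+.2f} OCEAN")
```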

Ocean’s Roadmap: Ocean V4X and veOCEAN

To address the challenges surrounding the Ocean data pool AMM architecture, Ocean is introducing a new data curation mechanism based on a vote-escrowed (ve) token model. The veOCEAN design not only changes the dynamics of data curation but also aims to improve incentive alignment among data publishers, consumers, and OCEAN holders.

Ocean’s veOCEAN design is a fork of the veCRV model. This fixed-price data asset model relies on staking smart contracts to facilitate dataset curation while stakers continue to receive Data Farming rewards. In the previous Data Farming model, stakers were rewarded based on their proportional share of the AMM liquidity pool. The veOCEAN model still lets stakers earn according to their proportional share, but their OCEAN is locked in the Ocean vault and cannot be withdrawn until the chosen lock period ends. Stakers who actively participate in curating dataset quality receive weekly Data Farming rewards, and the longer they lock their funds, the higher their rewards. All veOCEAN holders also receive a share of the community fees paid to OceanDAO.
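The weighting below is a minimal sketch of the vote-escrow formula that the veCRV model popularized, in which a lock’s weight scales with its remaining duration and decays toward zero at unlock. Ocean’s exact parameters (such as the maximum lock length) may differ; the four-year cap here is veCRV’s.

```python
# Minimal sketch of vote-escrow (ve) weighting in the veCRV style: locked
# balance is weighted by remaining lock time, decaying linearly to unlock.
# Ocean's exact parameters may differ; 4 years is veCRV's maximum lock.

MAX_LOCK_SECONDS = 4 * 365 * 24 * 3600  # veCRV-style maximum lock (4 years)

def ve_balance(amount_locked, seconds_remaining):
    """Voting/reward weight for a lock with `seconds_remaining` until unlock."""
    seconds_remaining = min(seconds_remaining, MAX_LOCK_SECONDS)
    return amount_locked * seconds_remaining / MAX_LOCK_SECONDS

one_year = 365 * 24 * 3600
print(ve_balance(1_000, 4 * one_year))  # 1000.0 -> max-length lock, full weight
print(ve_balance(1_000, 1 * one_year))  # 250.0  -> shorter lock, quarter weight
```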

Ocean Data Bounty Program

Ocean’s Data Bounty Program encourages Ocean’s data consumers (40% of whom are data scientists) to leverage Ocean’s data assets: brainstorm, propose new dataset use cases, compile datasets, analyze data for insights, train data models, and more. The program operates in three phases: Ideation, Insights, and Problem Solving:

  • Ideation stage: 10-20 winning ideas are selected from 100-200 proposals, and each winner receives an OCEAN award worth $5,000-10,000;
  • Insights stage: 5-20 data patterns, reports, and other notable insights receive OCEAN rewards worth $10,000-20,000;
  • Problem Solving stage: proposals that address Ocean’s business and product challenges are awarded $10,000-25,000.

The initiative is expected to attract data consumers outside the Web3 community, from academic data scientists to non-crypto industry researchers. The bounty program could create new demand in market segments Ocean has been trying to penetrate, such as AI or biotech startups.

Conclusion

Ocean’s mission is to democratize access to high-quality data, but its products, in their current form, have struggled to find satisfactory product-market fit. The protocol has developed many new capabilities (e.g., Compute-to-Data) that could prove very useful for industrial and academic applications in the future. Despite these novel features, many industries may still hesitate to trust datasets on the platform because their quality cannot be guaranteed. Companies will always choose the most cost-effective, efficient, and trusted data solutions, and Ocean has work to do on all three fronts. Ocean is attempting to solve a real problem, but the protocol needs to keep innovating to generate the necessary network effects and find its product-market fit.
