dappOS: Understanding The Intent-Centric Execution Network

By Lawrence Lee, Researcher at Mint Ventures
Introduction
The crypto economy has grown significantly, leading to sophisticated on-chain infrastructure. Despite these advancements, the user experience still lags behind. Improving it is crucial, as a better experience could attract more participants into the blockchain ecosystem, which would in turn stimulate further infrastructure development and diversify business models, creating a dynamic where progress in one area spurs growth in another, a phenomenon we might describe as a “ladder-like stepping effect.” The “1995 moment” of crypto may depend on the emergence of a user-oriented killer application or operating system.
Intent-based networks are working toward a major paradigm shift: moving from a system that assumes every user is an expert to one that assumes every user is a beginner. This involves simplifying complex operations and presenting a streamlined, secure user interface. Integrating AI could further translate user intentions into actions seamlessly.
This article features dappOS, an intent execution network supported by leading institutions.
Intent: From Usable to User-Friendly—Enhancing the On-Chain Experience
Boosting the Experience for Crypto Users
The “DeFi Summer” of 2020 ignited large-scale commercial activities on blockchain platforms. Since then, on-chain commercial activity has flourished. Beyond financial services, vibrant developments in NFTs, gaming, and social platforms have brought new dimensions to blockchain engagement. Currently, over $90 billion in assets (excluding NFTs) remain active on major blockchains, with peak values nearing $200 billion during the last bull market. Recent daily trading volume has surged past $5 billion, with spikes reaching $10 billion in March—nearly half the trading volume of the Hong Kong Stock Exchange.

Source: Defillama

Market participants effectively vote with their wallets, and the blockchain, which offers “greater freedom and more affordable trust”  (as discussed in “How to Invest in Web3 with its Fundamental Value“), is continuously proving its potential as a robust infrastructure for a broad spectrum of commercial activities.
Despite its impressive expansion, the blockchain space is still in an early phase, with less than seven years of formation and less than four of genuine business operations. Signs of its immaturity are evident: Solana, a leading blockchain, experienced a 30-hour downtime in February this year, and mainstream blockchain wallets still require users to keep backups of lengthy seed phrases or private keys. Additionally, on-chain applications with sustainable business models still focus primarily on crypto-native assets; the integration of purely off-chain businesses with blockchain remains largely unexplored.
In recent years, smart contract blockchains like Ethereum and Solana have seen significant growth in market value and expanded their capabilities to support a wide range of business activities. On the question of “availability” (whether the chains are usable at scale), there is now a clear strategy and an achievable roadmap: Ethereum has focused on scalability and efficiency, embracing rollups and evolving toward a modular framework, with Layer2 solutions like Arbitrum and Base thriving and a proliferation of dedicated Layer2 and Layer3 appchains. Solana, conversely, has pursued a different path, maximizing performance on a single chain, with an average TPS (transactions per second) exceeding 2,000 and a steady influx of new users and assets.
However, the “usability” of blockchains (how user-friendly they are) remains a significant challenge. The current on-chain experience, although adequate for the few million daily active addresses (with actual active users even fewer), cannot yet support the onboarding of hundreds of millions or even billions of people. To achieve “massive adoption” of blockchain, a transformative improvement in user experience is imperative.
In July 2023, Paradigm introduced the “intent-centric” concept, which aims to enhance the Web3 user experience through what are now known as intent networks.
In simple terms, “intent” refers to a user’s specific need or goal. For example, if someone wants to “buy $1000 of $BRETT (a memecoin),” that statement expresses their intent. Realizing it typically requires multiple transactions, and any additional constraints only increase the number of transactions needed to fulfill a single intent.
Consider a scenario where a user wants to execute the intent of buying $BRETT but lacks sufficient stablecoins on the Base chain, with their funds held primarily on the Ethereum network. Fulfilling this intent would involve the following steps:
1. Convert 1,005 USDC into ETH on the Ethereum mainnet, because USDC cannot be used to pay gas on Base.
2. Bridge the $ETH from Ethereum to Base.
3. Swap the $ETH for $BRETT.
Executing these three steps demands a substantial amount of underlying knowledge. First, the user must identify the most efficient and cost-effective bridge between Ethereum and Base. Second, they need to find the Base chain’s RPC details and learn how to add the Base network to their wallet. They also need to know whether there is an effective trading aggregator on Base to secure the best pricing and, if not, which decentralized exchange holds the main liquidity for $BRETT. For veterans accustomed to on-chain operations, these tasks may seem straightforward. For beginners, however, the process can be quite challenging, often requiring detailed tutorials to work through each step methodically.
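To make the friction concrete, the sketch below expresses the same three transactions in TypeScript-style pseudocode. The helper functions, the amounts, and the 0.3 ETH figure are illustrative placeholders standing in for real DEX and bridge SDK calls, not any specific protocol’s API.

```typescript
// Illustrative only: each step is a separate on-chain transaction the user must
// understand, sign, and pay gas for. The helpers below are placeholders for the
// DEX router, bridge, and Base-side aggregator calls a real wallet would make.

type Chain = "ethereum" | "base";

interface Tx {
  chain: Chain;
  description: string;
}

// Placeholder helpers: in reality each one hides its own approvals, slippage
// settings, RPC configuration, and gas estimation.
async function swap(chain: Chain, sell: string, buy: string, amount: number): Promise<Tx> {
  return { chain, description: `swap ${amount} ${sell} -> ${buy}` };
}

async function bridge(from: Chain, to: Chain, asset: string, amount: number): Promise<Tx> {
  return { chain: from, description: `bridge ${amount} ${asset} ${from} -> ${to}` };
}

async function buyBrettManually(): Promise<Tx[]> {
  // 1. Mainnet: convert ~1,005 USDC into ETH (USDC cannot pay gas on Base).
  const step1 = await swap("ethereum", "USDC", "ETH", 1005);
  // 2. Bridge the ETH from Ethereum to Base (the user must pick a bridge and
  //    add the Base RPC to their wallet first). 0.3 ETH is an assumed output.
  const step2 = await bridge("ethereum", "base", "ETH", 0.3);
  // 3. Base: swap the bridged ETH into $BRETT on whichever DEX has liquidity.
  const step3 = await swap("base", "ETH", "BRETT", 0.3);
  return [step1, step2, step3];
}

buyBrettManually().then((txs) => txs.forEach((t) => console.log(t.chain, t.description)));
```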
In many ways, today’s on-chain experience mirrors how people interacted with computers before Windows 95 was introduced. Back then, computers were powerful tools capable of complex computations and sophisticated file management, yet they were operated primarily through a command-line prompt. This interface was highly efficient for those well versed in computer logic and experienced in command-line operations, a preference that persists among many technology enthusiasts to this day. For most newcomers, however, this mode of interaction was overwhelmingly complex, often described as a “nightmare.” To navigate these systems, users were typically handed a hefty manual, essential for mastering the operations required to use their computers effectively.
However, the situation changed completely with the release of the Windows 95 operating system in 1995. Upon startup, computers booted directly into a graphical user interface (GUI), where users could execute tasks with simple mouse clicks. The emergence of browsers further reduced the complexity of accessing the internet. To some, the shift from command line to GUI seemed like a minor enhancement: the graphical interface was merely user-friendly packaging around the same underlying operations. Yet unlike hardware advancements such as faster processors, which improved raw performance, Windows 95 made technology truly accessible to the masses. It lowered the barriers for average users, driving the widespread adoption of personal computers. Increased PC sales led to reductions in the cost of Intel CPUs, which in turn enhanced both the performance and the experience of PCs. This development triggered a “stepping effect,” where progress in one area spurred advancements in another, boosting global computer and internet penetration and setting the stage for the Internet Boom.

Internet Penetration Rate

Looking back, it’s clear that what seemed like a modest improvement fundamentally transformed everyday life for individuals and propelled companies like Microsoft, Apple, Google, and NVIDIA to greatness over the last thirty years. This pivotal change is often celebrated as the “1995 moment” in the evolution of personal computers and the internet, a phrase that marks the beginning of a technological explosion.
Despite the smooth approval of the Bitcoin ETF, which brought a degree of regulatory recognition to the cryptocurrency industry and attracted many new users, most people still only hold their cryptocurrencies on centralized exchanges, using blockchain technology merely for basic transfers. Judging by the adoption of on-chain applications, the “1995 moment” for cryptocurrency has yet to arrive; it may have to wait for a game-changing application or operating system tailored for end users. Projects built around the intent concept stand a good chance of bringing about this pivotal moment.
Intent: The Best AI+Crypto Use Case
AI has advanced swiftly in recent years, and many believe 2023 marked the “1995 moment” for AI: ChatGPT and other chatbots powered by large language models have started to become part of everyday life, and both capital markets’ and the general public’s attention to AI has reached an all-time high. The improvement of large language models is still far from hitting a bottleneck, leaving us uncertain about the limits of AI’s capabilities and its potential impact. Furthermore, with the launch of GPT-4o, the arrival of AI in our daily lives seems to have accelerated significantly.
The integration of AI has been one of the most significant topics within the cryptocurrency space over the past year, with related concept tokens experiencing a price surge. From a commercial standpoint, blockchain-based AI agents represent one of the most potent use cases for AI + crypto, and they will make it easier to realize intents.
This is because the rules on the blockchain are deterministic, with clear boundaries and no black boxes.
As we noted in our April 2024 report, AI thrives within blockchain systems fundamentally because the rules of the crypto economy are explicitly defined and the system is permissionless. Operating under clear guidelines significantly reduces the risks tied to AI’s inherent stochasticity. For example, AI’s dominance over humans in chess and video games stems from the fact that these environments are closed sandboxes with straightforward rules. Conversely, advancements in autonomous driving have been more gradual: open-world challenges are more complex, and our tolerance for AI’s unpredictable problem-solving in such scenarios is markedly lower.
In this setting, given sufficient information, AI can “identify the optimal solution more quickly than humans” for specific problems, helping people realize their intents more easily and quickly.
An Analysis of dappOS
Business Overview
dappOS is an intent execution network that simplifies the way users interact with decentralized applications (dApps) and public blockchains. Here’s how it works: users simply communicate their desires, such as wanting to “buy $1000 worth of $BRETT,” directly to dappOS. From there, dappOS takes over, managing all the necessary interactions with various dApps and public blockchains to execute the transactions required to fulfill the user’s intent.
The ambition behind the name dappOS is clear—they aim to become the operating system for dAPPs, much like Windows 95 did for personal computers.
dappOS has established an open dual-marketplace within its ecosystem. On the demand side are the developers who create user-facing applications, and on the supply side are the service nodes that execute these user intents. To maintain a high standard of service, dappOS utilizes an Optimistic Minimum Staking (OMS) mechanism.

Framework of dappOS Network

Key roles and their functions within the dappOS ecosystem are defined as follows:
- Users: publish intent tasks based on the framework provided by dappOS.
- Service Providers: execute the intent services. After depositing a specific amount of dappOS tokens as collateral, they can start accepting intents from users and generating revenue.
- Execution Validators: oversee the performance of service nodes and have the authority to impose penalties on any service node that fails to fulfill its duties as required.
- Matchers: match users with suitable service providers.
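To make the division of labor more concrete, here is a minimal, hypothetical data model of these roles in TypeScript. The field names and shapes are our own illustration based on the description above, not dappOS’s actual interfaces.

```typescript
// Hypothetical shapes for the four roles described above; illustrative only.

interface Intent {
  id: string;
  user: string;       // address of the user publishing the intent
  goal: string;        // e.g. "buy $1000 of BRETT on Base"
  valueUsd: number;     // approximate value at stake, relevant to the OMS collateral check
  deadline: number;     // unix timestamp by which the intent must be executed
}

interface Quote {
  intentId: string;
  provider: string;     // service provider offering to execute the intent
  feeUsd: number;       // what that provider charges
}

interface ServiceProvider {
  address: string;
  stakedTokens: number;        // dappOS tokens deposited as collateral
  pendingIntentValueUsd: number; // total value of intents currently in flight
}

interface ExecutionValidator {
  address: string;
  // Validators watch completed tasks and can raise a challenge; disputes are
  // settled by PoS voting, and failed providers are slashed to compensate users.
  challenge(intentId: string): void;
}

// A matcher routes: it collects quotes for an intent and lets the user pick one.
type Matcher = (intent: Intent, quotes: Quote[]) => Quote | undefined;

// Example matcher: pick the cheapest quote for the user.
const cheapestQuote: Matcher = (_intent, quotes) =>
  quotes.slice().sort((a, b) => a.feeUsd - b.feeUsd)[0];
```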

Workflow of dappOS Network

During the execution process, users interact with the system through a frontend interface to submit their intents to matchers. The matchers then approach the relevant service providers to gather quotes for these intents and present them to the users. If the users deem the quotes acceptable, they can select their preferred service provider to carry out the task. After making their selection, users will authorize the intent by signing it and transferring the necessary resources to the service provider, who then executes the intent as specified.
After the specified task duration, numerous validators verify whether the task has been successfully executed. If anyone identifies that the task was not completed, a challenge can be raised within the network. The validators will then use PoS voting to reach a consensus. If it is agreed that the task was not executed as required, the service provider is obligated to use their deposited collateral to compensate the user.
In addition to its core functionalities, dappOS implements an Optimistic Minimum Staking (OMS) mechanism. This approach requires service nodes to stake an amount slightly above the total value of the intents they are tasked with, enabling them to provide services with a minimal financial commitment (the “minimum” in OMS). It also permits service nodes to execute tasks before verification (the “optimistic” part). If the service outcomes are validated successfully by the validators, the service nodes are entitled to their revenue from the tasks. However, if a task is found to be unsuccessful, the system imposes penalties on the service nodes, and the users are compensated as previously agreed.
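As a rough numerical illustration of OMS (a toy model of our own, not the actual dappOS implementation, with the 5% margin chosen arbitrarily): a node that wants to execute $10,000 worth of pending intents must keep slightly more than $10,000 staked, and a failed task is paid for out of that stake.

```typescript
// Toy model of Optimistic Minimum Staking (OMS); all numbers are illustrative.

const MIN_STAKE_MARGIN = 1.05; // stake must slightly exceed the pending intent value

interface NodeState {
  stakeUsd: number;              // collateral deposited by the service node
  pendingIntentValueUsd: number; // value of intents it is currently executing
}

// "Minimum": can the node take on another intent of this size?
function canAcceptIntent(node: NodeState, intentValueUsd: number): boolean {
  const required = (node.pendingIntentValueUsd + intentValueUsd) * MIN_STAKE_MARGIN;
  return node.stakeUsd >= required;
}

// "Optimistic": execution happens before verification. If validators later agree
// (via a challenge and PoS vote) that the task failed, the node is slashed and
// the user is compensated from its stake; otherwise the node earns its fee.
function settle(node: NodeState, intentValueUsd: number, feeUsd: number, failed: boolean) {
  node.pendingIntentValueUsd -= intentValueUsd;
  if (failed) {
    node.stakeUsd -= intentValueUsd;        // slashed to make the user whole
    return { toUser: intentValueUsd, toNode: 0 };
  }
  return { toUser: 0, toNode: feeUsd };
}

// Example: $12,000 staked comfortably covers a $10,000 intent (12,000 >= 10,500).
const node: NodeState = { stakeUsd: 12_000, pendingIntentValueUsd: 0 };
if (canAcceptIntent(node, 10_000)) {
  node.pendingIntentValueUsd += 10_000;     // intent accepted optimistically
}
console.log(settle(node, 10_000, 25, false)); // success: { toUser: 0, toNode: 25 }
```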
The OMS mechanism aims to strike a balance between task efficiency, capital efficiency, and overall system security. Its goal is to ensure that user tasks are completed successfully while reducing the financial burden on service providers as much as possible. Additionally, through the intent execution network, ordinary users can benefit from the cost and efficiency of execution that professional service organizations provide. For instance, professional service providers have access to unique channels such as VIP accounts on exchanges with very low fees, the ability to aggregate multiple transactions to conserve gas, and superior capabilities for combating MEV on the blockchain. Moreover, competitive dynamics among service providers ensure that service costs are driven down to the most favorable levels, directly benefiting users. The dappOS intent execution network enables even ordinary users to enjoy the same level of service quality and cost-effectiveness that large institutions do.
dappOS currently provides three distinct intent frameworks:
- Intent Trading: tailored to optimize spot trading by helping users secure the most favorable costs.
- Intent Assets: within the dappOS ecosystem, this framework treats a series of equivalent assets in a fungible way, addressing both the yield-generating and trading properties of the assets.
- Intent-based dApp Interaction: a practical tool for specific activities, like the “buying $1000 of $BRETT” example. It streamlines the process of interfacing with dApps and efficiently facilitates asset bridging.
The Intent Assets framework smooths out the variations between different blockchains and different fungible assets, thus eliminating much of the work involved in cross-chain or fungible asset exchanges and enhancing user experience. We will take stablecoins and $ETH, which are currently the most universally utilized crypto assets, as examples. 
- In the case of stablecoins, users can deposit USDT or USDC from any blockchain into their dappOS account to mint intentUSD. Once minted, intentUSD can be invested in various stablecoin projects that generate interest automatically, with the liquidity of funds transparent and visible to the user, ensuring both earnings and transparency. This setup also simplifies transactions: when users wish to transfer USDT to a centralized exchange, they can use intentUSD directly; similarly, if they need to deposit USDC into GMX as collateral, they can conveniently draw on intentUSD for that purpose.
- In the case of $ETH, when users convert their deposits into intentETH, it likewise begins to accrue interest automatically. Additionally, intentETH grants the flexibility to purchase assets on any blockchain: users can readily swap intentETH for $QUICK on Polygon or $JOE on Avalanche, deposit it into Aave on Arbitrum for lending or borrowing, or use it as gas on Ethereum and Layer2 networks, depending on demand.
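The sketch below captures the basic accounting idea behind intent assets as described above: deposits of equivalent assets from any chain are pooled into one fungible balance that keeps earning yield until it is spent somewhere. The class name, the 5% yield figure, and the method signatures are hypothetical illustrations, not the dappOS API.

```typescript
// Hypothetical model of an intent asset (e.g. intentUSD); illustrative only.

interface Deposit {
  chain: string;   // "ethereum", "arbitrum", "base", ...
  token: string;   // "USDC" or "USDT"
  amount: number;
}

class IntentUsdAccount {
  private balance = 0;           // one fungible balance across all chains
  private readonly apr = 0.05;   // assumed auto-compounded stablecoin yield

  // Depositing USDC or USDT from any chain mints the same intentUSD.
  deposit(d: Deposit): void {
    this.balance += d.amount;
  }

  // Yield accrues automatically while the balance sits idle.
  accrue(days: number): void {
    this.balance *= 1 + (this.apr * days) / 365;
  }

  // Spending works on any chain: e.g. post USDC collateral to GMX on Arbitrum,
  // or send USDT to a centralized exchange, without a manual bridging step.
  spend(chain: string, token: string, amount: number): boolean {
    if (amount > this.balance) return false;
    this.balance -= amount;
    console.log(`delivered ${amount} ${token} on ${chain}`);
    return true;
  }
}

const acct = new IntentUsdAccount();
acct.deposit({ chain: "ethereum", token: "USDC", amount: 600 });
acct.deposit({ chain: "base", token: "USDT", amount: 400 });
acct.accrue(30);
acct.spend("arbitrum", "USDC", 500); // usable on a chain the user never bridged to
```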

It’s clear that intent assets have advantages:
- They strike a balance between profitability and convenience. For instance, while sDAI provides stable returns from real-world assets (RWAs), its liquidity is insufficient for large transactions; conversely, widely used assets like USDT and USDC, despite their adoption, do not accrue earnings. intentUSD effectively bridges these gaps, offering both liquidity and returns.
- They eliminate the slippage between similar assets. For example, intentETH allows for a cost-free bridge among ETH on mainnet, ETH on Arbitrum, stETH, and aETH.
Intent assets also demonstrate advantages over other traditional yield-bearing assets:
- LST or LRT assets: unlike these assets, intentETH doesn’t need to be locked up, thus providing better liquidity.
- sDAI or RWA-based stablecoins: unlike these stablecoins, intentUSD is always available for trading.
- Flexible earn products on centralized exchanges: intent assets can be easily used across dApps.
- Lending protocols and DeFi yield platforms: intent assets offer the unique capability to move seamlessly across blockchains and are available for immediate use in trading.
Overall, the experience offered by intent assets closely mirrors services like money market funds, offering a new trade-off among the profitability, usability, and convenience of assets.
Enhanced user experiences are essential to onboarding hundreds of millions, perhaps even billions, of users to the crypto economy. The crypto economy is still in its early stages, and as pioneers we have grown accustomed to using multiple stablecoins, such as USDT and USDC, efficiently exchanging between them at minimal cost and choosing the right asset for each situation. In the physical world, however, no one expects to handle “JPMorgan dollars” and “Citibank dollars” as separate currencies, despite how similar they are. For newcomers to the crypto economy, grasping technical details such as “the differences between L1 and L2” or “how to execute cross-chain transactions” might not be necessary, much like how most people are unaware of the intricacies of interbank clearing systems. They simply require a more straightforward way to meet their needs. Bridging this gap in user experience is precisely the goal of intent execution projects.
Business Partnership
dappOS operates with a business model that relies heavily on collaboration with other dApps, and it has successfully forged partnerships with a wide array of them.
In January 2023, dappOS formed a partnership with the perpetuals platform GMX. As part of this collaboration, dappOS launched a specialized website, gmx.dappOS.com. This integration allows GMX users to streamline their trading operations directly through dappOS. Key benefits include a substantial 20% reduction in trading costs and the added convenience of paying gas fees with any token. During the first quarter of 2024, weekly active users peaked at over 6,000; more recently, the figure has held at around 1,000. The integration has also handled nearly $150 million in trading volume, with daily highs of over $10 million.

Source: Dune

dappOS has also established partnerships with KyberSwap, a DEX, and Benqi, a lending protocol on the Avalanche network. These collaborations have generated substantial engagement: weekly unique dappOS users registered via KyberSwap exceed 3,000, while Benqi maintains around 1,000.

Source: Dune
Moreover, dappOS has also forged partnerships with major public blockchains including Avalanche, zkSync, and Polygon, alongside collaborations with DeFi protocols such as Quickswap, MakerDAO, and Frax.

Source: dappOS
Financing Background
dappOS has completed three funding rounds. 
It was selected for the fifth season of the Binance Labs Incubation Program in November 2022, and subsequently secured Pre-Seed funding from Binance Labs on June 20, 2023. The details of the funding amount, however, have not been publicly disclosed.
On July 21, 2023, dappOS closed its seed funding round at a valuation of $50 million. This round was led by IDG Capital and Sequoia China. Other notable participants included OKX Ventures, HashKey Capital, KuCoin Ventures, TronDao, Gate Labs, Taihill Ventures, Symbolic Capital, Foresight Ventures, BlueRun Ventures, Mirana Ventures, Leland Ventures, among others.
On March 28, 2024, dappOS finalized its Series A funding round, raising $15.3 million at a valuation of $300 million. Polychain led this round, with participation from Nomad Capital, IDG, Flow Traders, IOBC, NGC, Amber Group, Uphonest, Taihill, Waterdrip, Bing Ventures, Metalpha, Spark Digital Capital, Web3Port Foundation, and Satoshi Lab, among other participants.

Overall, dappOS has a robust investor lineup and recently closed a well-funded $15.3 million round.
Summary 
Intent execution projects focus on improving the user experience in the Web3 industry, aiming to catalyze Web3’s “1995 moment” and foster massive adoption. Over the past year, intent has also emerged as a popular theme for venture capital investment, with many projects converging around the concept.
However, intent execution networks are still nascent, with most projects lacking go-to-market products and clear business models. Specifically, the majority of the dappOS products and mechanisms discussed in this article have yet to launch. Consequently, considerable uncertainty remains about the future trajectory of both the sector and the projects within it.
For projects that may lack short-term deliverables but possess substantial potential for long-term value, two critical indicators stand out: the quality of investment backing and the capacity for commercial expansion.
dappOS is backed by many well-known investors, including top exchanges, traditional venture capital firms, and crypto VCs, giving it an impressive roster of backers. Moreover, its successful partnerships with notable DeFi projects like GMX highlight dappOS’s robust commercial capabilities.
As dappOS continues to develop and expand its leading role in intent execution, we will keep a close watch on its progress.
Understanding Chain Abstraction by Problem Framing

By Lydia Wu, Researcher at Mint Ventures

If you found yourself baffled upon your first encounter with the “chain abstraction” concept, you’re not alone. It appears significant, with numerous projects and extensive funding all claiming to be the standard, yet its practical use remains to be discovered. Is “chain abstraction” just another buzzword in the pipeline of new Web3 concepts? This article starts with the concept, returns to the fundamental questions, and aims to make something out of nothing.

TL;DR:
- The purpose of abstraction is to hide complexity, and the levels of abstraction in the Web3 context are often higher than in Web2, making the task more challenging.
- Modularity simplifies the process of building blockchains, while chain abstraction involves restructuring the relationships among chains and enhancing the experience for both users and developers.
- Cross-chain asset transfers, cross-chain communication, interoperability, and chain abstraction form a conceptual hierarchy centered on coordinating state changes (transactions) across different chains, though these concepts often blur into one another in practice.
- Intent-based chain abstraction solutions are becoming a popular architecture, with many component-based products potentially coming together like puzzle pieces to gradually shape the final form of chain abstraction.
- Current discussions and efforts around chain abstraction have yet to break free from an infrastructure-centric orthodoxy. The validity of chain abstraction as a real issue depends on active on-chain engagement, advancements in modularity, and the influx of new users and developers.
- The future of chain abstraction is not a straightforward journey; it requires an evaluation of its impact on long-tail chains and an exploration of non-DeFi applications.

What exactly is chain abstraction? Is chain abstraction a real problem? What kind of problem does chain abstraction belong to? What is the difference between cross-chain, interoperability, and chain abstraction?

Is Chain Abstraction A Real Problem?

Not necessarily. A problem’s validity depends on its context. Imagine asking someone 500 years ago for their opinion on an energy crisis.

So, where does the discussion of chain abstraction come from? The answers vary but often touch on key terms such as the Ethereum roadmap, modularity, intent, and mass adoption. At present, the most compelling perspective seems to be that chain abstraction represents the latter stages of modularity.

A clear definition of chain abstraction is essential to grasp this perspective. In computer science, “abstraction” is the process of extracting high-level operations and concepts from backend processes, intended to simplify comprehension by masking complexity. For example, most Web2 users only need to be familiar with browsers and ChatGPT, remaining oblivious to the underlying complexities, or even to the notion of abstraction itself. Similarly:
- Account Abstraction: hides internal details such as addresses, private keys, and mnemonic phrases to provide a seamless account experience.
- Chain Abstraction: hides internal specifics such as consensus mechanisms, gas fees, and native tokens to provide seamless operation across chains.

In traditional software development, abstraction and modularity are interconnected and critical concepts. Abstraction outlines the system’s structural hierarchy, whereas modularity is the practice of implementing that structure. Each module represents a level of abstraction, and the interactions between modules hide their internal complexities, which aids code extension, reuse, and maintenance. Without abstraction, the boundaries between modules would be intricate and challenging to manage.

Lecture 3 Scribe Notes: Abstraction and Modularity

It’s important to recognize that Web2 products often perform abstraction and modularity within closed or semi-closed ecosystems, concentrating abstraction layers within a single platform or application in controlled environments, typically free of cross-platform or systemic compatibility concerns. In contrast, within the Web3 framework, driven by the commitment to decentralization and open ecosystems, the dynamics between modularity and abstraction are considerably more complex. Although modularity can help address abstraction within a single chain and lower the barriers to chain development, it has not fully solved the abstraction of user and developer experiences in a multi-chain context. There is a noticeable “island effect” among the various chains and ecosystems, particularly evident in the fragmentation of liquidity, developers, and users.

The introduction of chain abstraction involves redesigning the relationships among different chains to facilitate their interconnectivity, integration, and compatibility, as demonstrated in an article released by Near in January of this year.

We can argue that the urgency of chain abstraction as a legitimate concern is closely tied to the evolution of the following factors:
- On-chain activity: whether the presence of diversified dApps leads to increased user engagement on chain.
- Progress in modular blockchains: whether heightened on-chain activity encourages the development of more rollups and appchains.
- Barriers for new users and developers: to what extent the current blockchain environment inhibits the entry of newcomers and developers (referring to the friction in a rising trend, rather than frustration in a stagnant state).

What Kind of Problem Does Chain Abstraction Belong to?

Chain abstraction itself is an abstract concept that operates at a high-dimensional level within the Web3 narrative. This may partly explain why it comes across as both all-encompassing and somewhat perplexing. Specifically, chain abstraction is not a solution but a guiding philosophy.
It is similar to how Bitcoin today, after several halvings, dramatic price fluctuations, and the introduction of ETFs, has transcended its original identity as a technology solution or an asset: it has become a lasting ideology and a crypto totem that embodies core cryptographic values and will continue to steer industry innovation and development well into the future.

Differences and Connections: Cross-chain, Interoperability, and Chain Abstraction

These concepts can be understood on a spectrum from concrete to abstract. They represent a conceptual hierarchy centered on coordinating state changes (transactions) across different chains, yet they involve a great deal of grey area in practical use.

Cross-chain applications and protocols can broadly be divided into two main categories:
- Cross-chain asset transfer: cross-chain bridges, cross-chain automated market makers (AMMs), and cross-chain aggregators.
- Cross-chain communication: protocols such as LayerZero, Wormhole, and Cosmos IBC.

Asset transfer also relies on message passing. In cross-chain asset transfer applications, the message layer typically involves a set of on-chain smart contracts and state-update logic. Abstracting this message-passing functionality into a universal, protocol-layer solution is what defines a cross-chain communication protocol. Cross-chain communication protocols can handle complex operations across blockchains, including governance, liquidity farming, NFT trading, token issuance, and gaming interactions.

Interoperability protocols extend these capabilities further, delving into deeper data processing, consensus, and validation to ensure consistency and compatibility between different blockchains. In practice, however, the two concepts are often two sides of the same coin and can be used interchangeably depending on the context.

The essence of chain abstraction includes blockchain interoperability but adds a further layer focused on enhancing the experiences of users and developers. This aspect is closely linked to the intent narrative that has gained traction in this cycle. The integration of intent with chain abstraction is detailed in the following sections.

What specific issues are involved in chain abstraction? How can chain abstraction be achieved? Why is the integration of intent with chain abstraction significant?

How can chain abstraction be achieved?

Different projects have distinct interpretations of and entry points into chain abstraction. They can be classified into two schools: the Classical School, which emerged from interoperability protocols and focuses on developer-side abstraction, and the Intent School, which incorporates new intent architectures and concentrates more on user-side abstraction.

The roots of the Classical School can be traced back to Cosmos and Polkadot, well before the advent of the chain abstraction concept. Newer entrants like Optimism Superchain and Polygon Agglayer are now focusing on liquidity aggregation and interoperability within the Ethereum L2 ecosystem. Cross-chain communication protocols such as LayerZero, Wormhole, and Axelar are expanding to additional chains and striving for greater adoption to amplify their network effects.

Within the Intent School, L1 projects like Near and Particle Network are devoted to offering comprehensive chain abstraction solutions. Additionally, component-based strategies that tackle specific challenges are prevalent, especially within DeFi protocols, exemplified by UniswapX, 1inch, and Across Protocol.

For both the Classical and Intent schools, the fundamental design principles emphasize secure and fast cross-chain functionality along with intuitive user interactions. Key features include unified user interfaces, seamless cross-chain functionality for dApps, and the management and subsidy of gas fees.

Why is the integration of intents with chain abstraction significant?

“Intent-based protocols” are emerging in abundance, and this section explores why they have become a popular architectural choice and what their implications might be.

Similar to abstraction and modularity, intent is not a concept native to Web3. Intent recognition has been a significant aspect of natural language processing for decades and has been extensively studied in human-computer dialogue. When discussing intent research in Web3, it is impossible to ignore the famous paper by Paradigm. While similar design principles had already been implemented in products like CoWSwap, 1inch, and Telegram bots, it was this paper that formally articulated the essence of intent architecture: users simply define what they want to achieve and leave the complexities of the process to third parties. This philosophy aligns with chain abstraction’s focus on enhancing user experience and offers a distinct, practical solution approach.

The market features a diverse range of frameworks for chain abstraction, with the CAKE framework (Chain Abstraction Key Elements) from Frontier Research being particularly prominent. This framework, which incorporates intent architecture, organizes the various technologies and solutions of chain abstraction into distinct layers: the permission layer, the solver layer, and the settlement layer. Other frameworks have fine-tuned this approach; Everclear, for example, adds a clearing layer between the solver layer and the settlement layer.

Source: Frontier Research

Specifically:
- Permission Layer: centered on account abstraction, this layer acts as the portal through which dApp users request intent quotes.
- Solver Layer: generally an off-chain, third-party layer of solvers tasked with fulfilling user intents.
- Settlement Layer: once users approve transactions, tools such as oracles and cross-chain bridges come into play to guarantee execution.

In the solver layer, solvers are third-party off-chain entities that go by various names in different protocols, such as solvers, resolvers, searchers, fillers, takers, and relayers. They are generally required to stake assets as collateral to be eligible to compete for orders. For users, an intent-based product feels similar to placing a limit order that solvers compete to fill. In cross-chain scenarios, to satisfy user intents quickly, solvers often advance funds and collect a risk premium upon settlement. This arrangement resembles a short-term loan in which the loan duration equals the blockchain state-syncing time and the interest is akin to a service fee.

The comprehensive solutions represented by Near, which aims to combine the permission, solver, and settlement layers into a unified infrastructure, are still in the early proof-of-concept stage, making their utility difficult to observe and evaluate. Conversely, component-based solutions, particularly those in cross-chain DeFi protocols, have already demonstrated advantages over traditional cross-chain solutions.
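Before turning to Across Protocol as a concrete example, the sketch below puts rough numbers on the solver economics described above, in which the solver fronts the user’s funds for roughly the length of the settlement window and the fee behaves like interest on a very short loan. The figures and the simple non-compounded formula are purely illustrative assumptions, not data from any protocol.

```typescript
// Illustrative solver economics: a fill fee viewed as interest on a short-term loan.

interface Fill {
  advancedUsd: number;     // capital the solver fronts on the destination chain
  feeUsd: number;          // fee / risk premium collected at settlement
  settlementHours: number; // how long until the solver is repaid (state-sync time)
}

// Annualized return implied by tying up capital for the settlement window
// (simple, non-compounded approximation).
function impliedAnnualizedRate(f: Fill): number {
  const periodRate = f.feeUsd / f.advancedUsd;
  const periodsPerYear = (365 * 24) / f.settlementHours;
  return periodRate * periodsPerYear;
}

// Example: fronting $5,000 for 2 hours for a $3 fee is a tiny absolute fee for the
// user, but a meaningful annualized yield on the solver's working capital.
const fill: Fill = { advancedUsd: 5_000, feeUsd: 3, settlementHours: 2 };
console.log((impliedAnnualizedRate(fill) * 100).toFixed(1) + "% annualized");
```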
Across Bridge, the flagship product of Across Protocol, utilizes an intent-centric architecture to offer higher speed, lower price, and stronger fee-generating capability among EVM-compatible cross-chain bridges, with its benefits being particularly pronounced in smaller transactions. The bridge speed and fees of different cross-chain products in Jumper Bridge speed and fees of L2-L1 chains on Across Protocol and Stargate Across Protocol has a higher fee-generating capability, Source: DefiLlama According to the roadmap, Across Protocol plans to launch a modular settlement layer to facilitate cross-chain intents in its third phase. Uniswap Labs and Across Protocol have co-proposed ERC-7683, which seeks to simplify the entry process for solvers by standardizing intent expressions and creating a universal network for solvers.  Intent-based chain abstraction solutions will likely become a popular architecture, with many components potentially assembling the ultimate standard of chain abstraction like pieces of a puzzle. What challenges exist in our understanding and implementation of chain abstraction? What issues stem from an infrastructure-centric perspective? What other concerns related to chain abstraction are worth further exploration? What issues stem from an infrastructure-centric perspective?  As leading interoperability protocols, Layerzero has raised a cumulative $290 million and Wormhole $225 million, yet the substantial FDV and low market cap have led their tokens to become symbols of the much-criticized VC tokens of this cycle, undermining market confidence in the chain abstraction. Returning to the cartoon at the beginning, it is apparent that chain abstraction projects, despite their unique technology stacks and token standards, are often branded as “useless infrastructure” due to the stagnant external market growth. Additionally, the downturn in metrics before and after Layerzero’s airdrop has intensified doubts regarding the market demand for “cross-chain communication.” Significant decline in metrics following the airdrop by Layerzero On the ERC-7683 forum page, developers discussed the role of the ERC standard itself in response to criticism that cross-chain asset transfer functionality is too minor, not universal enough. Proponents of minimalist ERCs argue that tool-level standards are sufficient to address existing problems and can be combined with existing standards, making adoption relatively easier. Given that the design philosophy of intent architecture is largely application-focused, “universal, full-stack, and compatible” protocol standards can sometimes become “too vague to be meaningful” or “too complex to address real-world problems.” This leads to an ironic phenomenon: the chain abstraction protocols, which are born to solve fragmentation issues, end up providing fragmented solutions themselves. ERC-7683: Cross Chain Intents Standard What other concerns related to chain abstraction are worth further exploration? Similar to how globalization affects underdeveloped regions, chain abstraction makes it more difficult to maintain TVLs for new and long-tail chains. What effect will this have on the adoption of chain abstraction?A study by Variant suggests that UniswapX may lead to a new situation where long-tail tokens are directed towards AMMs, while mainstream tokens are increasingly filled by off-chain solvers. Is this the future trend for DEXs? 
Will there be a global solver layer stacked on top of the global liquidity layer in the future?Besides DeFi protocols, what other forms might intent-based product architectures take?Will chain abstraction become the next big trend after modularity, or will it turn out to be a major bubble

Understanding Chain Abstraction by Problem Framing

By Lydia Wu, Researcher at Mint Ventures

If you found yourself baffled upon your first encounter with the “chain abstraction” concept, you’re not alone. 
It appears significant, with numerous projects and extensive funding all claiming to be the standard, yet its practical use has yet to materialize. Is "chain abstraction" just another buzzword in the pipeline of new Web3 concepts?
This article will start with the concept, return to the fundamental questions, and try to establish whether there is real substance behind it.
TL;DR:
- The purpose of abstraction is to hide complexity, and the levels of abstraction in the Web3 context are often higher than in Web2, making it more challenging.
- Modularity simplifies the process of building blockchains. Meanwhile, chain abstraction involves restructuring the relationships among chains and enhancing the experience for both users and developers.
- Analyzing cross-chain asset transfers, cross-chain communication, interoperability, and chain abstraction: a conceptual hierarchy centered on coordinating state changes (transactions) across different chains, though these concepts often blur into one another in practice.
- Intent-based chain abstraction solutions are becoming a popular architecture, with many component-based products potentially coming together like puzzle pieces to gradually shape the final form of chain abstraction.
- Current discussions and efforts around chain abstraction have yet to break free from an infrastructure-centric orthodoxy. The validity of chain abstraction as a real issue depends on active on-chain engagement, advancements in modularity, and the influx of new users and developers.
- The future of chain abstraction is not a straightforward journey; it requires an evaluation of its impact on long-tail chains and an exploration of non-DeFi applications.
What exactly is chain abstraction?
- Is chain abstraction a real problem?
- What kind of problem does chain abstraction belong to?
- What is the difference between cross-chain, interoperability, and chain abstraction?
Is Chain Abstraction A Real Problem?
—Not necessarily. A problem’s validity depends on its context. Imagine asking someone 500 years ago for their opinion on an energy crisis.
So, where does the discussion of chain abstraction come from?
The answers vary but often touch on key terms such as the Ethereum roadmap, modularity, intent, and mass adoption. At present, the most compelling perspective seems to be that chain abstraction represents the latter stages of modularity.
A clear definition of chain abstraction is essential to grasp this perspective.
In computer science, “abstraction” is the process of extracting high-level operations and concepts from the backend processes, intended to simplify comprehension by masking complexity. For example, most Web2 users merely need to be familiar with browsers and ChatGPT, remaining oblivious to the underlying complexities or even the notion of abstraction itself.
Similarly: 
- Account Abstraction: hides internal details such as addresses, private keys, and mnemonic phrases so that accounts can be used seamlessly.
- Chain Abstraction: hides chain-level specifics such as consensus mechanisms, gas fees, and native tokens so that operations across chains feel seamless.
In traditional software development, abstraction and modularity are interconnected and critical concepts. Abstraction outlines the system's structural hierarchy, whereas modularity is the practice of implementing this structure. Each module represents a level of abstraction, and the interactions between modules hide their internal complexities, which aids in code extension, reuse, and maintenance. Without abstraction, the demarcations between modules would be intricate and challenging to manage.

Lecture 3 Scribe Notes: Abstraction and Modularity
It’s important to recognize that Web2 products often perform abstraction and modularity within closed or semi-closed ecosystems, concentrating abstraction layers within a single platform or application in controlled environments, typically devoid of cross-platform or systemic compatibility concerns. In contrast, within the Web3 framework, driven by the commitment to decentralization and open ecosystems, the dynamics between modularity and abstraction are considerably more complex.
Although modularity helps deal with abstraction issues within a single chain and reduces the barriers to chain development, it has not entirely addressed the abstraction of user and developer experiences in a multi-chain context. There is a noticeable "island effect" among various chains and ecosystems, particularly evident in the fragmentation of liquidity, developers, and users. The introduction of chain abstraction involves redesigning the relationships among different chains to facilitate their interconnectivity, integration, and compatibility, as demonstrated in an article released by Near in January of this year.
We can argue that the urgency of chain abstraction as a legitimate concern is intimately tied to the evolution of the following factors:
- On-chain activity: Whether the presence of diversified dAPPs leads to increased user engagement on chain.
- Progress in Modular Blockchains: Whether heightened on-chain activities encourage the development of more rollups and appchains.
- Barriers for New Users and Developers: To what extent does the current blockchain environment inhibit the entry of newcomers and developers (referring to the friction in a rising trend, rather than frustration in a stagnant state)?
What Kind of Problem Does Chain Abstraction Belong to?
Chain abstraction itself is an abstract concept that operates at a high-dimensional level within the Web3 narrative. This may partly explain why it presents as both all-inclusive and somewhat perplexing. Specifically, chain abstraction is not a solution but a guiding philosophy.
This is similar to how Bitcoin today, after several halvings, dramatic price fluctuations, and the introduction of ETFs, has transcended its original identity as a technology solution or an asset: it has become a timeless ideology and a crypto totem that embodies core cryptographic values and will continue to steer industry innovation and development well into the future.

Differences and Connections: Cross-chain, Interoperability, and Chain Abstraction
These concepts can be understood on a spectrum from concrete to abstract. They represent a conceptual hierarchy centered on coordinating state changes (transactions) across different chains, yet often involve a great deal of grey area in practical use.

Cross-chain applications and protocols can broadly be divided into two main categories:
- Cross-chain Asset Transfer: such as cross-chain bridges, cross-chain Automated Market Makers (AMMs), and cross-chain aggregators.
- Cross-chain Communication: protocols such as Layerzero, Wormhole, and Cosmos IBC.
Asset transfer also relies on message passing. In cross-chain asset transfer applications, the message layer typically involves a set of on-chain smart contracts and state update logic. Abstracting this message-passing functionality into a universal, protocol-layer solution is what defines a cross-chain communication protocol.
Cross-chain communication protocols can handle complex operations across blockchains, including governance, liquidity farming, NFT trading, token issuance, and gaming interactions. Interoperability protocols extend these capabilities further, delving into deeper data processing, consensus, and validation to ensure consistency and compatibility between different blockchains. In practice, however, these two concepts are often two sides of the same coin and can be used interchangeably depending on the context.
The essence of chain abstraction includes blockchain interoperability but also introduces an additional layer focused on enhancing the experiences of users and developers. This aspect is closely linked to the intent narrative that has gained traction in this cycle. The integration of intent with chain abstraction will be detailed further in the subsequent sections.
What specific issues are involved in chain abstraction?
- How can chain abstraction be achieved?
- Why is the integration of intent with chain abstraction significant?
How can chain abstraction be achieved?
Different projects have distinct interpretations of and entry points into chain abstraction. We can classify them into two schools: the Classical School, which emerged from interoperability protocols and focuses on developer-side abstraction, and the Intent School, which incorporates new intent architectures and concentrates more on user-side abstraction.
The roots of the Classical School can be traced back to Cosmos and Polkadot, well before the advent of the chain abstraction concept. Newer entrants like Optimism Superchain and Polygon Agglayer are now focusing on liquidity aggregation and interoperability within the Ethereum L2 ecosystem. Cross-chain communication protocols such as Layerzero, Wormhole, and Axelar are expanding to additional chains and striving for greater adoption to amplify their network effects.
Within the Intent School, L1 projects like Near and Particle Network are devoted to offering comprehensive chain abstraction solutions. Additionally, component-based strategies that tackle specific challenges are prevalent, especially within DeFi protocols, exemplified by UniswapX, 1inch, and Across Protocol.
For both the Classical and Intent schools, the fundamental design principles emphasize secure and fast cross-chain functionality along with intuitive user interactions. Key features include unified user interfaces, seamless cross-chain functionality for dAPPs, and the management and subsidy of gas fees.

Why is the integration of intents with chain abstraction significant?
“Intent-based protocols” are emerging in abundance, and this section will explore why they have become a popular architectural choice and their potential implications. 
Similar to abstraction and modularity, intent is not a native concept in Web3. Intent recognition has been a significant aspect of natural language processing for decades and has been extensively studied in human-computer dialogues.
When discussing intent research in Web3, it’s impossible to ignore the famous paper by Paradigm. While similar design principles have already been implemented in products like CoWSwap, 1inch, and Telegram Bots, it was this paper that formally articulated the essence of intent architecture: users simply define what they want to achieve and leave the complexities of the process to be handled by third parties. This philosophy aligns with the focus of chain abstraction on enhancing user experience, providing a distinct and practical solution approach.
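To make the contrast concrete, the TypeScript sketch below compares an imperative transaction, where the user specifies every execution detail, with a declarative intent, where the user only states the desired outcome and a third-party solver works out the route. All type and field names are illustrative and not taken from any particular protocol.

```typescript
// Imperative style: the user spells out every step of the execution path.
interface SwapTransaction {
  router: string;        // which DEX router to call
  path: string[];        // exact token route, e.g. [USDC, WETH, TOKEN]
  amountIn: bigint;
  minAmountOut: bigint;
  gasLimit: bigint;      // the user also manages gas on the right chain
}

// Declarative style: the user only states the outcome they want.
interface SwapIntent {
  sellToken: string;
  buyToken: string;
  sellAmount: bigint;
  minBuyAmount: bigint;  // the constraint the solver must satisfy
  deadline: number;      // unix timestamp after which the intent expires
}

// A solver (off-chain third party) competes to turn the intent into an
// execution plan; the user never touches routing, bridging, or gas details.
function solve(intent: SwapIntent): SwapTransaction {
  // ...solver-specific routing logic omitted...
  return {
    router: "0xRouter",  // illustrative placeholder address
    path: [intent.sellToken, intent.buyToken],
    amountIn: intent.sellAmount,
    minAmountOut: intent.minBuyAmount,
    gasLimit: 300_000n,
  };
}
```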
The market features a diverse range of frameworks for chain abstraction, with the CAKE framework (Chain Abstraction Key Elements) from Frontier Research being particularly prominent. This framework, which incorporates intent architecture, organizes the various technologies and solutions of chain abstraction into distinct layers: permission layer, solver layer, and settlement layer. Other frameworks have fine-tuned this approach, such as Everclear, which added a liquidation layer between the solver layer and settlement layer.

Source: Frontier Research

Specifically:
- Permission Layer: central to this layer is account abstraction, acting as the portal for dAPP users to request intent quotes.
- Solver Layer: generally an off-chain, third-party solver layer tasked with fulfilling user intents.
- Settlement Layer: once users approve transactions, tools such as oracles and cross-chain bridges come into play to guarantee the execution of transactions.
In the Solver Layer, solvers are third-party off-chain entities known by various titles—such as solvers, resolvers, searchers, fillers, takers, and relayers—in different protocols. These solvers are generally required to stake assets as collateral to be eligible to compete for orders.
The process of using intent-based products is similar to filling a limit order. In cross-chain scenarios, to quickly satisfy user intents, solvers often advance funds and collect a risk premium upon settlement. This arrangement is similar to a short-term loan where the loan duration is equivalent to the blockchain state syncing time, and the interest is akin to a service fee.
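Under that framing, a solver's quote is roughly destination-chain gas plus the cost of fronting capital for the duration of the state sync plus a margin for settlement risk. The sketch below works through that back-of-the-envelope pricing with made-up numbers; it is not the fee model of any specific protocol.

```typescript
// Back-of-the-envelope solver quote for filling a cross-chain intent.
// All parameters are illustrative, not taken from a real protocol.
function quoteSolverFee(
  amountUsd: number,          // size of the transfer the solver fronts
  syncTimeSeconds: number,    // time until the origin-chain payment settles
  annualCapitalRate: number,  // solver's opportunity cost of capital, e.g. 0.10
  riskPremiumBps: number,     // cushion for reorg / settlement-failure risk
  destinationGasUsd: number   // gas the solver pays on the destination chain
): number {
  const yearSeconds = 365 * 24 * 3600;
  // "Interest" on a loan whose term equals the state-syncing time.
  const capitalCost = amountUsd * annualCapitalRate * (syncTimeSeconds / yearSeconds);
  const riskPremium = amountUsd * (riskPremiumBps / 10_000);
  return destinationGasUsd + capitalCost + riskPremium;
}

// Example: fronting $5,000 for a 30-minute sync at 10% APR with a 5 bps
// risk premium and $0.50 of destination gas costs the user roughly $3.
console.log(quoteSolverFee(5_000, 30 * 60, 0.10, 5, 0.5).toFixed(2));
```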
The comprehensive solutions represented by Near, which hopes to combine permission, solver, and settlement layers into a unified infrastructure, are in the early stages of proof-of-concept, making it difficult to observe and evaluate its utility.
Conversely, component-based solutions, particularly those in cross-chain DeFi protocols, have demonstrated advantages over traditional cross-chain solutions. Across Bridge, the flagship product of Across Protocol, uses an intent-centric architecture to offer higher speed, lower cost, and stronger fee-generating capability than other EVM-compatible cross-chain bridges, with its benefits being particularly pronounced in smaller transactions.

The bridge speed and fees of different cross-chain products in Jumper

Bridge speed and fees of L2-L1 chains on Across Protocol and Stargate

Across Protocol has a higher fee-generating capability, Source: DefiLlama
According to the roadmap, Across Protocol plans to launch a modular settlement layer to facilitate cross-chain intents in its third phase. Uniswap Labs and Across Protocol have co-proposed ERC-7683, which seeks to simplify the entry process for solvers by standardizing intent expressions and creating a universal network for solvers. 
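The gist of such a standard is that every protocol expresses its orders in one shared shape, so a solver written once can fill intents from many sources. The TypeScript sketch below conveys only that idea; the field names are simplified illustrations rather than the actual ERC-7683 struct, so refer to the ERC text for the real definitions.

```typescript
// Illustrative shape of a standardized cross-chain intent. This is a
// simplified approximation for explanation, not the ERC-7683 struct itself.
interface CrossChainIntent {
  user: string;             // who is requesting the fill
  originChainId: number;    // where the user's funds start
  destinationChainId: number;
  inputToken: string;
  inputAmount: bigint;
  outputToken: string;
  minOutputAmount: bigint;  // constraint a filler must satisfy
  fillDeadline: number;     // unix timestamp after which the order is void
}

// Because every venue emits the same shape, one generic filler can serve
// many protocols instead of integrating each order format separately.
function isFillable(
  intent: CrossChainIntent,
  nowUnix: number,
  inventory: Map<string, bigint>  // filler's token balances on the destination chain
): boolean {
  const available = inventory.get(intent.outputToken) ?? 0n;
  return nowUnix < intent.fillDeadline && available >= intent.minOutputAmount;
}
```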
Intent-based chain abstraction solutions will likely become a popular architecture, with many components potentially assembling the ultimate standard of chain abstraction like pieces of a puzzle.
What challenges exist in our understanding and implementation of chain abstraction?
- What issues stem from an infrastructure-centric perspective?
- What other concerns related to chain abstraction are worth further exploration?
What issues stem from an infrastructure-centric perspective? 
As leading interoperability protocols, Layerzero has raised a cumulative $290 million and Wormhole $225 million, yet their high FDVs and low circulating market caps have turned their tokens into symbols of the much-criticized VC tokens of this cycle, undermining market confidence in chain abstraction.
Returning to the cartoon at the beginning, it is apparent that chain abstraction projects, despite their unique technology stacks and token standards, are often branded as “useless infrastructure” due to the stagnant external market growth. Additionally, the downturn in metrics before and after Layerzero’s airdrop has intensified doubts regarding the market demand for “cross-chain communication.”

Significant decline in metrics following the airdrop by Layerzero
On the ERC-7683 forum page, developers discussed the role of the ERC standard itself in response to criticism that cross-chain asset transfer is too narrow a scope and not universal enough. Proponents of minimalist ERCs argue that tool-level standards are sufficient to address existing problems and can be combined with existing standards, making adoption relatively easier.
Given that the design philosophy of intent architecture is largely application-focused, “universal, full-stack, and compatible” protocol standards can sometimes become “too vague to be meaningful” or “too complex to address real-world problems.” This leads to an ironic phenomenon: the chain abstraction protocols, which are born to solve fragmentation issues, end up providing fragmented solutions themselves.

ERC-7683: Cross Chain Intents Standard
What other concerns related to chain abstraction are worth further exploration?
- Similar to how globalization affects underdeveloped regions, chain abstraction makes it more difficult for new and long-tail chains to retain TVL. What effect will this have on the adoption of chain abstraction?
- A study by Variant suggests that UniswapX may lead to a new situation where long-tail tokens are directed towards AMMs, while mainstream tokens are increasingly filled by off-chain solvers. Is this the future trend for DEXs? Will there be a global solver layer stacked on top of the global liquidity layer in the future?
- Besides DeFi protocols, what other forms might intent-based product architectures take?
- Will chain abstraction become the next big trend after modularity, or will it turn out to be a major bubble?

Exploring The Updated AAVEnomics: Buybacks, Profit Distribution, and Safety Module Shift

By Alex Xu, Research Partner at Mint Ventures

Aave has long been on my radar, and just a few days ago, its governance team, ACI, unveiled a draft of Aave’s upgraded tokenomics on the governance forum. This proposal details expected enhancements in key areas, including the token value capture of Aave and improvements to the protocol’s safety modules.
For more insights on Aave, read my recent research Altcoins Keep Falling, Time to Refocus on DeFi, where I thoroughly evaluate its current status, competitive edges, and token valuation.
This article delves into the significantly impactful new proposal, specifically answering the following four key questions:
- What are the main points of the proposal?
- The potential impacts associated with these points.
- The scheduled timeline and prerequisites for the proposal's activation.
- The potential long-term effects of this proposal on the valuation of the $AAVE token.
You can read the proposal here.
The Main Points of the AAVEnomics Proposal 
The proposal, titled “[TEMP CHECK] AAVEnomics Update”, is currently in the preliminary “temperature check” phase of community proposals and was posted on July 25th. The initiator of the proposal is ACI, which can be interpreted as the governance arm of the official Aave team and the central coordinator for community governance. ACI’s major proposals are usually fully communicated with other governance representatives and professional service providers before release, resulting in a high likelihood of approval.
The main points of the [TEMP CHECK] AAVEnomics Update include:
Overview of Aave’s Robust Operational Health and Strong Financial Reserves
Aave continues to lead the DeFi lending space, with revenues substantially outstripping expenses. With reserves mostly in $ETH and stablecoins, there is a timely opportunity to update the tokenomics and begin the distribution of protocol revenues.
Bad Debt Management Update: Transition From the Old Safety Module to The New “Umbrella” System
Aave has established reserves known as the “Safety Module” to address potential bad debts within the protocol. These reserves consist of:
- Staked $AAVE, with a current market value of $275 million
- Staked $GHO, Aave's native stablecoin, with a current market value of $60 million
- Staked $AAVE-$ETH LP tokens, a significant source of on-chain liquidity for $AAVE, currently valued at $124 million
The proposed Umbrella safety system is set to replace the traditional safety module. Specifically: 
The bad debt reserves within the system will be administered by the innovative aToken module, which is funded by user-initiated voluntary deposits. Depositors not only continue to accrue their usual interest but will also receive an additional security subsidy, which will come from Aave’s protocol revenue.
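In yield terms, an Umbrella depositor's return would simply be the ordinary aToken supply rate plus a revenue-funded safety subsidy, in exchange for the deposit being drawn on first if bad debt arises. Below is a minimal sketch of that trade-off; the rates are hypothetical, since the proposal does not fix concrete numbers.

```typescript
// Illustrative Umbrella deposit economics; rates are hypothetical examples,
// not figures from the AAVEnomics proposal.
interface UmbrellaPosition {
  principalUsd: number;
  supplyApy: number;         // normal aToken lending yield, e.g. 0.05
  safetySubsidyApy: number;  // extra incentive paid from protocol revenue
}

// Expected yearly earnings before any slashing event.
function expectedAnnualYield(p: UmbrellaPosition): number {
  return p.principalUsd * (p.supplyApy + p.safetySubsidyApy);
}

// The trade-off: if bad debt appears, covering deposits are written down first.
function applyBadDebt(p: UmbrellaPosition, badDebtShareUsd: number): number {
  return Math.max(0, p.principalUsd - badDebtShareUsd);
}

// Example: $10,000 at a 5% supply APY plus a 2% subsidy earns about $700/year.
console.log(expectedAnnualYield({ principalUsd: 10_000, supplyApy: 0.05, safetySubsidyApy: 0.02 }));
```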
The Evolving Role of Aave Tokens and Initiation of Protocol Revenue Distribution
The Aave staking module remains operational, yet its role has evolved beyond serving as a risk reserve. It now fulfills two vital functions:
- Stakers are eligible to receive distributions from the protocol's profit surplus beyond what is necessary for operations. This distribution is managed by the Aave treasury, which executes periodic buybacks of $AAVE on the secondary market based on community governance proposals, benefiting the stakers.
- Staked $AAVE yields "Anti-GHO," which can be used either to repay GHO debts or be deposited directly into the GHO staking module, thereby enabling $AAVE stakers to also benefit from the profits generated through GHO.
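Mechanically, both channels would scale with a holder's share of total staked $AAVE: bought-back tokens distributed pro rata, and Anti-GHO accruing pro rata and burnable against GHO debt. The proposal describes this only qualitatively, so the sketch below is a plausible pro-rata illustration rather than the actual formula.

```typescript
// Hypothetical pro-rata distribution of a buyback round to AAVE stakers.
// The AAVEnomics draft describes the mechanism qualitatively; the numbers and
// formulas here are assumptions for illustration only.
function buybackShare(
  stakerBalance: number,    // staker's AAVE in the staking module
  totalStaked: number,      // all AAVE staked
  tokensBoughtBack: number  // AAVE bought on the secondary market this round
): number {
  return (stakerBalance / totalStaked) * tokensBoughtBack;
}

// Anti-GHO accrual, assumed proportional to stake, usable to repay GHO debt.
function applyAntiGho(accruedAntiGho: number, ghoDebt: number): { remainingDebt: number; leftover: number } {
  const burned = Math.min(accruedAntiGho, ghoDebt);
  return { remainingDebt: ghoDebt - burned, leftover: accruedAntiGho - burned };
}

// Example: staking 1% of all AAVE during a 100,000-token buyback yields 1,000 AAVE.
console.log(buybackShare(10_000, 1_000_000, 100_000)); // 1000
```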
Modifications to the GHO Staking Module 
Initially, the GHO staking module was responsible for bad debts across the entire Aave protocol. However, following recent changes, its coverage is now specifically limited to bad debts associated with GHO liabilities alone.
Other Updates 
The liquidity of $AAVE will no longer depend on AAVE-ETH staking incentives but will instead be managed by the ALC (Aave Liquidity Committee). The swap from the initial protocol token, $LEND, to $AAVE will be discontinued, with any tokens not exchanged by the deadline allocated to the treasury.
The following graph depicts Aave's new tokenomics:

Effects of the Proposal
There are two primary effects:
- $AAVE now has a more clearly defined value capture and faces less sell pressure, which aligns with the protocol's robust development.
  - The value capture originates from buybacks funded by protocol net revenues plus GHO interest profit.
  - The decrease in sell pressure comes from phasing out the old staking module: Aave will use stablecoins and ETH from protocol revenues for expenditures instead of issuing new $AAVE tokens, reducing market pressure and enhancing scarcity.
- The introduction of the Umbrella system will make the protocol structure more flexible and optimize incentive distribution, raising both the ceiling and the demands of safety governance.
  - Previously, the Aave safety module was incentivized entirely by $AAVE token emissions, which offered limited flexibility. In contrast, the Umbrella safety module, similar to Eigenlayer's AVS model, is a modular system that allows for customized incentives based on asset type, duration, and capacity. Moreover, this change means that Aave's risk team must now consider an additional set of metrics, beyond market size, interest rate curves, and loan-to-value ratios, when evaluating and establishing risk parameters.
Timeline and Prerequisites for Implementation 
ACI said that the rollout of the plan will occur in a phased approach, contingent upon specific prerequisites. It will be structured into three stages, each corresponding to a separate governance proposal, to execute the outlined measures.
Phase I: Staking Module and GHO Update
- Staked GHO (stkGHO) will solely cover the bad debt associated with GHO liabilities.
- The Aave and AAVE-ETH staking modules will be transitioned to "Legacy Safety Modules," maintaining their guarantee roles until they are phased out.
- The cooldown period for withdrawing staked $AAVE will be eliminated.
Prerequisites: Met 
Timing of Implementation: The proposal is set to move forward once it receives sufficient community input and BGD Labs, Aave’s primary community developer, has approved the Umbrella upgrade.
Phase II: $AAVE Token Utility Upgrade and Introduction of New Tokenomics
- The functionality that allowed for a GHO borrow rate discount through AAVE staking will be discontinued.
- The Anti-GHO feature will be introduced, enabling AAVE stakers to accrue Anti-GHO.
- The swap from $LEND to $AAVE will be ended.
Prerequisites:
- The market size of GHO must reach $175 million, up from the current level of approximately $100 million.
- GHO's market liquidity needs to support a $10 million trade with less than 1% price impact. Currently, a trade of about $2.1 million is enough to move the GHO price by 1%.
Phase III: Activation of Fee Switch and Initiation of Buybacks
- Turn off the legacy safety module.
- Activate the aToken mode of the Umbrella safety module, enabling users to back the system with their deposits and earn extra rewards.
- Aave's financial service provider manages the governance-driven buyback of $AAVE tokens and distributes them to Aave stakers, progressively transitioning to an automated process.
Prerequisites: 
The average net asset value in Aave’s revenue pool over the last 30 days must be adequate to cover the operational costs of current service providers for the next two years.
* As of now, Aave’s treasury, excluding AAVE tokens, holds assets valued at approximately $67 million, with 61% in stablecoins, 25% in Ethereum, and 3% in Bitcoin. The forecasted expenses for 2024 are about $35 million, according to the head of ACI. Assuming similar expenses for 2025, the combined expenses for the two years would be around $70 million. Given Aave’s consistent weekly revenue of $1-2 million in 2024, the threshold is nearly met, and it is projected that this level could be achieved within a month.
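The first Phase III prerequisite is essentially a runway test: does the non-AAVE treasury cover two years of service-provider costs? A rough sketch using the approximate figures quoted above (the actual check uses a 30-day average of the revenue pool):

```typescript
// Rough check of the Phase III runway prerequisite using the approximate
// figures cited above; real governance relies on a 30-day average.
const treasuryExAaveUsd = 67_000_000;  // treasury excluding AAVE tokens
const annualExpensesUsd = 35_000_000;  // forecast 2024 service-provider costs
const weeklyRevenueUsd = 1_500_000;    // midpoint of the $1-2M weekly range

const twoYearCosts = 2 * annualExpensesUsd;                       // ~$70M
const shortfall = Math.max(0, twoYearCosts - treasuryExAaveUsd);  // ~$3M
const weeksToMeet = Math.ceil(shortfall / weeklyRevenueUsd);      // ~2 weeks

console.log({ twoYearCosts, shortfall, weeksToMeet });
```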

Asset Type of Aave’s Treasury

Aave’s Protocol Revenue

The annualized revenue from the Aave protocol for the past 90 days must account for at least 150% of the total protocol expenses in 2024, which includes the allocations for $AAVE token buybacks and the funding for the Umbrella safety module.
*Budgets are defined, allocated, and adjusted quarterly by the Aave Finance service provider.
Overall, Phase I is ready for deployment. Phase II could take a few more months, contingent upon the Liquidity Committee’s dedication to and budget for GHO liquidity. The rollout of Phase III is more challenging to forecast due to its dependence on specific budgetary allocations, market dynamics, and revenue streams. Nonetheless, given Aave’s robust revenue performance, meeting the necessary criteria is likely manageable.
Long-Term Impact of the Proposal on $AAVE Token Prices
Over the long term, this proposal establishes a direct connection between the progression of the Aave protocol and the $AAVE token for the first time. It introduces a buyback support mechanism that underpins the token’s floor price while providing token holders with a stream of profit. This strategy is expected to positively influence the token price.
However, given that the implementation of the proposal requires time and will be executed in phases, and that the proposal was only released recently with specific terms still open to discussion and revision, the value capture of the $AAVE token will be a gradual, long-term process.
Nevertheless, should the proposal be successfully executed, Aave, one of the leading DeFi projects, may further attract the interest of value-oriented investors thanks to its robust, transparent governance and its rewarding approach towards token holders. Such interest is likely to extend beyond the crypto community, attracting newcomers to Web3 from traditional financial sectors.
Altcoins Keep Falling, Time to Refocus on DeFi

By Alex Xu, Research Partner at Mint Ventures & Lawrence Lee, Researcher at Mint Ventures

Introduction
Despite being one of the most mature sectors in the crypto space, DeFi has shown disappointing performance in this bull run. Over the past year, the DeFi sector has risen a modest 41.3%, significantly lagging behind the average market growth of 91% and Ethereum's 75.8% rise.

Source: artemis

Focusing on 2024 alone, the DeFi sector's performance can hardly be called positive, with an overall decline of 11.2%.

Source: artemis

However, from my perspective, in the unusual market context where altcoin prices have collectively dropped following BTC's new high, some leading DeFi projects may have reached the optimal time for strategic investment.
In this article, I aim to clarify the current value of DeFi by exploring the following questions:
- The reasons behind altcoins' significant underperformance compared to BTC and Ethereum in this cycle
- Why now is the best time to focus on DeFi
- Some DeFi projects worth paying attention to, along with their sources of value and associated risks
This article does not aim to cover all DeFi projects with investment potential in the market. The DeFi projects mentioned are merely analysis examples and are not financial advice. Please note that this article reflects my current thinking, which may evolve; the opinions here are subjective, and there may be errors in facts, data, or logical reasoning. Feedback and discussion are welcome.

The Enigma of the Steep Decline in Altcoin Prices
In my view, the disappointing performance of altcoin prices in this cycle can be attributed to three main internal factors within the crypto industry:
- Insufficient Demand Growth: there is a lack of attractive business models, and most crypto sectors are far from achieving product-market fit (PMF).
- Supply-Side Overgrowth: with infrastructure becoming more robust and entry barriers lowering, new projects have been issued in excess.
- Persistent Token Unlocking: the continuous unlocking of tokens from low-circulation, high-FDV (Fully Diluted Valuation) projects has created significant selling pressure.
Let's look at each of the three reasons in detail.

Insufficient Demand Growth: The First Bull Market Lacking Innovative Narratives
In my early March article, "Preparing for Primary Wave: My Periodic Strategy on This Bull Market Cycle," I pointed out that this bull market lacks the scale of business innovation and narratives seen in the DeFi boom of 2021 and the ICO surge of 2017. Hence, the strategy should be to overweight BTC and ETH (benefiting from the influx of funds from ETFs) and to control the allocation to altcoins. To date, that view has proven accurate.
The absence of new business stories has reduced the influx of entrepreneurs, investments, users, and funds. More importantly, it has dampened investors' overall expectations for industry growth. When the market lacks compelling stories like "DeFi will disrupt traditional finance," "ICO is a new paradigm for innovation and financing," and "NFTs are revolutionizing the content industry ecosystem," investors naturally gravitate towards sectors with new narratives, such as AI.
However, I do not support overly pessimistic views.
Although we have not yet seen attractive innovations in this cycle, the infrastructure is continuously improving:
- The cost of block space has decreased significantly across both Layer 1 (L1) and Layer 2 (L2) solutions.
- Cross-chain communication solutions are becoming more comprehensive, offering a wide array of options.
- Wallets have upgraded their user experience. For example, Coinbase's smart wallet supports keyless quick creation and recovery and the direct use of CEX balances, and eliminates the need to top up gas, giving users a better product experience.
- Solana's Actions and Blinks features allow interactions with the Solana blockchain to be published in any common internet environment, further shortening the user journey.
This infrastructure is like real-world water, electricity, coal, and roads. It is not the result of innovation itself, but the soil from which innovation springs.

Excessive Supply Growth: Over-Issuance of Projects and Continuous Token Vesting of High-Market-Cap Projects
In fact, looking from another angle, although the prices of many altcoins have hit new lows for the year, the total market capitalization of altcoins relative to BTC has not suffered as severely.

Trading view, June 25, 2024

BTC's price has fallen by approximately 18.4% from its peak, while the total market cap of altcoins has only decreased by 25.5%. Note: altcoins' market cap refers to the total crypto market cap excluding BTC and ETH, namely "Total3" in the Trading View system.

Trading view, June 25, 2024

The limited decline in the total market cap of altcoins occurs against a backdrop of significant expansion in the total quantity and market cap of newly issued altcoins. The chart below clearly illustrates that the growth in new tokens during this bull market is the most rapid in history.

New Tokens by Blockchain

It is important to note that the above data only includes newly issued tokens on EVM chains, with over 90% issued on the Base chain. In reality, even more new tokens have been issued on Solana. Whether on Solana or Base, most of the newly issued tokens are memecoins. Among the high-market-cap memecoins that have emerged in this bull market are:

Memecoins | Circulating Market Cap ($million)
Dogwifhat | 2,040
Brett | 1,660
Notcoin | 1,610
DOG•GO•TO•THE•MOON | 630
Mog Coin | 560
Popcat | 470
Maga | 410

In addition to memecoins, a large number of infrastructure concept tokens have been or will be listed on exchanges this year.

Layer2 Solutions
L2 Solutions | Circulating Market Cap ($billion) | FDV ($billion)
Starknet | 0.93 | 7.17
ZKsync | 0.61 | 3.51
Manta Network | 0.33 | 1.02
Taiko | 0.12 | 1.9
Blast | 0.48 | 2.81

Cross-chain Services
Cross-chain Services | Circulating Market Cap ($billion) | FDV ($billion)
Wormhole | 0.63 | 3.48
Layerzero | 0.68 | 2.73
Zetachain | 0.23 | 1.78
Omni Network | 0.147 | 1.42

Modular Blockchains
Modular Blockchains | Circulating Market Cap ($billion) | FDV ($billion)
Altlayer | 0.29 | 1.87
Dymension | 0.3 | 1.59
Saga | 0.14 | 1.5

*Market cap data is sourced from Coingecko as of June 28, 2024.

Additionally, many tokens listed on CEXs are facing substantial vesting. These tokens commonly have low circulation ratios and high FDVs, and underwent early VC funding rounds that left institutions with low token costs. The combination of weak demand and narratives, along with over-issuance on the supply side, is an unprecedented situation in the crypto cycle.
Despite deliberate efforts to sustain valuations by further reducing the token circulation ratio at the time of listing (from 41.2% in 2022 to 12.3% in 2024) and gradually selling to secondary investors, the convergence of these factors has led to an overall downward shift in the valuation of these crypto projects. In 2024, only a few concepts, such as Memecoins, CEX, and DePIN, have managed to maintain positive returns.

MC/FDV of Newly-launched Tokens

However, from my perspective, the valuation collapse of high-market-cap VC tokens is a normal market response to various crypto anomalies:
- Duplicate creation of rollups, leading to a ghost-town phenomenon in which high-TVL projects are filled with bots but lack authentic users
- Raising funds by rebranding terms while providing essentially similar solutions, especially among cross-chain communication services
- Launching projects based on trends rather than actual user needs, such as numerous AI+Web3 projects
- Projects that fail to find profitable models, whose tokens have no ability to capture value
The overall decline in the valuation of these altcoins is a result of the market's self-correction. It is a healthy process of the bubble bursting, where funds vote with their feet, leading to a market clearing and self-rescue. The reality is that most VC coins are not entirely without value; they were simply overpriced, and the market has finally adjusted them to their rightful positions.

The Right Time to Focus on DeFi: PMF Products Emerging from the Bubble
Since 2020, DeFi has officially become a category within the altcoin ecosystem. In the first half of 2021, the Top 100 crypto market cap rankings were dominated by DeFi projects, with a dizzying array of subcategories all aiming to replicate every existing business model in traditional finance on the blockchain. During that year, DeFi was the fundamental module of public chains, and DEXs, lending platforms, stablecoins, and derivatives became essential components for any new public chain.
However, with the over-issuance of similar projects, numerous hacker attacks or insider jobs, and the rapid collapse of TVL built on Ponzi-like schemes, the once-soaring token prices spiraled down to zero. As we enter the current bull market cycle, the price performance of most surviving DeFi projects has been unsatisfactory, and primary-market investment in the DeFi sector has dwindled. As is typical at the start of any bull market, investors are most attracted to the new narratives emerging in the cycle, and DeFi does not fall into that category.
However, it is precisely for this reason that DeFi projects, having survived the bubble, are beginning to look more attractive than other altcoin projects. Specifically:

Business Overview: Mature Business Models and Profit Models, with Leading Projects Holding Competitive Edges
DEXs and derivatives platforms earn trading fees, lending platforms generate income from interest spreads, stablecoins collect stability fees, and staking projects charge fees for their services. These sectors have clear profit models. Leading projects in each sector have organic user demand, have largely moved past the subsidy phase, and some have achieved positive cash flow even after accounting for token emissions.

Rankings of Profitable Crypto Projects

According to statistics from Token Terminal, as of 2024, 12 of the top 20 most profitable protocols are DeFi projects.
By category, they include:
- Stablecoins: MakerDAO, Ethena
- Lending: Aave, Venus
- Staking Services: Lido
- DEX: Uniswap Labs, PancakeSwap, Thena (earning from trading fees)
- Derivatives: dYdX, Synthetix, MUX
- Yield Aggregators: Convex Finance
These projects have various competitive advantages, which can derive from multi-sided or bilateral network effects, user habits and brand recognition, or unique ecosystem resources. However, the leading DeFi projects in their respective sectors share some common traits: stabilizing market share, fewer late-coming competitors, and pricing power over their services. We will analyze these DeFi projects in detail later.

Supply Side: Low Emissions, High Circulation Ratio, Minimal Token Unlocking
In the previous section, we noted that one of the main reasons for the continuous decline in altcoin valuations this cycle is the high emissions of many projects at inflated valuations, coupled with the negative expectations created by large-scale token unlocks. In contrast, leading DeFi projects, thanks to their earlier launch dates, have mostly passed their peak token emission periods, and institutional tokens have largely been released, resulting in minimal future selling pressure. For example, Aave's current token circulation ratio is 91%, Lido's is 89%, Uniswap's is 75.3%, MakerDAO's is 95%, and Convex's is 81.9%. This indicates low future dumping pressure, but it also means that whoever wants to gain control of these projects will have to buy tokens from the market.

Valuation Analysis: Divergence Between Market Attention and Business Metrics, Valuation Levels at Historical Lows
Compared to new concepts like Meme, AI, DePIN, Restaking, and Rollup services, DeFi has gained very little attention in this bull market, and its price performance has been mediocre. However, the core business metrics of leading DeFi projects, such as trading volume, lending scale, and profit levels, have continued to grow. This divergence between price and business metrics has pushed the valuation levels of some leading DeFi projects to historical lows.
Take the lending protocol Aave as an example. While its quarterly revenues (referring to net income, not total protocol fees) have surpassed the highs of the last cycle and hit all-time highs, its PS ratio (circulating market cap / annualized revenue) has hit an all-time low and currently stands at just 17.4x.

Tokenterminal

Regulation: The FIT21 Act Is Favorable for DeFi Compliance and May Trigger Potential M&A
FIT21, the Financial Innovation and Technology for the 21st Century Act, aims to establish a clear federal regulatory framework for the digital asset market, enhance consumer protection, and promote U.S. leadership in the global digital asset market. Proposed in May 2023 and passed by a wide margin in the House of Representatives on May 22 of this year, the Act clarifies the regulatory framework and rules for market participants. Once officially passed, it will facilitate investment in DeFi projects for both startups and traditional financial entities. Given the recent embrace of crypto assets by traditional financial institutions like BlackRock, such as promoting ETF listings and issuing bond assets on Ethereum, DeFi is likely to be a major focus area for them in the coming years. For traditional financial giants, mergers and acquisitions could be one of the most convenient options, and any sign of relevance, even mere acquisition intentions, would trigger a revaluation of leading DeFi projects.
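For reference, the multiples used above and in the project sections that follow are straightforward: the PS ratio divides circulating market cap by annualized protocol revenue, and the PF variant divides by annualized protocol fees. A minimal sketch with illustrative placeholder inputs:

```typescript
// Valuation multiples as used in this piece: PS = circulating market cap /
// annualized revenue, PF = circulating market cap / annualized fees.
// The input figures below are illustrative placeholders, not live data.
function annualize(quarterlyUsd: number): number {
  return quarterlyUsd * 4;
}

function psRatio(circulatingMcapUsd: number, annualRevenueUsd: number): number {
  return circulatingMcapUsd / annualRevenueUsd;
}

function pfRatio(circulatingMcapUsd: number, annualFeesUsd: number): number {
  return circulatingMcapUsd / annualFeesUsd;
}

// Example: a $1.4B circulating cap against $20M of quarterly net revenue
// implies a PS of 1.4e9 / 8e7 = 17.5x, in line with the level cited above.
console.log(psRatio(1_400_000_000, annualize(20_000_000)).toFixed(1));
```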
I will analyze the business conditions, competitive advantages, and valuations of selected DeFi projects as examples. Given the multitude of DeFi projects, I will prioritize those with better business development, stronger competitive advantages, and more attractive valuations.

Notable DeFi Projects

Lending Protocol: Aave
Aave stands out as one of the most established DeFi projects, having secured its initial funding round in 2017. Post-funding, Aave evolved from a peer-to-peer lending framework (it was formerly known as Lend) to a sophisticated peer-to-pool model. During the previous bull market, Aave outperformed Compound, the major competitor in its space, and now leads the lending protocols in both market share and market cap.
The core business model of Aave revolves around capturing the spread between borrowing and lending rates. In 2023, Aave introduced its stablecoin, $GHO, which is designed to bolster its interest income. Managing GHO, however, brings additional operational costs, including marketing and liquidity incentives.

Business Overview
For lending protocols, active loan volume is a pivotal indicator, as it is the primary revenue driver for such projects. The chart below illustrates Aave's market share in terms of active loan size over the last year. Aave's proportion of active loans has been on an upward trajectory for the past six months, now claiming a significant 61.1% market share. This percentage might even be understated, as the graph double-counts the loan volumes attributed to Morpho's optimizers, which operate on top of both Aave and Compound.

Tokenterminal

Another vital metric is the protocol's profitability, or its profit margin. Here, profit is calculated as the protocol's revenue minus token incentives. As illustrated in the subsequent graph, Aave's protocol profitability has pulled far ahead of other lending protocols. Aave effectively abandoned the Ponzi-like model of relying on token subsidies to fuel operations, a strategy still employed by others such as Radiant (indicated by the purple segment in the chart).

Tokenterminal

Competitive Advantages
Aave has four foundational strengths:
1. Ongoing Accumulation of Security Credit: Many new lending protocols face security breaches within their first year of operation, yet Aave has maintained a flawless record with no incidents at the smart contract level since its inception. This track record of safe and stable operation is a crucial factor for DeFi users when selecting a lending platform, particularly for high-volume investors or "whales." Justin Sun, for instance, is a notable long-term user of Aave.
2. Bilateral Network Effect: Like many online platforms, DeFi lending operates as a classic bilateral market in which depositors and borrowers form the two sides of supply and demand. An increase in activity on one side, whether deposits or loans, naturally boosts activity on the other, setting a high barrier for new entrants. Moreover, the greater the overall liquidity of the platform, the smoother deposits and withdrawals become and the more likely it is to be favored by whales, which in turn stimulates the growth of the platform's business.
3. Exceptional DAO Governance: Aave has transitioned to a fully decentralized, DAO-based governance system, providing greater transparency and deeper community engagement in decision-making compared to centralized management.
The Aave DAO boasts a vibrant ecosystem of governance participants, including top venture capitalists, university blockchain clubs, market makers, risk management firms, third-party developers, and financial advisors. These participants are diverse and actively engaged in governance. From the operational results of the project, Aave, as a latecomer to peer-to-peer lending services, has managed to balance growth and security effectively in product development and asset expansion and realized its overtaking of the industry leader Compound. The DAO governance has played a crucial role in this process. 4. Strategic Positioning in the Multi-Chain Ecosystem: Aave has established a strong presence across nearly all EVM-compatible Layer 1 and Layer 2 networks, consistently ranking at the top in terms of Total Value Locked (TVL) on each. The upcoming Aave V4 will enhance multi-chain liquidity integration, amplifying the benefits of cross-chain liquidity flows. The following chart provides more details. In addition to EVM-compatible chains, Aave is actively assessing other networks like Solana and Aptos, considering potential deployments on these networks in the future. Valuation Insights As per Tokenterminal data, Aave has seen its valuation metrics dip to historical lows due to a steady increase in protocol fees and revenues, along with a persistently low token price. The Price to Sales (PS) ratio, which compares the circulating market cap with protocol revenue, stands at 17.44x, while the Price to Fees (PF) ratio, comparing market cap to protocol fees, is at 3.1x. Tokenterminal Risks and Challenges Although Aave has been successfully expanding its share in the lending market, it faces emerging competition from Morpho Blue, a noteworthy modular lending platform. Morpho Blue offers a flexible suite of modular protocols to third parties aiming to establish their own lending markets. This platform allows for the customization of collaterals, borrowing assets, oracles, and risk parameters, enabling the creation of tailored lending environments. This modular approach has facilitated the entry of numerous new players into the lending space, who have begun to offer lending services. For instance, Gauntlet, previously a risk manager for Aave, opted to sever ties with Aave in favor of launching its own lending market on Morpho Blue. Morpho BLue Morpho Block Analytics Since its launch more than half a year ago, Morpho Blue has experienced rapid growth, now ranking as the fourth-largest lending platform by TVL, just behind Aave, Spark (a MakerDAO-launched copy of Aave v3), and Compound.  Its expansion on Base has been particularly swift, achieving a TVL of $27 million in less than two months, while Aave’s TVL on Base stands at approximately $59 million. Morpho Block Analytics Decentralized Exchanges: Uniswap & Raydium Uniswap and Raydium are key players within the Ethereum EVM and Solana ecosystems, respectively. Uniswap debuted on the Ethereum mainnet with its V1 version in 2018, but it was the introduction of its V2 in May 2020 that catapulted the platform to prominence. Raydium, on the other hand, made its entry on the Solana network in 2021. The rationale behind highlighting two distinct entities in the decentralized exchanges sector is their affiliation with the two most populous Web3 ecosystems: the EVM ecosystem, centered around Ethereum—the leading public blockchain—and the fast-expanding Solana ecosystem. Both projects boast unique advantages and face specific challenges. 
Let’s delve into a detailed analysis of each. Uniswap Business Overview Since its V2 release, Uniswap has maintained its position as the leading decentralized exchange (DEXs) in terms of trading volume across the Ethereum mainnet and other EVM-compatible chains. We focus primarily on two key metrics: trading volume and trading fees. The chart below illustrates the monthly trading volume share of Uniswap V2 from its launch, excluding trading volumes on non-EVM chains: Tokenterminal From the launch of its V2 in May 2020, Uniswap’s market share reached a peak of 78.4% in August 2020 and then declined to a bottom of 36.8% during the most fierce DEX wars in November 2021. It has since rebounded to a stable 56.7%, evidencing its ability to withstand tough competition.  Tokenterminal This trend is also mirrored in its share of trading fees; after bottoming out at 36.7% in November 2021, Uniswap’s market share in fees has steadily climbed, now standing at 57.6%. Remarkably, Uniswap has largely refrained from subsidizing liquidity with tokens, except for brief periods in 2020 on the Ethereum mainnet and at the end of 2022 on the Optimism mainnet. This restraint stands in stark contrast to most other DEXs, which continue to rely on liquidity incentives to this day. The chart below illustrates the monthly incentives of major DEXs. It can be observed that Sushiswap, Curve, Pancakeswap, and Aerodrome, a project adopts ve(3,3) model and built on Base, all of which at one point had the largest subsidy amounts. However, none of them have managed to secure a higher market share than Uniswap. Tokenterminal One persistent critique of Uniswap is that despite not engaging in token incentives, its tokens still lack utility, as the protocol has not activated $UNI as gas fee. In late February 2024, Erin Koen, a Uniswap developer and governance lead in Uniswap foundation, submitted a proposal to upgrade the protocol. This would introduce a fee structure to benefit $UNI holders who have approved and delegated their tokens, leading to significant community debate. Although the vote was initially planned for May 31, it has been postponed and remains pending. Despite these delays, Uniswap has taken initial steps toward enabling fees and enhancing the utility of $UNI tokens, with the revised contract already developed and audited. Uniswap will have a separate revenue stream from the protocol in the near future. Additionally, Uniswap Labs started implementing a swap fee in October 2023 for users trading through the official Uniswap website and the Uniswap wallet. The fee is set at 0.15% and involving ETH, USDC, WETH, USDT, DAI, WBTC, agEUR, GUSD, LUSD, EUROC, and XSGD. However, it is important to note that swap between stablecoins and wraps between ETH and WETH are excluded from this fee. Simply initiating a fee structure on Uniswap’s interface has positioned Uniswap Labs as one of the highest revenue-generating teams within the Web3 space. With the anticipated activation of protocol layer fees and based on the annualized fees from the first half of 2024, Uniswap could generate around $1.13 billion annually. If the protocol charges a 10% fee ratio, this would translate to an annual protocol revenue of approximately $113 million.  Additionally, the expected launches of Uniswap X and V4 later this year are set to potentially boost its market share in trading volumes and fees even further. Competitive Advantages The competitiveness of Uniswap are underpined by the three key factors: 1. 

Altcoins Keep Falling, Time to Refocus on DeFi

By Alex Xu, Research Partner at Mint Ventures & Lawrence Lee, Researcher at Mint Ventures
Introduction
Despite being one of the most mature sectors in the crypto space, DeFi projects have shown disappointing performance in this bull run. Over the past year, the DeFi sector has seen a modest increase of 41.3%, significantly lagging behind the average market growth of 91% and Ethereum’s 75.8% rise.

Source: Artemis
Focusing on 2024 data alone, the DeFi sector's performance is hard to call positive, with an overall decline of 11.2%.

Source: Artemis

However, from my perspective, in the unusual market context where altcoin prices have collectively dropped following BTC's new high, some leading DeFi projects may now be at their most attractive point for strategic investment.
In this article, I aim to clarify the current value of DeFi by exploring the following questions:
1. The reasons behind altcoins' significant underperformance relative to BTC and Ethereum in this cycle
2. Why now is the best time to focus on DeFi
3. Some DeFi projects worth paying attention to, along with their sources of value and associated risks
This article does not aim to cover every DeFi project with investment potential in the market; the projects mentioned are merely examples for analysis and do not constitute financial advice.
Please also note that this article reflects my thinking as of the time of writing and may evolve; the opinions here are subjective, and there may be errors in facts, data, or reasoning. Feedback and discussion are welcome.
The Enigma of the Steep Decline in Altcoin Prices
In my view, the disappointing performance of altcoin prices in this cycle can be attributed to three main internal factors within the crypto industry:
1. Insufficient Demand Growth: There is a lack of attractive business models, and most crypto sectors remain far from product-market fit (PMF).
2. Supply-Side Overgrowth: With infrastructure becoming more robust and entry barriers falling, new projects have been issued in excess.
3. Persistent Token Unlocking: The continuous unlocking of tokens from low-circulation, high-FDV (Fully Diluted Valuation) projects has created significant selling pressure.
Let’s look at each of the three reasons in detail.
Insufficient Demand Growth: The First Bull Market Lacking Innovative Narratives
In my early March article, “Preparing for Primary Wave: My Periodic Strategy on This Bull Market Cycle,” I pointed out that this bull market lacks the scale of business innovation and narratives seen in the DeFi boom of 2021 and the ICO surge of 2017. Hence, the strategy should be to overweight BTC and ETH (benefiting from the influx of funds from ETFs), and to control the allocation to altcoins.
To date, my observation has proven to be accurate.
The absence of new business stories has reduced the influx of entrepreneurs, investments, users, and funds. More importantly, this has dampened investors’ overall expectations for industry growth. When the market lacks compelling stories like “DeFi will disrupt traditional finance,” “ICO is a new paradigm for innovation and financing,” and “NFTs are revolutionizing the content industry ecosystem,” investors naturally gravitate towards sectors with new narratives, such as AI.
However, I do not support overly pessimistic views. Although we have not yet seen attractive innovations in this cycle, the infrastructure is continuously improving:
- The cost of block space has decreased significantly across both Layer 1 (L1) and Layer 2 (L2) solutions.
- Cross-chain communication solutions are becoming more comprehensive, offering a wide array of options.
- Wallets have upgraded their user experience. For example, Coinbase's smart wallet supports keyless quick creation and recovery, direct use of CEX balances, and removes the need to top up gas, giving users a noticeably better product experience.
- Solana's Actions and Blinks features allow interactions with the Solana blockchain to be published in any common internet environment, further shortening the user journey.
This infrastructure is like real-world water, electricity, coal, and roads: it is not innovation itself, but the soil from which innovation springs.
Excessive Supply Growth: Over-Issuance of Projects and Continuous Token Vesting of High-Market-Cap Projects
In fact, looking from another angle, although the prices of many altcoins have hit new lows for the year, the total market capitalization of altcoins relative to BTC has not suffered as severely.

TradingView, June 25, 2024
BTC’s price has fallen by approximately 18.4% from its peak, while the total market cap of altcoins has only decreased by 25.5%.
Note: Altcoins' market cap here refers to the total crypto market cap excluding BTC and ETH, namely "Total3" in the TradingView system.
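To make the comparison above concrete, here is a minimal sketch of the peak-to-current drawdown arithmetic; the price and market-cap figures are placeholders chosen only to roughly reproduce the percentages cited, not the actual June 2024 readings.

```python
def drawdown(peak: float, current: float) -> float:
    """Peak-to-current decline, expressed as a fraction of the peak."""
    return (peak - current) / peak

# Placeholder figures for illustration only, not the actual June 2024 data.
btc_peak, btc_now = 73_700, 60_100          # USD per BTC
total3_peak, total3_now = 1.10e12, 0.82e12  # USD, market cap excluding BTC and ETH ("Total3")

print(f"BTC drawdown:    {drawdown(btc_peak, btc_now):.1%}")        # ~18.5%
print(f"Total3 drawdown: {drawdown(total3_peak, total3_now):.1%}")  # ~25.5%
```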

TradingView, June 25, 2024

The limited decline in the total market cap of altcoins occurs against a backdrop of significant expansion in the total quantity and market cap of newly issued altcoins. The chart below clearly illustrates that during this bull market, the growth trend in new tokens is the most rapid in history.

New Tokens by Blockchain
It is important to note that the above data only includes newly-issued tokens on EVM chains, with over 90% issued on the Base chain. In reality, even more new tokens have been issued on Solana. Whether on Solana or Base, most of the newly issued tokens are memecoins.
Among the high market cap memecoins that have emerged in this bull market are:
| Memecoin | Circulating Market Cap ($ million) |
| --- | --- |
| Dogwifhat | 2,040 |
| Brett | 1,660 |
| Notcoin | 1,610 |
| DOG•GO•TO•THE•MOON | 630 |
| Mog Coin | 560 |
| Popcat | 470 |
| Maga | 410 |
In addition to memecoins, a large number of infrastructure-concept tokens have been or will be listed on exchanges this year.
Layer2 Solutions
| L2 Solution | Circulating Market Cap ($ billion) | FDV ($ billion) |
| --- | --- | --- |
| Starknet | 0.93 | 7.17 |
| ZKsync | 0.61 | 3.51 |
| Manta Network | 0.33 | 1.02 |
| Taiko | 0.12 | 1.9 |
| Blast | 0.48 | 2.81 |
Cross-chain Services
| Cross-chain Service | Circulating Market Cap ($ billion) | FDV ($ billion) |
| --- | --- | --- |
| Wormhole | 0.63 | 3.48 |
| LayerZero | 0.68 | 2.73 |
| Zetachain | 0.23 | 1.78 |
| Omni Network | 0.147 | 1.42 |
Modular Blockchains
| Modular Blockchain | Circulating Market Cap ($ billion) | FDV ($ billion) |
| --- | --- | --- |
| Altlayer | 0.29 | 1.87 |
| Dymension | 0.3 | 1.59 |
| Saga | 0.14 | 1.5 |
*The market cap is sourced from Coingecko as of June 28, 2024.
Additionally, many tokens listed on CEXs are facing substantial vesting unlocks. These tokens commonly have low circulation ratios and high FDVs, and their early VC funding rounds mean institutions acquired them at very low cost.
The combination of weak demand and narratives with over-issuance on the supply side is unprecedented in crypto cycles. Despite deliberate attempts to sustain valuations by further lowering the token circulation ratio at listing (from 41.2% in 2022 to 12.3% in 2024) and gradually selling to secondary-market investors, the convergence of these factors has pushed the valuations of these projects down across the board. In 2024, only a few concepts, such as Memecoins, CEX, and DePIN, have managed to maintain positive returns.

MC/FDV of Newly-launched tokens
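As a quick illustration of the circulating-ratio math behind the chart above, the sketch below computes MC/FDV directly from two rows of the Layer2 table; the helper function is purely illustrative.

```python
def circulating_ratio(circulating_mcap: float, fdv: float) -> float:
    """MC/FDV: the share of a token's fully diluted value that is already circulating."""
    return circulating_mcap / fdv

# Figures taken from the Layer2 table above, in $ billion.
print(f"Starknet MC/FDV: {circulating_ratio(0.93, 7.17):.1%}")  # ~13%
print(f"Blast MC/FDV:    {circulating_ratio(0.48, 2.81):.1%}")  # ~17%
```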
However, from my perspective, the valuation collapse of high-market-cap VC tokens is a normal market response to various crypto anomalies:
- Duplicate creation of Rollups, leading to a ghost-town phenomenon in which high-TVL projects are filled with bots but lack authentic users
- Raising funds by rebranding terms while providing essentially similar solutions, especially among cross-chain communication services
- Launching projects based on trends rather than actual user needs, such as numerous AI+Web3 projects
- Projects failing to find profitable models, with tokens that have no ability to capture value
The overall decline in the valuation of these altcoins is a result of the market's self-correction. It is a healthy process of the bubble bursting, where funds vote with their feet, leading to a market clearing and self-rescue.
The reality is that most VC coins are not entirely without value; they were simply overpriced, and the market has finally adjusted them to their rightful positions.
The Right Time to Focus on DeFi: PMF Products Emerging from the Bubble
Since 2020, DeFi has officially become a category within the altcoin ecosystem. In the first half of 2021, the Top 100 crypto market cap rankings were dominated by DeFi projects, with a dizzying array of subcategories, all aiming to replicate every existing business model in traditional finance on the blockchain. 
During that year, DeFi was the fundamental module of public chains, and DEXs, lending platforms, stablecoins, and derivatives became the essential components for any new public chain.
However, with the over-issuance of similar projects, numerous hacks and insider exploits, and the rapid collapse of TVL built on Ponzi-like incentives, the once-soaring token prices spiraled down toward zero.
As we enter the current bull market cycle, the price performance of most surviving DeFi projects has been unsatisfactory, and primary-market investment in the DeFi sector has dwindled. As is typical at the start of any bull market, investors are most attracted to the new narratives emerging in the cycle, and DeFi does not fit that profile.
However, it is precisely for this reason that DeFi projects, surviving from the bubble, are beginning to look more attractive than other altcoin projects. Specifically:
Business Overview: They have mature business models and profit models, and leading projects have competitive edges
DEXs and derivatives earn trading fees, lending platforms generate income from interest spreads, stablecoins collect stability fees, and staking projects charge fees for their services. These sectors have clear profit models. Leading projects in each sector have organic user demand, have largely moved past the subsidy phase, and some have achieved positive cash flow even after accounting for token emissions.

Rankings of Profitable Crypto Projects

According to statistics from Token Terminal, as of 2024, 12 of the top 20 most profitable protocols are DeFi projects. By category, they include:
- Stablecoins: MakerDAO, Ethena
- Lending: Aave, Venus
- Staking Services: Lido
- DEX: Uniswap Labs, PancakeSwap, Thena (earning from trading fees)
- Derivatives: dYdX, Synthetix, MUX
- Yield Aggregators: Convex Finance
These projects have various competitive advantages, which can derive from multi-sided or bilateral network effects, user habits and brand recognition, or unique ecosystem resources. However, the leading DeFi projects in their respective sectors share some common traits: stabilizing market share, fewer later competitors, and service pricing power.
We will analyze these DeFi projects in detail later.
Supply Side: Low Emissions, High Circulation Ratio, Minimal Token Unlocking
In the previous section, we noted that one of the main reasons for the continuous decline in the current cycle of altcoin valuation is the high emissions from many projects based on inflated valuations, coupled with the negative expectations from the large-scale unlocking of tokens.
In contrast, leading DeFi projects, due to their earlier launch dates, have mostly passed their peak token emission periods, and institutional tokens have largely been released, resulting in minimal future selling pressure. For example, Aave currently has a token circulation ratio of 91%, Lido’s is 89%, Uniswap’s is 75.3%, MakerDAO’s is 95%, and Convex’s is 81.9%.
This is partly an indication of low future dumping pressure, but it also means that whoever wants to gain control of these projects will have to buy tokens from the market.
Valuation Analysis: Divergence Between Market Attention and Business Metrics, Valuation Levels at Historical Lows
Compared to new concepts like Meme, AI, DePIN, Restaking, and Rollup services, DeFi has gained very little attention in this bull market, and its price performance has been mediocre. However, the core business metrics of leading DeFi projects, such as trading volume, lending scale, and profit levels, have continued to grow. This divergence between price and business metrics has resulted in the valuation levels of some leading DeFi projects reaching historical lows.
Take the lending protocol Aave as an example. While its quarterly revenue (referring to net income, not total protocol fees) has surpassed the highs of the last cycle and hit all-time highs, its PS ratio (circulating market cap / annualized revenue) has fallen to an all-time low and currently stands at just 17.4x.

Tokenterminal

Regulation: The FIT21 Act is favorable for DeFi compliance and may trigger potential M&A
FIT21, the Financial Innovation and Technology for the 21st Century Act, aims to establish a clear federal regulatory framework for the digital asset market, enhance consumer protection, and promote U.S. leadership in the global digital asset market. Proposed in May 2023 and passed by a wide margin in the House of Representatives on May 22 of this year, the Act clarifies the regulatory framework and rules for market participants. Once officially enacted, it will make it easier for both startups and traditional financial institutions to invest in DeFi projects. Given that traditional financial institutions such as BlackRock have recently embraced crypto assets, for example by promoting ETF listings and issuing bond assets on Ethereum, DeFi is likely to be a major focus area for them in the coming years. For traditional financial giants, mergers and acquisitions could be one of the most convenient options, and any sign of interest, even a mere acquisition intention, could trigger a revaluation of a leading DeFi project.
I will analyze the business conditions, competitive advantages, and valuations of selected DeFi projects as examples. 
Given the multitude of DeFi projects, I will prioritize those with better business development, significant competitive advantages, and more attractive valuations.
Notable DeFi Projects
Lending Protocol: Aave
Aave stands out as one of the most established DeFi projects, having secured its initial funding round in 2017. Post-funding, Aave evolved from a peer-to-peer lending framework (formerly known as Lend) to a sophisticated peer-to-pool model. During the previous bull market, Aave outperformed Compound, a major competitor in its space, and now leads lending protocols in both market share and market cap.
The core business model of Aave revolves around capturing the spread between borrowing and lending rates. In 2023, Aave introduced its stablecoin, $GHO, which is designed to bolster its interest income. Managing GHO, however, means additional operational costs including marketing and liquidity incentives.
Business Overview
For lending protocols, the active loan volume is a pivotal indicator, serving as the primary revenue stream for such projects. 
The chart below illustrates Aave’s market share in terms of active loan sizes over the last year. Aave’s proportion of active loans has been on an upward trajectory for the past six months, now claiming a significant 61.1% market share. It’s important to note that this percentage might be understated. The graph inadvertently includes a double count of the loan volumes attributed to Morpho’s optimizers which operate on both Aave and Compound.

Tokenterminal

Another vital metric is the protocol’s profitability or its profit margins. Here, profits are calculated as the protocol’s revenue minus token incentives. As illustrated in the subsequent graph, Aave’s protocol profitability has distanced itself from other lending protocols. Aave effectively abandoned the Ponzi model of relying on token subsidies to fuel operations, a strategy still employed by others such as Radiant (indicated by the purple segment in the chart).

Tokenterminal
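To make the two metrics above concrete, here is a deliberately simplified sketch of a lending protocol's economics: revenue approximated as active loans times the borrow rate times the protocol's reserve-factor cut, and profit as that revenue minus token incentives. The rates, reserve factor, and incentive figures below are hypothetical placeholders, not Aave's actual parameters.

```python
def protocol_revenue(active_loans: float, borrow_rate: float, reserve_factor: float) -> float:
    """Annualized revenue kept by the protocol: interest paid by borrowers times the protocol's cut."""
    return active_loans * borrow_rate * reserve_factor

def protocol_profit(revenue: float, token_incentives: float) -> float:
    """Profit as used in the chart above: protocol revenue minus token incentives paid out."""
    return revenue - token_incentives

# Hypothetical placeholder figures, not Aave's actual parameters.
revenue = protocol_revenue(active_loans=6e9, borrow_rate=0.08, reserve_factor=0.15)  # ~$72M per year
print(f"Revenue: ${revenue / 1e6:.0f}M")
print(f"Profit with modest incentives: ${protocol_profit(revenue, 5e6) / 1e6:.0f}M")   # ~$67M
print(f"Profit with heavy incentives:  ${protocol_profit(revenue, 90e6) / 1e6:.0f}M")  # negative: the subsidy-driven pattern
```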

Competitive Advantages
Aave has four foundational strengths:
1. Ongoing Accumulation of Security Credit: Many new lending protocols face security breaches within their first year of operation. However, Aave has maintained a flawless record with no incidents at the smart contract level since its inception. This track record of safe and stable operations is a crucial factor for DeFi users when selecting a lending platform, particularly for high-volume investors or “whales.” Justin Sun, for instance, is a notable long-term user of Aave.
2. Bilateral Network Effect: Similar to many online platforms, DeFi lending operates as a classic bilateral market where depositors and borrowers form the respective sides of supply and demand. An increase in activity on one side—whether in deposits or loans—naturally boosts activity on the other, setting a high barrier for new entrants. Moreover, the greater the overall liquidity of the platform, the smoother the transitions for both deposits and withdrawals become, and the more likely to be favored by whales, which in turn stimulates the growth of the platform business.
3. Exceptional DAO Governance: Aave has transitioned to a fully decentralized DAO-based governance system, providing greater transparency and deeper community engagement in decision-making compared to centralized management. The Aave DAO boasts a vibrant ecosystem of governance participants, including top venture capitalists, university blockchain clubs, market makers, risk management firms, third-party developers, and financial advisors. These participants are diverse and actively engaged in governance. Judging by the project's operational results, Aave, despite being a latecomer to the lending market, has balanced growth and security effectively in product development and asset expansion, ultimately overtaking the one-time industry leader Compound, and DAO governance has played a crucial role in this process.
4. Strategic Positioning in the Multi-Chain Ecosystem: Aave has established a strong presence across nearly all EVM-compatible Layer 1 and Layer 2 networks, consistently ranking at the top in terms of Total Value Locked (TVL) on each. The upcoming Aave V4 will enhance multi-chain liquidity integration, amplifying the benefits of cross-chain liquidity flows. The following chart provides more details.

In addition to EVM-compatible chains, Aave is actively assessing other networks like Solana and Aptos, considering potential deployments on these networks in the future.
Valuation Insights
As per Tokenterminal data, Aave has seen its valuation metrics dip to historical lows due to a steady increase in protocol fees and revenues, along with a persistently low token price. The Price to Sales (PS) ratio, which compares the circulating market cap with protocol revenue, stands at 17.44x, while the Price to Fees (PF) ratio, comparing market cap to protocol fees, is at 3.1x.

Tokenterminal
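Here is a minimal sketch of the two valuation ratios used throughout this piece; the market-cap, revenue, and fee inputs are placeholders chosen only to land near the Aave ratios quoted above, not Tokenterminal's exact figures.

```python
def ps_ratio(circulating_mcap: float, annualized_revenue: float) -> float:
    """Price-to-Sales: circulating market cap over annualized protocol revenue."""
    return circulating_mcap / annualized_revenue

def pf_ratio(circulating_mcap: float, annualized_fees: float) -> float:
    """Price-to-Fees: circulating market cap over annualized total fees paid by users."""
    return circulating_mcap / annualized_fees

# Placeholder inputs in $ million, for illustration only.
mcap, revenue, fees = 1_400, 80, 450
print(f"PS: {ps_ratio(mcap, revenue):.1f}x")  # ~17.5x
print(f"PF: {pf_ratio(mcap, fees):.1f}x")     # ~3.1x
```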

Risks and Challenges
Although Aave has been successfully expanding its share in the lending market, it faces emerging competition from Morpho Blue, a noteworthy modular lending platform. Morpho Blue offers a flexible suite of modular protocols to third parties aiming to establish their own lending markets. This platform allows for the customization of collaterals, borrowing assets, oracles, and risk parameters, enabling the creation of tailored lending environments.
This modular approach has enabled numerous new players to enter the lending space and begin offering their own lending services. For instance, Gauntlet, previously a risk manager for Aave, chose to sever ties with Aave in favor of launching its own lending market on Morpho Blue.

Morpho Blue

Morpho Block Analytics
Since its launch more than half a year ago, Morpho Blue has experienced rapid growth, now ranking as the fourth-largest lending platform by TVL, just behind Aave, Spark (a MakerDAO-launched copy of Aave v3), and Compound. 
Its expansion on Base has been particularly swift, achieving a TVL of $27 million in less than two months, while Aave’s TVL on Base stands at approximately $59 million.

Morpho Block Analytics
Decentralized Exchanges: Uniswap & Raydium
Uniswap and Raydium are key players within the Ethereum EVM and Solana ecosystems, respectively. Uniswap debuted on the Ethereum mainnet with its V1 version in 2018, but it was the introduction of its V2 in May 2020 that catapulted the platform to prominence. Raydium, on the other hand, made its entry on the Solana network in 2021.
The rationale behind highlighting two distinct entities in the decentralized exchanges sector is their affiliation with the two most populous Web3 ecosystems: the EVM ecosystem, centered around Ethereum—the leading public blockchain—and the fast-expanding Solana ecosystem. Both projects boast unique advantages and face specific challenges. Let’s delve into a detailed analysis of each.
Uniswap
Business Overview
Since its V2 release, Uniswap has maintained its position as the leading decentralized exchange (DEX) by trading volume across the Ethereum mainnet and other EVM-compatible chains. We focus primarily on two key metrics: trading volume and trading fees.
The chart below illustrates Uniswap's monthly share of DEX trading volume since the launch of V2 (excluding trading volume on non-EVM chains):

Tokenterminal
From the launch of V2 in May 2020, Uniswap's market share peaked at 78.4% in August 2020, then fell to a bottom of 36.8% during the fiercest phase of the DEX wars in November 2021, before rebounding to a stable 56.7%, evidencing its ability to withstand tough competition. 

Tokenterminal
This trend is also mirrored in its share of trading fees; after bottoming out at 36.7% in November 2021, Uniswap’s market share in fees has steadily climbed, now standing at 57.6%.
Remarkably, Uniswap has largely refrained from subsidizing liquidity with tokens, except for brief periods in 2020 on the Ethereum mainnet and at the end of 2022 on the Optimism mainnet. This restraint stands in stark contrast to most other DEXs, which continue to rely on liquidity incentives to this day.
The chart below illustrates the monthly token incentives of major DEXs. Sushiswap, Curve, Pancakeswap, and Aerodrome—a Base-based project adopting the ve(3,3) model—each had the largest subsidy budget at one point, yet none of them has managed to secure a higher market share than Uniswap.

Tokenterminal

One persistent critique of Uniswap is that, despite not spending on token incentives, its token still lacks utility, as the protocol has yet to activate the fee switch for $UNI.
In late February 2024, Erin Koen, governance lead at the Uniswap Foundation, submitted a proposal to upgrade the protocol so that its fee mechanism would reward $UNI holders who have staked and delegated their tokens, sparking significant community debate. Although the vote was initially planned for May 31, it has been postponed and remains pending. Despite these delays, Uniswap has taken the first steps toward enabling fees and enhancing the utility of $UNI, with the revised contract already developed and audited. The protocol is poised to gain a separate revenue stream of its own in the near future.
Additionally, Uniswap Labs began charging a swap fee in October 2023 for users trading through the official Uniswap website and the Uniswap wallet. The fee is set at 0.15% and applies to swaps involving ETH, USDC, WETH, USDT, DAI, WBTC, agEUR, GUSD, LUSD, EUROC, and XSGD; however, swaps between stablecoins and wrapping or unwrapping between ETH and WETH are exempt.
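As a rough illustration of how such an interface-fee rule can be applied, here is a simplified sketch; it is not Uniswap Labs' actual implementation, and the set of tokens treated as stablecoins below is an assumption.

# Simplified sketch of the interface-fee rule described above; not Uniswap Labs' code.
FEE_RATE = 0.0015  # 0.15%
FEE_TOKENS = {"ETH", "USDC", "WETH", "USDT", "DAI", "WBTC",
              "agEUR", "GUSD", "LUSD", "EUROC", "XSGD"}
STABLES = {"USDC", "USDT", "DAI", "agEUR", "GUSD", "LUSD", "EUROC", "XSGD"}  # assumed stablecoin set

def interface_fee(token_in: str, token_out: str, amount_in: float) -> float:
    pair = {token_in, token_out}
    if pair == {"ETH", "WETH"}:                       # wrapping/unwrapping is exempt
        return 0.0
    if token_in in STABLES and token_out in STABLES:  # stable-to-stable swaps are exempt
        return 0.0
    if pair & FEE_TOKENS:                             # fee applies when a listed token is involved
        return amount_in * FEE_RATE
    return 0.0

print(interface_fee("ETH", "USDC", 10_000))   # 15.0
print(interface_fee("USDC", "USDT", 10_000))  # 0.0 (exempt)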
Simply initiating a fee structure on Uniswap’s interface has positioned Uniswap Labs as one of the highest revenue-generating teams within the Web3 space.
With protocol-layer fees expected to be activated, and based on annualized fees from the first half of 2024, Uniswap would generate around $1.13 billion in fees per year. If the protocol takes a 10% cut, this would translate into annual protocol revenue of approximately $113 million.
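The arithmetic behind that estimate, as a minimal sketch (the 10% take rate is the scenario discussed above, not a confirmed protocol parameter):

# Back-of-the-envelope estimate using the figures above; the take rate is a scenario, not a decided parameter.
annualized_fees = 1.13e9     # ~$1.13B in fees, annualized from H1 2024
protocol_take_rate = 0.10    # hypothetical 10% of fees routed to the protocol

annual_protocol_revenue = annualized_fees * protocol_take_rate
print(f"~${annual_protocol_revenue / 1e6:.0f}M per year")  # ~$113M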
Additionally, the expected launches of Uniswap X and V4 later this year are set to potentially boost its market share in trading volumes and fees even further.
Competitive Advantages
Uniswap’s competitiveness is underpinned by three key factors:
1. Reputation Among Users: There was considerable skepticism when Uniswap first implemented interface fees last year. Many thought users would swiftly migrate their swaps to aggregators like 1inch to avoid extra fees. Contrary to these expectations, interface fee revenue continued to grow and even outpaced Uniswap’s fee growth for the entire protocol.

Tokenterminal

This data strongly indicates the power of user habit on Uniswap: many users simply do not care about the 0.15% fee and choose to stick with their familiar trading routines.
2. Bilateral Network Effect: Uniswap functions as a classic two-sided market. On one side are the traders, and on the other, the liquidity providers (LPs). The more vibrant the trading at a given platform, the more it attracts LPs to offer liquidity, creating a cycle of mutual reinforcement. The other dimension of this bilateral effect involves the traders and the teams deploying initial token liquidity. To ensure their tokens are easily discoverable and tradable, these teams often prefer to establish initial liquidity on well-known DEXs like Uniswap rather than on lesser-known, smaller platforms. This strategy not only enhances the visibility of new tokens but also reinforces trader habits to first seek out Uniswap for new investments, thereby strengthening the two-sided market dynamic between “blockchain projects” and “traders.”
3. Multi-Chain Deployment: Like Aave, Uniswap has been actively expanding its presence across multiple blockchain networks. It is visible on all major EVM chains, consistently ranking among the top decentralized exchanges in terms of trading volume on those networks. 

With the upcoming introduction of Uniswap X, which will enhance support for cross-chain trading, Uniswap’s competitive edge in multi-chain liquidity is poised to grow even stronger.
Valuation Insights
The primary measure for assessing Uniswap’s value is the Price to Fees (PF) ratio, which compares its circulating market cap to its annualized fees. Presently, $UNI tokens are valued within a historically high percentile, likely influenced by market anticipation of the upcoming fee switch upgrade.

Tokenterminal

As for market valuation, Uniswap currently boasts a circulating market cap of nearly $6 billion and a fully diluted valuation of approximately $9.3 billion, indicating a reasonable valuation.
Risks and Challenges
Regulatory Risk: In April 2024, Uniswap was served a Wells Notice by the SEC, signaling potential forthcoming enforcement actions. While the progress of the FIT21 bill may eventually give DeFi projects like Uniswap a clearer and more predictable regulatory framework, the bill will take a long time to be voted on and implemented; in the meantime, the SEC lawsuit is likely to put pressure on Uniswap’s operations and token price.
Position in the DeFi Ecosystem: DEXs form the fundamental layer of liquidity. Traditionally, the entities upstream of DEXs are aggregators (e.g., 1inch, Cowswap, Paraswap), which compare prices across venues to find optimal trading routes for users. To some extent, this model limits downstream DEXs’ ability to charge for and price users’ trading flow. As the industry has developed, wallets with built-in trading functions have emerged as even higher-level infrastructure, and with the adoption of “intent-based” models, DEXs may end up serving as invisible sources of liquidity, further diminishing direct use of platforms like Uniswap in favor of a comprehensive “comparison shopping” mode. Aware of these dynamics, Uniswap is making concerted efforts to move up the stack—promoting its own wallet and launching Uniswap X to become more of an aggregator itself—in order to strengthen its strategic positioning.
Raydium
Business Overview
We will closely analyze Raydium’s trading volume and fees. Raydium has a significant advantage over Uniswap due to its early implementation of protocol fees, resulting in robust cash flows. Consequently, Raydium’s protocol revenue will also be a major focus of our analysis.
Raydium’s trading volume has surged since October 2023 on the back of the thriving Solana ecosystem, peaking in March at $47.5 billion—about 52.7% of Uniswap’s trading volume for the same month.

Flipside
In terms of market share, Raydium’s share of trading volume on Solana has risen consistently since September 2023 and now accounts for 62.8% of all trading volume within the Solana ecosystem—a dominance that even exceeds Uniswap’s within the Ethereum ecosystem.

Dune Analytics

Raydium’s impressive ascent in market share, rising from less than 10% during the slump to over 60%, can largely be attributed to the sustained memecoin hype of this bull cycle. Raydium operates two types of liquidity pools: standard AMM pools and concentrated liquidity (CLMM) pools. The standard AMM model, similar to Uniswap V2, features evenly distributed liquidity suitable for highly volatile assets. The CLMM model, akin to Uniswap V3, allows liquidity providers to set specific price ranges for their liquidity—more flexible, but also more complex.
Raydium’s competitor Orca opts for a concentrated liquidity model similar to Uniswap V3, but Raydium’s standard AMM pools prove far more convenient for memecoin projects, which need to deploy liquidity in large volumes every day. This has made Raydium the go-to liquidity venue for memecoins.
Additionally, Solana has become the leading incubator for memecoins during this bull market, witnessing the creation of thousands of new memecoins each day since November. These memecoins have become the driving force behind the thriving Solana ecosystem, significantly boosting Raydium’s business expansion.

Dune Analytics
As indicated by the chart, in December 2023, Raydium introduced 19,664 new tokens within a week, in stark contrast to 89 new tokens on Orca. Theoretically, Orca’s concentrated liquidity mechanism could emulate traditional AMMs by setting liquidity to span “the full range.” However, this method lacks the simplicity of Raydium’s standard pool model. 
This is further evidenced by Raydium’s trading data, which shows that 94.3% of its trading volume stems from standard pools, largely driven by memecoins.
Additionally, Raydium operates as a bilateral market similar to Uniswap, catering to both projects and individual users. The more retail traders there are on Raydium, the more memecoin projects are encouraged to establish their initial liquidity on the platform. This, in turn, spurs users and user-facing tools (such as the various memecoin-trading bots on Telegram) to route their trades through Raydium, creating a self-reinforcing cycle that significantly extends Raydium’s lead over Orca.
Regarding swap fees, Raydium accrued about $300 million in the first half of 2024, which is 9.3 times the fees it collected throughout all of 2023.

Flipside
Raydium’s standard AMM pools charge a swap fee of 0.25%, of which 0.22% goes to liquidity providers (LPs) and the remaining 0.03% funds buybacks of $RAY, Raydium’s native token. In the CLMM pools, the fee tier is configurable per pool at 1%, 0.25%, 0.05%, or 0.01%, with LPs receiving 84% of trading fees and the remaining 16% split between $RAY buybacks (12%) and the treasury (4%).

Flipside
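To make the split concrete, here is a minimal Python sketch of how a given amount of swap volume would be divided under the two pool types described above. The percentages come from the text; the functions are illustrative, not Raydium's actual program logic.

# Illustrative fee splits using the percentages described above; not Raydium's actual program logic.
def split_standard_amm(volume: float) -> dict:
    fee = volume * 0.0025                        # 0.25% total swap fee
    return {"lp": fee * (0.22 / 0.25),           # 0.22 of the 0.25 points go to LPs
            "ray_buyback": fee * (0.03 / 0.25)}  # 0.03 points fund $RAY buybacks

def split_clmm(volume: float, fee_tier: float = 0.0025) -> dict:
    fee = volume * fee_tier              # tier is set per pool: 1%, 0.25%, 0.05% or 0.01%
    return {"lp": fee * 0.84,            # 84% to liquidity providers
            "ray_buyback": fee * 0.12,   # 12% to $RAY buybacks
            "treasury": fee * 0.04}      # 4% to the treasury

print(split_standard_amm(1_000_000))  # fee split on $1M of volume
print(split_clmm(1_000_000))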
In the first half of 2024, Raydium used about $20.98 million of its protocol revenue to buy back $Ray tokens, an amount 10.5 times the total spent on buybacks in 2023.
Besides trading fees, Raydium charges for creating new pools: 0.4 SOL for a standard AMM pool and 0.15 SOL for a CLMM pool. Raydium collects an average of around 775 SOL per day in pool-creation fees; at the SOL price on June 30, 2024, that equates to roughly $108,000 per day. These funds are directed toward protocol development and maintenance, serving as operational income for the team, rather than being deposited into the treasury or used for $RAY buybacks.

Flipside
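And a quick back-of-the-envelope check on the pool-creation figure, assuming a SOL price of roughly $139, the approximate level implied by the June 30, 2024 numbers above:

# Rough daily revenue from pool-creation fees; the SOL price is an assumption implied by the text.
daily_creation_fees_sol = 775   # average SOL collected per day from new pool creation
sol_price_usd = 139             # assumed SOL price around June 30, 2024

daily_revenue_usd = daily_creation_fees_sol * sol_price_usd
print(f"~${daily_revenue_usd:,.0f} per day")  # ~$108,000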

Similar to most decentralized exchanges, Raydium still offers incentives for liquidity provision. While there is no aggregated data on the size of these incentives, one can estimate the current value of incentives flowing to active liquidity pools from the data available on Raydium’s official liquidity interface.

Based on current Raydium incentives for liquidity, there is roughly $48,000 worth of incentive spending per week, primarily in Ray tokens. This expenditure is considerably lower than the protocol’s weekly revenue, which nears $800,000 (not including revenue from creating pools). This disparity underscores that Raydium is operating with a positive cash flow.
Competitive Advantages
Raydium is the DEX with the highest trading volume on Solana. Its primary strength lies in bilateral network effects: as with Uniswap, these effects are amplified by the symbiotic relationships between traders and liquidity providers, as well as between project initiators and traders, and they are especially pronounced in the memecoin segment.
Valuation Insights
Due to the lack of historical data before 2023, the valuation is based on a comparison between Raydium’s valuation data from the first half of 2024 and the full year of 2023.

With the spike in trading volume this year, despite a rise in the price of Ray tokens, Raydium’s valuation relative to last year has noticeably declined. Furthermore, when compared to other DEXs such as Uniswap, Raydium’s Price to Fees (PF) ratio is still relatively low.
Risks and Challenges
Although Raydium has demonstrated robust growth in trading volume and revenue in the past half year, its future development is fraught with uncertainties and challenges.
Ecosystem Position: Raydium struggles with its positioning within the ecosystem. In the Solana ecosystem, aggregators such as Jupiter wield greater influence, with trading volumes significantly outstripping Raydium’s: in June 2024, Jupiter’s total trading volume was $28.2 billion, compared to Raydium’s $16.8 billion. Moreover, memecoin-focused platforms like Pump.fun are gradually overtaking Raydium as the go-to launchpad, with more memecoins choosing to launch via Pump.fun instead of Raydium despite their ongoing collaboration. Pump.fun is steadily eroding Raydium’s influence among projects, and Jupiter has overtaken Raydium in capturing trader engagement. Should this trend continue unaddressed, and should top-tier entities like Pump.fun or Jupiter build their own DEXs or pivot to competitors, Raydium could face significant challenges.
Shifts in Market Trends: Before the memecoin craze swept through Solana, Orca’s share of trading volume was seven times larger than Raydium’s. This cycle, thanks to its memecoin-friendly standard pools, Raydium managed to recapture some of its lost market share. Yet the longevity of the memecoin craze on Solana remains uncertain, as does the future dominance of memecoins on the chain, and predicting these trends is challenging. Should the market’s preference for asset types shift, Raydium’s regained market share could be at risk once again.
Token Emission: The circulating ratio of $RAY is 47.2%, relatively low when benchmarked against other DeFi projects. This could mean potential downward pressure on prices as more tokens become unlocked. However, given Raydium’s robust cash flow, selling off these tokens isn’t the sole strategy available. The team could opt to burn the yet-to-be-released tokens, a move that could help mitigate worries about oversupply.
Centralization Concerns: Raydium has yet to launch a governance system driven by $RAY tokens, leaving the project’s evolution solely in the hands of the core team. This centralization may hinder the distribution of profits that should accrue to token holders—for instance, decisions on how to allocate the bought-back $RAY have yet to be clarified, leaving a critical issue pending.
Staking: Lido
Lido is the leading liquid staking protocol within the Ethereum ecosystem. The Beacon Chain’s launch at the close of 2020 marked the start of Ethereum’s transition from Proof of Work (PoW) to Proof of Stake (PoS). Initially, the absence of a withdrawal feature meant staked ETH lost its liquidity; it was not until the Shapella upgrade in April 2023 that withdrawals from the Beacon Chain were enabled, leaving early ETH stakers without liquidity for over two and a half years.
Lido pioneered the liquid staking model: users who deposit ETH into Lido receive stETH as proof of their stake. Lido bootstrapped a deep stETH-ETH liquidity pool on Curve, marking the first time users could reliably earn ETH staking rewards while retaining the flexibility to exit back to ETH at any time. This breakthrough fueled rapid growth and established Lido as the frontrunner in Ethereum staking.
Regarding its business model, Lido retains 10% of the staking revenue it generates, allocating 5% to staking service providers and managing the remaining 5% through its DAO.
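A minimal sketch of that split, using the percentages stated above (illustrative only, not Lido's contract logic; the 90% passed through to stakers is implied by the 10% fee):

# Illustrative split of staking rewards per the structure described above.
def split_staking_rewards(total_rewards_eth: float) -> dict:
    return {"stakers": total_rewards_eth * 0.90,         # remainder accrues to stETH holders
            "node_operators": total_rewards_eth * 0.05,  # staking service providers
            "dao_treasury": total_rewards_eth * 0.05}    # managed by the Lido DAO

print(split_staking_rewards(100.0))  # 100 ETH of rewards -> 90 / 5 / 5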
Business Overview
Lido’s primary business is providing ETH liquid staking services. In the past, Lido was the top liquid staking provider on the Terra network and the second-largest on Solana, and it also expanded to other chains, including Cosmos and Polygon. Lido has since prudently scaled back, choosing to concentrate exclusively on Ethereum staking. Today, Lido is the market leader in ETH staking and boasts the highest TVL among all DeFi protocols.

DeFiLlama
With the substantial stETH-ETH liquidity created by extensive $LDO incentives, and with investment support from institutions like Paradigm and Dragonfly in April 2021, Lido surpassed its main competitors—centralized exchanges such as Kraken and Coinbase—by the end of 2021, positioning itself at the forefront of the Ethereum staking landscape.

Source: Dune Analytics
However, this led to concerns about whether Lido’s prominent position could undermine Ethereum’s decentralization. The Ethereum Foundation is considering measures to cap any single entity’s staking share at 33.3% to preserve the network’s decentralized nature. After hitting a high of 32.6% in May 2022, Lido’s market share has oscillated between 28% and 32%.

The market share of ETH staking

Competitive Advantages
Lido’s competitive strength rests on two main points:
1. Enduring market leadership that has cultivated stable expectations, positioning Lido as the go-to platform for whales and institutions interested in ETH staking. Notable names such as Justin Sun, Mantle (before launching its own LST), and many large holders are among Lido’s clientele.
2. Network effects from a wide range of stETH use cases. stETH was already fully supported by leading DeFi protocols in 2022, and newer DeFi protocols have since sought ways to integrate with it, as evidenced by the traction of the LSTfi wave in 2023, Pendle, and the various LRT initiatives. This extensive adoption has solidified stETH’s role as a foundational yield-generating asset within the Ethereum network.
Valuation Insights 
Despite a modest dip in market share, Lido’s staking volume has continued to expand, propelled by an increasing overall staking rate of $ETH. Valuation-wise, Lido’s Price to Sales (PS) and Price to Fees (PF) ratios have recently hit all-time lows. 

Token Terminal
With the successful rollout of the Shapella upgrade, Lido has cemented its market position. Its profitability indicator—revenue net of token incentives—has also been impressive, with Lido generating $36.35 million in profit over the past year.

Token Terminal

This situation has led to anticipation within the community for potential adjustments to the $LDO tokenomics. However, Lido’s de facto leader, Hasu, has repeatedly expressed that the current revenue from the community treasury does not suffice to cover all of Lido DAO’s ongoing expenses over the long haul. He emphasizes that discussions on revenue distribution are premature given the financial landscape.
Risks and Challenges
Lido faces the following risks and challenges:
Competition from newcomers. Lido’s market share has been on the decline since the launch of EigenLayer. New projects equipped with significant token marketing budgets pose a threat to established leaders like Lido, particularly as Lido’s tokens are almost fully in circulation.
Members of the Ethereum community, including several from the Ethereum Foundation, have long harbored reservations about Lido’s dominant market share in staking. Vitalik Buterin has addressed these concerns directly, publishing an article that explored potential solutions, yet he refrained from endorsing any particular option. For those interested in a deeper dive, you can read our previous analysis: Evaluating Vitalik’s Proposals on Ethereum Staking.
On June 28, 2024, the SEC’s allegations against Consensys explicitly classified LSTs as securities. The minting and buying of stETH by users was characterized by the SEC as “Lido’s issuance and sale of unregistered securities,” and Consensys itself was accused of offering and selling unregistered securities by providing users with ETH staking services.
Perpetual Exchange: GMX 
GMX is a decentralized perpetual exchange that went live on Arbitrum in September 2021 and launched on Avalanche in January 2022. The platform operates as a two-sided market: on one side are traders, who can open positions with leverage of up to 100x; on the other are liquidity providers, who supply assets as trading liquidity and act as counterparties to the traders.
In terms of business model, GMX’s revenue streams primarily come from trading fees, which vary between 0.05% and 0.1%, in addition to funding and lending fees charged to traders. GMX distributes 70% of all its revenue to liquidity providers, while the remaining 30% is allocated among $GMX token stakers.
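A minimal sketch of that revenue distribution (illustrative only; actual fee rates and funding/borrowing fees vary by market and are simplified here):

# Illustrative distribution of GMX revenue per the 70/30 split described above.
def distribute_revenue(trading_fees: float, funding_and_borrow_fees: float) -> dict:
    total = trading_fees + funding_and_borrow_fees
    return {"liquidity_providers": total * 0.70,  # e.g. GLP holders
            "gmx_stakers": total * 0.30}          # $GMX stakers

# e.g. $1M of trading fees (0.05%-0.1% of volume) plus $200k of funding/borrowing fees
print(distribute_revenue(1_000_000, 200_000))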
Business Overview
The perpetual trading space is characterized by frequent new entrants—such as Aevo, Hyperliquid, Synfutures, and Drift—that often dangle prospective airdrops, as well as established platforms like dYdX, Vertex, and RabbitX that run trade-mining incentives. Given these factors, trading volume data may not fully reflect the true competitive landscape. Therefore, we will use metrics such as Total Value Locked (TVL), the Price to Sales (PS) ratio, and profit to compare GMX with its competitors.
GMX currently holds a leading position in terms of TVL. Nonetheless, other significant players—such as the established derivatives protocol dYdX, Jupiter Perp with its substantial traffic gateway on Solana, and Hyperliquid, whose token has yet to launch—are posting TVL figures comparable to GMX’s.

DeFiLlama

When considering the PS ratio, GMX stands out for its relatively low valuation in the segment of projects that have issued their tokens, focus on perpetual trading and have an average daily trading volume of over $30 million. The only competitor with a lower PS ratio is Vertex, which continues to engage heavily in trade mining incentives.

Looking at profit, GMX recorded $6.5 million over the past year, less than competitors such as dYdX, GNS, and SNX. However, this figure was significantly depressed by GMX’s decision to distribute all 12 million $ARB tokens it received during the Arbitrum STIP program (November 2023 to March 2024). These tokens were worth an estimated $18 million based on ARB’s average price during the period, which notably reduced reported profit. Even so, the trend in profit accumulation demonstrates GMX’s robust ability to generate profit.
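To gauge the scale of that impact, here is a rough adjustment under one assumption—that profit is measured as protocol revenue minus token incentives, which is how the profitability metric cited elsewhere in this report is framed:

# Rough, assumption-laden adjustment; assumes profit = revenue - token incentives.
reported_profit = 6.5e6       # GMX profit over the past year, per the text above
stip_arb_incentives = 18e6    # estimated value of the 12M ARB distributed via STIP

profit_excluding_stip = reported_profit + stip_arb_incentives
print(f"~${profit_excluding_stip / 1e6:.1f}M excluding the one-off STIP distribution")  # ~$24.5M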

Competitive Advantages
Relative to the other DeFi projects discussed above, GMX has a comparatively weaker competitive moat. The frequent emergence of new derivatives exchanges in recent years has eaten into GMX’s trading volume, and the market remains crowded. Despite these challenges, GMX maintains several key strengths:
1. Robust Support from Arbitrum: As a native project in the Arbitrum ecosystem, GMX accounted for nearly half of Arbitrum’s TVL at its peak. At that time, virtually every new DeFi project on Arbitrum was built to “cater to GLP,” which not only gave GMX significant exposure through official Arbitrum channels but also earned it a substantial amount of ARB through various incentive programs—an initial airdrop of 8 million tokens plus another 12 million from STIP. These tokens greatly enriched GMX’s treasury and provided crucial marketing resources for a project whose own tokens are already fully in circulation.
2. Established Leadership and Strong Reputation: GMX largely shaped the “Real Yield” narrative from late 2022 through early 2023, a rare bright spot in DeFi during that bearish period, and used the opportunity to build a strong brand and a dedicated, loyal user base.
3. Economies of Scale: As the leading perpetual trading platform of its kind, GMX benefits from scale effects. When the LP pool is large enough, it can absorb bigger orders and higher open interest, which in turn generates higher returns for LPs. For example, the renowned trader Andrew Kang frequently opened long and short positions worth tens of millions of dollars on GMX; at that time, GMX was almost the only on-chain venue that could accommodate orders of that size.
Valuation Insights
GMX has reached full circulation. As highlighted in our earlier cross-industry comparison, GMX presently has the lowest valuation among leading perpetual exchanges. 
In a longitudinal comparison with its own historical data, GMX’s revenue has remained consistently stable, and its PS ratio has tended to sit within a moderately low range.

Risks and Challenges
Intense Competitive Landscape: GMX faces formidable competition not only from established DeFi protocols like Synthetix and dYdX, which continue to innovate and drive activity, but also from rising newcomers such as AEVO and Hyperliquid (which has yet to issue a token), both of which have gained significant attention and trading volume over the past year. Jupiter Perp, leveraging Solana’s enormous traffic, has matched GMX’s TVL and even surpassed its trading volume with a nearly identical mechanism. GMX is now preparing to roll out its V2 on Solana, but the competitive environment remains highly intense and, unlike other DeFi tracks, lacks a settled structure. Moreover, the prevalence of trade-mining incentives lowers the barriers to user migration, generally leading to weak user loyalty.
Oracle Risk: GMX relies on oracle prices as the basis for trading and liquidation, which leaves it vulnerable to oracle attacks. In September 2022, GMX lost $560,000 to an oracle attack on AVAX. That said, the costs of executing such attacks (i.e., manipulating the CEX price of the tokens involved) are typically prohibitive relative to the potential gains. To further protect against these risks, GMX V2 has introduced measures such as isolated pools and trading-slippage adjustments.
Other Noteworthy DeFi Projects
Beyond the DeFi projects previously discussed, our research has identified other compelling projects within the space, including the well-established stablecoin MakerDAO, the rising star Ethena, and the foremost oracle solution Chainlink. Unfortunately, space constraints prevent a comprehensive presentation of these projects in this document. Moreover, each of these projects encounters its own set of challenges, including:
While MakerDAO continues to lead in the decentralized stablecoin sector and boasts a significant base of natural holders—who treat DAI similarly to how they would USDC or USDT—the size of its stablecoin has not advanced, remaining at roughly half of its previous peak in terms of market cap. Furthermore, its reliance on off-chain dollar assets for collateral is progressively eroding the decentralized trust associated with its token.
Ethena’s stablecoin, $USDe, sharply contrasts with MakerDAO’s DAI, having surged from zero to $3.6 billion within approximately six months. Despite this impressive growth, Ethena’s business model, which centers on a public fund dedicated to perpetual arbitrage, inherently faces limitations. The substantial scale-up of its stablecoin hinges on secondary market participants’ willingness to purchase its $ENA token at elevated prices, a strategy that underpins the high-yield subsidies necessary to sustain USDe expansion. This slightly Ponzi-like design becomes highly vulnerable during times of poor market sentiment, potentially leading to a downward spiral in both business and token prices. A pivotal moment for Ethena could arise if USDe can genuinely establish itself as a decentralized stablecoin embraced by a substantial base of ‘natural holders,’ thereby shifting from being a public arbitrage fund to a stablecoin operator. However, given that the underlying assets of USDe are largely tied to arbitrage positions on centralized exchanges, it faces significant hurdles related to both ‘decentralized censorship resistance’ and ‘robust institutional endorsement.’ This makes it extremely challenging for USDe to replace DAI and USDT.
After making its mark in DeFi, Chainlink is gearing up for an under-the-radar yet potentially massive narrative shift, driven by financial behemoths like BlackRock progressively embracing Web3: the integration of Real World Assets (RWA). Beyond pushing for the BTC and ETH ETFs, one of BlackRock’s standout initiatives this year was launching a tokenized U.S. Treasury fund, BUIDL, on Ethereum, which amassed over $380 million in just six weeks. Traditional financial giants will keep experimenting with on-chain financial products and will inevitably grapple with issues such as tokenizing off-chain assets and improving communication and interoperability between on-chain and off-chain systems.
Chainlink’s explorations in blockchain interoperability are quite advanced. In May of this year, Chainlink completed a “Smart NAV” pilot with the Depository Trust and Clearing Corporation (DTCC) and several major U.S. financial institutions. The pilot aims to establish a standardized process for aggregating and disseminating fund net asset value (NAV) data on private or public blockchains using Chainlink’s interoperability protocol, CCIP. Additionally, in February, asset managers Ark Invest and 21Shares announced they would validate position data by integrating Chainlink’s Proof of Reserve platform. Even so, Chainlink still struggles to tie its business value to its token: the lack of value capture and of essential use cases for $LINK raises concerns that holders may find it difficult to benefit from the growth of Chainlink’s business.
Conclusion
As with many transformative products, DeFi has traced a distinctive trajectory since its inception. It started with the narrative building of its ‘Genesis Year’ in 2020, followed by a rapid bubble in 2021, and then moved into a phase of disillusionment following the bubble’s burst in the 2022 bear market. Now, with its Product-Market Fit (PMF) robustly established, DeFi is climbing out of the trough of narrative disillusionment, and building its intrinsic value with a solid business.

I am convinced that DeFi, characterized by its mature business models and expanding market potential, is worthy of sustained focus and investment.

The Next ICP? Quilibrium Brings A New Narrative for Decentralized Computing

By Lydia Wu, Researcher at Mint Ventures

Please note: As Quilibrium’s mainnet has not yet launched, and public information is limited, the descriptions of its incentive mechanisms, tokenomics, financing, and roadmap are based on public resources and may change in the future. This article is intended for research and educational purposes and should not be taken as investment advice. We welcome any feedback.
Key Insights 
Theses
Quilibrium aims to bridge the gap between the computational capabilities of the traditional internet and the decentralized nature of blockchain, establishing a unique decentralized cloud computing architecture. This synthesis offers a balanced approach, leveraging strengths from both worlds.
Quilibrium has developed a database-driven operating system that resonates with the workflows of traditional software development, potentially attracting mainstream software developers. This system also supports Web3 developers in crafting sophisticated crypto applications, providing a versatile foundation for diverse development needs.
The design of Quilibrium’s architecture places a strong emphasis on security and privacy, appealing to enterprises seeking to adopt cryptographic technologies while protecting sensitive data. For individual users, the early success of Farcaster showcases the potential of decentralized applications to engage users and drive profitability.
Cassie Heart, the Founder and CEO of Quilibrium, is a former senior engineer at Coinbase and developer of Farcaster, leading a team recognized for its profound expertise, consistent performance, and innovative approach.
Risks 
The project is in its early stages, with its mainnet yet to be launched. Additionally, the complexity of the project means that both its technological feasibility and market demand remain untested.
In the short term, Quilibrium could encounter competitive challenges from the more established Arweave AO, particularly in terms of user perception and developer engagement.
Furthermore, the lack of clearly defined tokenomics and the possibility of fluctuations in the token release rate present significant risks for investors.
Valuation
Given that Quilibrium is in its early stages, an accurate valuation is not currently feasible. However, when examining the circulating and fully diluted market cap, Quilibrium appears competitively valued compared to other market players with similar concepts.
Business Analysis
Quilibrium defines itself as a “decentralized internet layer protocol, providing the creature comforts of Cloud without sacrificing privacy or scalability,” and as a “decentralized Platform-as-a-Service (PaaS) solution.” This section explores Quilibrium’s business model by addressing the following key questions:
What issues exist with traditional internet cloud computing? Why is there a need for new decentralized computing platforms? What makes Quilibrium unique compared to other popular blockchain architectures?

Cassie Heart on Farcaster

Business Positioning
Start with Computing
In both Web2 and Web3, “computing” is a vital concept, acting as the catalyst for the development, execution, and scaling of applications. 
In traditional Internet frameworks, computing tasks are usually handled by centralized servers. The emergence of cloud computing has improved the scalability, accessibility, and cost-efficiency of these tasks, increasingly becoming the dominant form of computing.
Cloud services offered by major providers are generally categorized into three main types: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Each category meets different user needs and capabilities, offering varying levels of resource control. While SaaS is most familiar to end-users, PaaS and IaaS are primarily targeted at developers.

Made by Lydia@MintVentures

Data source: S2 Lab, Made by Lydia@Mint Ventures

In mainstream blockchains like Ethereum, computing is handled by decentralized nodes. This approach operates without central servers, with each node executing tasks locally and ensuring data accuracy and consistency through a consensus mechanism. However, the capability and speed of decentralized computing often fall short compared to traditional cloud services. 
Quilibrium aims to balance the robust computing power and scalability of the traditional internet with the decentralization of blockchain, paving the way for new possibilities in application development.

Cassie Heart’s Live Streaming

Centralization in Computer Systems
For most end users, the centralization of computer systems is not easily perceivable, as they mainly interact with the hardware layer. PCs, smartphones, and other devices are distributed globally and run independently under individual control, giving the impression that hardware-level systems are not centralized.
However, in contrast to this hardware dispersion, computer systems are significantly more centralized at the network architecture and cloud computing service levels. In the first quarter of 2024, Amazon AWS, Microsoft Azure, and Google Cloud collectively held over 67% of the cloud service market share, far outpacing newer entrants.

Source: Synergy Research Group
Moreover, benefiting from the AI boom, cloud service providers continue to grow stronger. Microsoft Azure, as the exclusive cloud provider for OpenAI, has reversed its previous slow performance, now showing accelerated growth. In Microsoft’s fiscal third quarter of 2024 (the first calendar quarter of 2024), Azure and other cloud services saw a revenue increase of 31%, surpassing market expectations of 28.6%.

Data Source: Microsoft, Made by Lydia@MintVentures

In addition to market competition, privacy and security concerns related to centralized computing systems are increasingly in the spotlight. Each outage at major cloud service providers has a widespread impact. Between 2010 and 2019, AWS experienced 22 sudden outages, averaging 2.4 outages per year. These disruptions not only affected Amazon’s e-commerce business but also impacted the online services of companies reliant on AWS, including Robinhood, Disney, Netflix, and Nintendo.
Introduction of Decentralized Computing
In this context, the need for decentralized computing is increasingly emphasized. As centralized cloud service providers move towards distributed architectures—by replicating data and services in multiple locations to prevent single points of failure and enhancing performance through edge storage—the focus of decentralized computing narratives has shifted towards data security, privacy, scalability, and cost-efficiency. 
Let’s explore several concepts of decentralized computing proposed by various projects, all aiming to disperse data storage and processing to create a globally distributed computing platform that supports the development of decentralized applications:
World Computer: This generally refers to Ethereum, which provides a global smart contract execution environment. Its key functionalities include decentralized computing and the uniform execution of smart contracts worldwide.
Internet Computer: Typically refers to the ICP developed by the Dfinity Foundation, with its goal of enhancing the Internet’s capabilities to enable decentralized applications to operate directly on the Internet.
Hyper Parallel Computer: Generally refers to the AO protocol introduced by Arweave, a distributed computing system built on the Arweave network, noted for its high parallelism and fault tolerance.
It’s important to note that ICP, AO, and Quilibrium are not blockchains in the traditional sense: they do not rely on a linear block structure, yet they maintain fundamental principles like decentralization and data immutability and can be seen as natural extensions of blockchain technology. Although ICP has not yet fully achieved its ambitious goals, the advent of AO and Quilibrium has opened up new possibilities that could influence the future of Web3.
The table below contrasts the technical features and application focuses of the three platforms, helping readers evaluate “whether Quilibrium might follow in ICP’s footsteps” and highlighting the differences between Quilibrium, another pioneering solution in decentralized computing, and AO, often dubbed the “Ethereum killer.”

Consensus Mechanism
In traditional blockchains, the consensus mechanism functions at a fundamental and abstract level, determining how the network achieves consensus, processes and verifies transactions, and executes other activities. The selection of different consensus mechanisms influences key aspects of the network such as security, speed, scalability, and decentralization.
Quilibrium’s consensus mechanism, known as “Proof of Meaningful Work” (PoMW), mandates that miners engage in genuinely beneficial tasks for the network, including data storage, data indexing, and network maintenance. The design of the PoMW consensus mechanism integrates multiple fields such as cryptography, multi-party computation, decentralized systems, database architecture, and graph theory. It aims to lessen dependence on single resources like energy or capital, preserve the network’s decentralized nature, and uphold security and scalability as the network expands.
The incentive mechanism is crucial for the efficient functioning of the consensus mechanism. Quilibrium’s incentive allocation is dynamic, adjusting based on network conditions to align incentives with current demands. Additionally, Quilibrium introduces a multi-proof mechanism, allowing a node to verify multiple data segments, ensuring the network remains operational even with a shortage of nodes and core resources.
To aid understanding, we can use a simplified formula to calculate a miner’s ultimate earnings, where the unit reward adjusts dynamically with the network size.
Earnings = Score × Unit Reward
The score is calculated as a weighted combination of several parameters; a sketch of the formula follows the parameter definitions below.

The parameters are defined as follows:
Time in Mesh for Topic: Longer engagement and greater stability yield a higher score.
First Message Deliveries for Topic: A higher number of initial message deliveries increases the score.
Mesh Message Delivery Rate/Failures for Topic: Nodes with higher delivery rates and fewer failures receive higher scores.
Invalid Messages for Topic: Fewer instances of delivering invalid messages lead to a higher score.
The weighted sum of these four parameters is subject to a topic cap (TC), which serves to keep the score within a certain range and prevent unfair scoring due to excessively high individual parameters.
Application-Specific Score: Defined by the particular application.
IP Collocation Factor: Higher scores are awarded to nodes with fewer others sharing the same IP address.
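The original formula graphic is not reproduced here, so the following is only a hedged reconstruction based on the parameter descriptions above (which mirror a GossipSub-style peer-scoring scheme). The weights w_i, the cap TC, and the exact functional form Quilibrium uses are assumptions:

$$\text{Score} \approx \sum_{t \in \text{topics}} \min\big(TC,\; w_1 P_1 + w_2 P_2 + w_3 P_3 + w_4 P_4\big) + w_5 P_5 + w_6 P_6$$

Here P1 through P4 are the four per-topic parameters listed above, P5 is the application-specific score, P6 is the IP collocation factor, the w_i are their weights, and TC is the topic cap that bounds the per-topic contribution.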

Quilibrium Dashboard

Quilibrium currently operates over 60,000 nodes, and the actual earnings of these nodes may vary due to the differing parameter weights between versions. From version v1.4.19 onwards, miners can view their earnings in real-time, although the earnings can only be claimed once the mainnet goes live.
Network Architecture
Quilibrium’s core business is its decentralized PaaS solution, characterized by a network model that includes communication, storage, data querying and management, and operating systems. This section highlights the distinctive aspects of its design compared to other blockchains. Readers interested in technical specifics and methodologies can refer to the official documents and whitepaper.
Communication
Communication serves as the foundational component of the Quilibrium network and consists of four parts:
a. Key Generation
Quilibrium introduces a key generation mechanism called PCAS (Planted Clique Addressing Scheme), based on graph theory. Like traditional blockchain technology, PCAS utilizes asymmetric encryption: each user possesses a public key and a private key. The public key is used for encrypting messages or verifying signatures and can be made public, while the private key is used for decrypting messages or creating signatures and is kept private. The main differences lie in the methods of key generation, forms, and application scenarios.
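To make the generic asymmetric-key pattern described above concrete, here is a minimal Go sketch using the standard library’s Ed25519 signatures. It illustrates only the public/private split (sign with the private key, verify with the public key) and is not Quilibrium’s PCAS scheme, whose key-generation method differs.

```go
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"fmt"
)

func main() {
	// Generate an asymmetric key pair: the public key can be shared freely,
	// while the private key must stay secret.
	pub, priv, err := ed25519.GenerateKey(rand.Reader)
	if err != nil {
		panic(err)
	}

	// The private key signs a message...
	msg := []byte("hello from a node")
	sig := ed25519.Sign(priv, msg)

	// ...and anyone holding the public key can verify the signature.
	fmt.Println("signature valid:", ed25519.Verify(pub, msg, sig))
}
```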

b. End-to-End Encryption
End-to-end encryption (E2EE) is a vital component that ensures secure communication between nodes. With E2EE, only the parties directly involved in the communication can view the plaintext data. Systems or intermediaries that assist in message delivery are unable to read the contents.
Quilibrium utilizes an end-to-end encryption method known as Triple-Ratchet, which provides enhanced security over traditional ECDH schemes. Unlike traditional schemes that use a single static key or periodically update the key, the Triple-Ratchet protocol updates the key after each communication session. This ensures forward secrecy, post-compromise security, deniability, and replay protection, and supports out-of-order message delivery. This method is particularly suited for group communications, although it is relatively more complex and computationally costly.
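The full Triple-Ratchet construction is out of scope for a short example, but the core ratcheting idea (deriving a fresh key for every message so that a compromised key does not expose earlier traffic) can be sketched with a simple symmetric hash ratchet in Go. This is a conceptual illustration, not Quilibrium’s actual protocol, and the initial secret is hypothetical:

```go
package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"fmt"
)

// kdf derives a value from the current chain key and a one-byte label.
func kdf(chainKey []byte, label byte) []byte {
	mac := hmac.New(sha256.New, chainKey)
	mac.Write([]byte{label})
	return mac.Sum(nil)
}

// ratchet returns the next chain key and a one-time message key. Because the
// old chain key is discarded after each step, past message keys cannot be
// recomputed later -- the forward-secrecy property mentioned above.
func ratchet(chainKey []byte) (nextChain, messageKey []byte) {
	return kdf(chainKey, 0x01), kdf(chainKey, 0x02)
}

func main() {
	// Hypothetical shared secret; a real protocol establishes this with a
	// key-agreement handshake.
	chain := []byte("initial shared secret")
	for i := 1; i <= 3; i++ {
		var msgKey []byte
		chain, msgKey = ratchet(chain)
		fmt.Printf("message %d key: %x...\n", i, msgKey[:8])
	}
}
```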
c. Shuffled Lattice Routing
Mixnets function as a black box that receives messages from senders and forwards them to recipients, ensuring that external attackers, even with access to information outside the black box, cannot correlate senders with recipients.
Quilibrium utilizes Random Permutation Matrix (RPM) technology, which forms a shuffled network architecture that is complex in structure and resistant to both external and internal attacks, thereby providing enhanced anonymity, security, and scalability.
d. Peer-to-Peer Communication
GossipSub is a peer-to-peer messaging protocol that operates on a publish/subscribe model, extensively employed in blockchain technologies and decentralized applications (DApps). Quilibrium’s BlossomSub protocol enhances the traditional GossipSub framework, focusing on improving privacy safeguards, increasing resilience against Sybil attacks, and enhancing overall network performance.
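As a conceptual sketch of the publish/subscribe model itself, the Go snippet below shows an in-process topic broker. GossipSub and BlossomSub solve the much harder problem of doing this across an open peer-to-peer network (gossip propagation, peer scoring, Sybil resistance); none of that is modeled here, and all names are illustrative.

```go
package main

import "fmt"

// Broker is a toy in-process publish/subscribe hub: publishers send to a
// topic, and every subscriber of that topic receives the message.
type Broker struct {
	subs map[string][]chan string // topic -> subscriber channels
}

func NewBroker() *Broker { return &Broker{subs: make(map[string][]chan string)} }

// Subscribe returns a channel that will receive messages for the topic.
func (b *Broker) Subscribe(topic string) <-chan string {
	ch := make(chan string, 8)
	b.subs[topic] = append(b.subs[topic], ch)
	return ch
}

// Publish delivers a message to every subscriber of the topic.
func (b *Broker) Publish(topic, msg string) {
	for _, ch := range b.subs[topic] {
		ch <- msg
	}
}

func main() {
	b := NewBroker()
	sub := b.Subscribe("blocks")
	b.Publish("blocks", "new frame announced")
	fmt.Println(<-sub) // new frame announced
}
```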
Storage
Most traditional blockchains use cryptographic hash functions for foundational data integrity verification and rely on consensus mechanisms to maintain network consistency. This method presents two primary limitations:
There is no built-in verification of how long data has been stored, leaving no direct defense against timestamp- or computing-power-based attacks.
The separation between storage and consensus mechanisms can result in challenges related to data synchronization and consistency.
Quilibrium’s storage strategy employs a Verifiable Delay Function (VDF) to create a timestamp-dependent chain structure that integrates storage with consensus mechanisms. This approach has several key features:
Input processing: Utilizing hash functions like SHA256 and SHAKE128 to process inputs ensures that minor data variations result in substantial changes in hash values, enhancing resistance to tampering and facilitating easier verification.
Delay assurance: The computation is deliberately designed to be lengthy, with each task relying on the results of the previous step and unable to be hastened by increased computing power. This ensures outputs are based on consistent timestamp calculations. As the process is non-parallelizable, any attempts to recompute or alter the publicly disclosed VDF outcomes would take significant time, allowing network participants to detect and react.
Quick verification: Verifying a VDF result takes less time than generating it, requiring only a handful of mathematical tests or some supplementary data to confirm its validity.

Quilibrium Whitepaper
This timestamp-based proof chain structure operates independently of block generation in traditional blockchains, theoretically reducing the incidence of MEV attacks and front-running behaviors.
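To illustrate the non-parallelizable delay property in isolation, the Go sketch below chains SHA-256 so that each step depends on the previous output. It is a toy: real VDFs, including the construction described above, pair the delay with a succinct proof for quick verification, which this sketch does not implement, and the input and iteration count are hypothetical.

```go
package main

import (
	"crypto/sha256"
	"fmt"
)

// sequentialDelay iterates SHA-256 n times over its own output. Each step
// depends on the previous one, so adding more machines cannot make the
// computation finish sooner -- the delay-assurance property described above.
func sequentialDelay(seed []byte, n int) [32]byte {
	out := sha256.Sum256(seed)
	for i := 1; i < n; i++ {
		out = sha256.Sum256(out[:])
	}
	return out
}

func main() {
	result := sequentialDelay([]byte("timestamped network input"), 1_000_000)
	fmt.Printf("delay output: %x\n", result[:8])
	// A production VDF also emits a proof so that verification is far cheaper
	// than recomputation; this toy would require full recomputation to verify.
}
```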
Data Query and Management
Traditional blockchains primarily employ straightforward key-value storage or Merkle Trees for data management, which generally restricts their capability to represent complex relationships and facilitate advanced queries. Additionally, most existing blockchain systems lack inherent privacy safeguards during query execution by nodes, setting the stage for the development of privacy-enhancing technologies like zero-knowledge proofs.
Quilibrium has introduced “Oblivious Hypergraph”, integrating hypergraph structures with Oblivious Transfer technology, which facilitates complex querying capabilities while preserving data privacy. Specifically:
Hypergraph Structure: This allows for edges that connect multiple vertices, significantly improving the ability to depict complex relationships. It can directly map various database models, enabling the representation and querying of any type of data relationship.
Oblivious Transfer Technology: This ensures that nodes processing the data remain unaware of the actual content being accessed, thereby enhancing privacy during the querying process.
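As a rough illustration of the hypergraph idea, the Go sketch below shows a data structure whose edges can connect any number of vertices. It covers only the data shape; the Oblivious Transfer layer that keeps queries private is not modeled, and all names are hypothetical.

```go
package main

import "fmt"

// Hypergraph is a minimal illustration: unlike an ordinary graph, where an
// edge links exactly two vertices, a hyperedge can link any number of
// vertices, letting many-to-many relationships be modeled directly.
type Hypergraph struct {
	Vertices map[string]bool
	Edges    map[string][]string // hyperedge name -> member vertices
}

func NewHypergraph() *Hypergraph {
	return &Hypergraph{
		Vertices: make(map[string]bool),
		Edges:    make(map[string][]string),
	}
}

// AddEdge registers a hyperedge and all of its member vertices.
func (h *Hypergraph) AddEdge(name string, members ...string) {
	for _, v := range members {
		h.Vertices[v] = true
	}
	h.Edges[name] = members
}

func main() {
	h := NewHypergraph()
	// A single hyperedge relating three entities at once, e.g. a record
	// linking a user, an application, and a data object.
	h.AddEdge("owns-and-uses", "alice", "appX", "objectY")
	fmt.Println(h.Edges["owns-and-uses"]) // [alice appX objectY]
}
```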
Operating System
The operating system is not a native concept in the blockchain space. Most traditional blockchains focus on consensus mechanisms and data immutability, often lacking complex operating systems. For instance, while Ethereum supports smart contracts, its operating system functions are quite basic, primarily limited to transaction processing and state management.
Quilibrium has engineered an operating system that leverages its hypergraph database and has implemented regular operating system primitives such as file systems, schedulers, IPC mechanisms, message queues, and control key management. This integration directly with the database enhances the capability to develop sophisticated decentralized applications.

Quilibrium Whitepaper

Programming Languages
Quilibrium has chosen Go as its primary programming language, augmented by Rust and JavaScript. Go is favored for its robust capability in handling concurrent tasks, straightforward syntax, and vibrant developer community. According to the Tiobe index, Go has climbed significantly in popularity, reaching the 7th position in the latest June rankings. Other notable blockchain projects that utilize Go for core development include Ethereum, Polygon, and Cosmos.

Quilibrium

Tiobe
Project Overview
Development Path and Roadmap
Quilibrium released its white paper in December 2022, outlining a roadmap that is segmented into three distinct phases: Dusk, Equinox, and Event Horizon.
With the project still in its early stages, the Quilibrium development team conducts network updates and iterations bi-weekly. The network is now on version v1.4.20, with plans to skip the 1.5 phase of the roadmap and transition directly from version 1.4 to 2.0. Version 2.0, marking the mainnet release, concludes the Dusk phase and is slated for official release in late July, enabling the bridging of $QUIL tokens.
According to provisional plans, the Equinox and Event Horizon phases will facilitate more sophisticated applications, such as streaming services and AI/ML model training.
Team Members and Funding
Cassie Heart, the founder and CEO of Quilibrium, brings a robust background with over 12 years of experience in software development and blockchain technology, previously holding a position as a senior software engineer at Coinbase. 
Her disapproval of centralized social media platforms has led her to choose Farcaster as the primary platform for both her personal and Quilibrium’s official communications. On Farcaster, Cassie’s profile has attracted over 310,000 followers, including prominent individuals such as Ethereum’s founder, Vitalik Buterin. She is also an active contributor to Farcaster’s development.
The Quilibrium project, initiated in April 2023, has shown consistent progress. According to the development dashboard, the team consists of 24 developers, under Cassie Heart’s leadership.

Quilibrium

The Quilibrium team has not publicly disclosed its financial backing or the investors involved.
Tokenomics
$QUIL is the native token of Quilibrium and was distributed through a completely fair launch: all tokens are generated through node operations. According to Cassie, the team operates a small fraction of the nodes and holds less than 1% of the total tokens.
$QUIL has no fixed tokenomics and no cap on total supply. Instead, supply is dynamically adjusted based on the rate of network adoption: more tokens are released as incentives for node operation as the network expands, and the release rate is reduced accordingly if growth slows.
The following table outlines projections made by both the team and community members regarding the token emission schedule. Currently, 340 million tokens are in circulation, and the final supply is expected to converge at around 2 billion, subject to the ecosystem’s evolution.

Source: @petejcrypto
Risks
At this stage, Quilibrium faces several potential risks:
The project is in its early stages, with its mainnet yet to be launched. Additionally, the complexity of the project means that both its technological feasibility and market demand remain untested.
In the short term, Quilibrium could encounter competitive challenges from the more established Arweave AO, particularly in terms of user perception and developer engagement.
Furthermore, the lack of clearly defined tokenomics and the possibility of fluctuations in the token release rate present significant risks for investors.
Valuation
Valuing public-chain-style infrastructure is inherently complex, as it involves multiple factors such as TVL, active on-chain addresses, dApps, and the developer community. Quilibrium remains in its early stages, and the $AO token of Arweave AO is not yet open for trading, preventing us from determining an accurate valuation.
For reference, we list the circulating market value and fully diluted market value of projects with some conceptual overlap with Quilibrium (data as of June 23, 2024).

Source: CoinGecko, last updated June 23, 2024
References and Acknowledgments
This report is greatly supported by the reviews and feedback from @PleaseCallMeWhy, @ImDavidWeb3, and Connor.
Reading List
https://quilibrium.com/quilibrium.pdf
https://paragraph.xyz/@quilibrium.com
https://dashboard.quilibrium.com/
https://www.youtube.com/watch?v=Ye677-FkgXE&ab_channel=CassandraHeart
https://dune.com/cincauhangus/quilibrium
https://source.quilibrium.com/quilibrium/ceremonyclient/-/graphs/main?ref_type=heads
https://www.tiobe.com/tiobe-index/
https://www.blocmates.com/meal-deal-research-reports/quilibrium-crypto-not-blockchain-long-live-the-internet
https://www.statista.com/chart/18819/worldwide-market-share-of-leading-cloud-infrastructure-service-providers/
https://s2-labs.com/admin-tutorials/cloud-service-models/
https://medium.com/@permadao/%E5%8E%BB%E4%B8%AD%E5%BF%83%E5%8C%96%E4%BA%91%E6%9C%8D%E5%8A%A1%E8%BF%9B%E5%8C%96%E5%8F%B2-%E4%BB%8E-dfinity-ic-%E5%88%B0-arweave-ao-839b09b4f3ff
https://www.microsoft.com/en-us/investor/earnings/FY-2024-Q3/press-release-webcast
https://x.com/perma_daoCN/status/1798565157435830416
https://x.com/Pow2wer/status/1802455254065402106

Emerging Trends in Crypto AI Sector: Key Catalysts, Development Frameworks and Top Projects

By Alex XuResearch Partner at Mint Ventures

Introduction
This cycle of the crypto bull market has been the most uninspiring in terms of commercial innovation. Unlike the previous bull market, which saw phenomenal trends like DeFi, NFTs, and GameFi, this cycle lacks significant industry hotspots. Consequently, there has been sluggish growth in user base, industry investment, and developer activity.
This trend is also evident in the price of crypto assets. Over the entire cycle, most altcoins, including ETH, have consistently lost value relative to BTC. The valuation of smart contract platforms is largely driven by the prosperity of their applications. When innovation in application development stagnates, it becomes challenging for the valuation of public chains to rise.
However, artificial intelligence (AI), as a relatively new sector in the crypto business landscape, could benefit from the explosive growth and ongoing hotspots in the broader commercial world. This gives AI projects within the crypto space the potential to attract significant incremental attention.
In the IO.NET report published by Mint Ventures in April, the necessity of integrating AI with crypto was thoroughly analyzed. The advantages of crypto-economic solutions—such as determinism, efficient resource allocation, and trustlessness—could potentially address the three major challenges of AI: randomness, resource intensity, and the difficulty in distinguishing between human and machine.
In the AI sector of the crypto economy, I want to discuss and explore several critical issues in this article, including:
Emerging or potentially explosive narratives in the crypto AI sector.
The catalytic paths and logical frameworks of these narratives.
Crypto + AI projects related to these narratives.
The risks and uncertainties involved in the development of the crypto + AI sector.
Please note that this article reflects my current thinking and may evolve. The opinions here are subjective and there may be errors in facts, data, and logical reasoning. This is not financial advice, but feedback and discussions are welcomed.
The Next Wave of Narratives in the Crypto AI Sector
Before diving into the emerging trends in the crypto AI sector, let’s first examine the current leading narratives. Based on market cap, those with a valuation exceeding $1 billion include:
Computing Power
Render Network ($RNDR): circulating market cap of $3.85 billion
Akash: circulating market cap of $1.2 billion
IO.NET: recently valued at $1 billion in its latest financing round
Algorithm Networks
Bittensor ($TAO): circulating market cap of $2.97 billion
AI Agents
Fetch.ai ($FET): pre-merger circulating market cap of $2.1 billion
*Data updated: May 24, 2024.
Beyond the fields mentioned above, which AI sector will produce the next project with a market cap exceeding $1 billion?
I believe this can be speculated from two perspectives: the “industrial supply side” narrative and the “GPT moment” narrative.
Examining the Opportunities in the Energy and Data Fields from the Perspective of the Industrial Supply Side
From the perspective of the industrial supply side, four key driving forces behind AI development are:
Algorithms: High-quality algorithms can execute training and inference tasks more efficiently.
Computing Power: Both model training and inference demand substantial computing power provided by GPU hardware. This requirement is a major industrial bottleneck, with the current chip shortage driving up prices for mid-to-high-end chips.
Energy: AI data centers consume significant amounts of energy. Beyond the electricity GPUs need to perform computational tasks, substantial energy is also needed to cool them; in large data centers, cooling systems alone account for about 40% of total energy consumption.
Data: Enhancing the performance of large models requires expanding training parameters, leading to a massive demand for high-quality data.
Regarding the four industrial driving forces mentioned above, the algorithm and computing power sectors already have crypto projects with circulating market cap exceeding $1 billion. However, the energy and data sectors have yet to see projects reach similar market caps.
In fact, shortages in the supply of energy and data may soon emerge, potentially becoming the next industry hotspots and driving a surge in related projects in the crypto space.
Let’s begin with energy.
On February 29, 2024, Elon Musk remarked at the Bosch ConnectedWorld 2024 conference, “I predicted the chip shortage more than a year ago, and the next shortage will be electricity. I think next year you will see that they just can’t find enough electricity to run all the chips.”
Consider the data. The Stanford University Institute for Human-Centered Artificial Intelligence, led by Fei-Fei Li, publishes the “AI Index Report” annually. In its 2022 report covering the AI industry in 2021, the research group estimated that AI accounted for only 0.9% of global electricity demand that year, putting limited pressure on energy and the environment. In 2023, however, the International Energy Agency (IEA) reported that global data centers consumed approximately 460 terawatt-hours (TWh) of electricity in 2022, about 2% of global electricity demand, and predicted that by 2026 global data center energy consumption will reach at least 620 TWh and potentially as much as 1,050 TWh.
In reality, the International Energy Agency’s estimates remain conservative, as numerous AI projects poised to launch will demand significantly more energy than anticipated in 2023. 
For example, Microsoft and OpenAI are planning the Stargate project. This ambitious initiative is set to commence in 2028 and be completed around 2030. The project aims to build a supercomputer equipped with millions of dedicated AI chips, providing OpenAI with unprecedented computing power to advance its research in artificial intelligence, particularly large language models. The estimated cost of this project exceeds $100 billion, which is 100 times the cost of current large data centers.
The energy consumption for the Stargate project alone is expected to reach 50 TWh.
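To put these figures in perspective, below is a rough back-of-the-envelope sketch in Python that uses only the numbers cited above. The implied global demand is derived from the “460 TWh is roughly 2%” figure and is held flat through 2026 purely for illustration, so the outputs should be read as orders of magnitude rather than forecasts.

# Back-of-the-envelope sketch using the figures cited in this article.
# Assumption: global electricity demand is inferred from "460 TWh = ~2%"
# and held flat through 2026 purely for illustration.

dc_2022_twh = 460                      # IEA estimate for global data centers, 2022
share_2022 = 0.02                      # cited share of global electricity demand
dc_2026_low, dc_2026_high = 620, 1050  # IEA projection range for 2026
stargate_twh = 50                      # expected consumption of Stargate alone

global_demand_twh = dc_2022_twh / share_2022  # ~23,000 TWh implied

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

print(f"Implied global electricity demand: {global_demand_twh:,.0f} TWh")
print(f"2022-2026 CAGR, low case:  {cagr(dc_2022_twh, dc_2026_low, 4):.1%}")
print(f"2022-2026 CAGR, high case: {cagr(dc_2022_twh, dc_2026_high, 4):.1%}")
print(f"2026 share of flat global demand: {dc_2026_low / global_demand_twh:.1%} "
      f"to {dc_2026_high / global_demand_twh:.1%}")
print(f"Stargate alone vs. 2022 data-center total: {stargate_twh / dc_2022_twh:.1%}")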
Against this backdrop, OpenAI founder Sam Altman stated at the Davos Forum this January: “Future artificial intelligence will require energy breakthroughs, as the electricity consumed by AI will far exceed expectations.”
Following computing power and energy, the next major shortage in the rapidly growing AI industry is likely to be data.
In fact, the shortage of high-quality data necessary for AI has already become a reality.
Through the ongoing evolution of GPT, we have largely grasped the pattern of enhancing the capabilities of large language models—by expanding model parameters and training data, the capabilities of these models can be exponentially increased. This process shows no immediate technical bottleneck.
However, high-quality and publicly available data are likely to become increasingly scarce in the future. AI products may face supply-demand conflicts similar to those experienced with chips and energy.
First, disputes over data ownership are on the rise.
On December 27, 2023, The New York Times filed a lawsuit against OpenAI and Microsoft in the U.S. District Court, alleging that they used millions of its articles without permission to train the GPT model. The New York Times is seeking billions of dollars in statutory and actual damages for the “illegal copying and use of uniquely valuable works” and is demanding the destruction of all models and training data that include its copyrighted materials.
At the end of March 2024, The New York Times issued a new statement, expanding its accusations beyond OpenAI to include Google and Meta. The statement claimed that OpenAI had used a speech recognition tool called Whisper to transcribe a large number of YouTube videos into text, which was then used to train GPT-4. The New York Times argued that it has become common practice for large companies to employ underhanded tactics in training their AI models. They also pointed out that Google is engaged in similar practices, converting YouTube video content into text for their model training, essentially infringing on the rights of video content creators.
The lawsuit between The New York Times and OpenAI, dubbed the first “AI copyright case,” is unlikely to be resolved quickly due to its complexity and the profound impact it could have on the future of content and the AI industry. One potential outcome is an out-of-court settlement, with deep-pocketed Microsoft and OpenAI paying a significant amount in compensation. However, future disputes over data copyright will inevitably drive up the overall cost of high-quality data.
Furthermore, Google, as the world’s largest search engine, has been reported to be considering charging fees for its search services—not for the general public, but for AI companies.

Source: Reuters

Google’s search engine servers hold vast amounts of content—essentially, all the content that has appeared on web pages since the start of the 21st century. AI-driven search products, such as Perplexity, along with Kimi and Meta Sota developed by Chinese companies, process the data retrieved from these searches through AI and then deliver it to users. Charging AI companies to access search engine data will undoubtedly raise the cost of obtaining data.
Furthermore, AI giants are not just focusing on public data; they are also targeting non-public internal data.

Photobucket, a long-established image and video hosting website, once boasted 70 million users and nearly half of the U.S. online photo market share in the early 2000s. However, with the rise of social media, Photobucket’s user base has significantly dwindled, now standing at only 2 million active users, each paying a steep annual fee of $399. According to its user agreement and privacy policy, accounts inactive for more than a year are reclaimed, granting Photobucket the right to use the uploaded images and videos. Photobucket’s CEO, Ted Leonard, disclosed that their 1.3 billion photos and videos are extremely valuable for training generative AI models. He is currently negotiating with several tech companies to sell this data, with prices ranging from 5 cents to 1 dollar per photo and more than 1 dollar per video. Leonard estimates that Photobucket’s data could be worth over 1 billion dollars.
The research team EPOCH, which specializes in AI development trends, published a report titled “Will we run out of data? An analysis of the limits of scaling datasets in Machine Learning.” This report, based on the 2022 usage of data in machine learning and the generation of new data, while also considering the growth of computing resources, concluded that high-quality text data could be exhausted between February 2023 and 2026, and image data might run out between 2030 and 2060. Without significant improvements in data utilization efficiency or the emergence of new data sources, the current trend of large machine learning models that depend on massive datasets could slow down.
Considering the current trend of AI giants buying data at high prices, it seems that free, high-quality text data has indeed run dry, validating EPOCH’s prediction from two years ago. 
Concurrently, solutions to the “AI data shortage” are emerging, specifically AI-data-as-a-service.
Defined.ai is one such company that offers customized, high-quality real data for AI companies. 

Examples of Data Types on Defined.ai

The business model of Defined.ai works as follows: AI companies specify their data requirements, such as images of a certain resolution, free from blurriness and overexposure, and with authentic content. Companies can also request specific themes based on their training tasks, like nighttime photos of traffic cones, parking lots, and signposts to improve AI’s night-scene recognition. Members of the public can accept these tasks and upload their photos, which are then reviewed by Defined.ai. Approved submissions are paid for, typically $1-2 per high-quality image, $5-7 per short video clip, and $100-300 for a high-quality video over 10 minutes. Text is compensated at $1 per thousand words, with task completers earning about 20% of the fees. This approach to data provision could become a new crowdsourcing business akin to “data labeling.”
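As a concrete illustration of the rates quoted above, the short Python sketch below estimates what a hypothetical contributor might earn from a batch of approved submissions. The midpoint prices and the flat 20% contributor share are simplifying assumptions based on this article’s figures, not Defined.ai’s actual terms.

# Illustrative earnings estimate for a hypothetical contributor, using the
# price ranges cited above. Midpoints and a flat 20% contributor share are
# assumptions made for the sake of the example.

PRICE_MIDPOINTS = {
    "image": 1.5,         # $1-2 per approved high-quality image
    "short_video": 6.0,   # $5-7 per short video clip
    "long_video": 200.0,  # $100-300 per high-quality video over 10 minutes
    "kilo_words": 1.0,    # $1 per thousand words of text
}
CONTRIBUTOR_SHARE = 0.20  # task completers reportedly earn about 20% of the fees

def estimate_earnings(approved: dict) -> tuple[float, float]:
    """Return (gross fee billed to the AI company, contributor payout)."""
    gross = sum(PRICE_MIDPOINTS[kind] * qty for kind, qty in approved.items())
    return gross, gross * CONTRIBUTOR_SHARE

# Example batch: 200 images, 10 short clips, 1 long video, 5,000 words of text.
gross, payout = estimate_earnings(
    {"image": 200, "short_video": 10, "long_video": 1, "kilo_words": 5}
)
print(f"Gross data fee: ${gross:,.2f}, contributor payout: ${payout:,.2f}")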
Global task distribution, economic incentives, data asset pricing, circulation, and privacy protection, with everyone able to participate, sound very much like a business model suited to the Web3 paradigm.
Analyzing Crypto + AI Projects from the Perspective of Industrial Supply
The attention generated by the chip shortage has extended into the crypto industry, positioning decentralized computing power as the most popular and highest-valued AI sector to date. 
If the supply and demand conflicts in the AI industry for energy and data become acute in the next 1-2 years, what narrative-related projects are currently present in the crypto industry?
Let’s start with energy-concept projects. 
Currently, energy projects listed on major centralized exchanges (CEX) are very scarce, with Power Ledger and its native token $POWR being the sole example.
Power Ledger was established in 2017 as a blockchain-based comprehensive energy platform aimed at decentralizing energy trading. It promotes direct electricity trading among individuals and communities, supports the widespread adoption of renewable energy, and ensures transaction transparency and efficiency through smart contracts. Initially, Power Ledger operated on a consortium chain adapted from Ethereum. In the second half of 2023, Power Ledger updated its whitepaper and launched its own comprehensive public chain, based on Solana’s technical framework, to handle high-frequency microtransactions in the distributed energy market. Power Ledger’s primary business areas currently include:
Energy Trading: Enabling users to buy and sell electricity directly in a peer-to-peer way, particularly from renewable sources.
Environmental Product Trading: Facilitating the trading of carbon credits and renewable energy certificates, as well as financing based on environmental products.
Public Chain Operations: Attracting application developers to build on the Power Ledger blockchain, with transaction fees paid in $POWR tokens.
Power Ledger’s current circulating market cap is $170 million, with a fully diluted market cap of $320 million.
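To make the peer-to-peer trading idea more concrete, here is a minimal, hypothetical order-matching sketch in Python. It is not Power Ledger’s actual smart-contract mechanism or its $POWR fee logic; it only illustrates how households selling surplus power (for example, from rooftop solar) might be matched against buyers by price.

# A toy peer-to-peer electricity matching sketch. This is NOT Power Ledger's
# actual smart-contract logic; it only illustrates the general idea of matching
# household sellers of surplus power with nearby buyers.

from dataclasses import dataclass

@dataclass
class Offer:
    account: str
    kwh: float
    price_per_kwh: float  # quoted in local currency for simplicity

def match(sells: list[Offer], buys: list[Offer]) -> list[tuple[str, str, float, float]]:
    """Greedy match: cheapest sellers are filled against the highest-bidding buyers."""
    trades = []
    sells = sorted(sells, key=lambda o: o.price_per_kwh)
    buys = sorted(buys, key=lambda o: o.price_per_kwh, reverse=True)
    for buy in buys:
        need = buy.kwh
        for sell in sells:
            if need <= 0 or sell.kwh <= 0 or sell.price_per_kwh > buy.price_per_kwh:
                continue
            qty = min(need, sell.kwh)
            clearing_price = (sell.price_per_kwh + buy.price_per_kwh) / 2
            trades.append((sell.account, buy.account, qty, clearing_price))
            sell.kwh -= qty
            need -= qty
    return trades

sellers = [Offer("solar_home_A", 12.0, 0.10), Offer("solar_home_B", 5.0, 0.12)]
buyers = [Offer("household_C", 8.0, 0.14), Offer("household_D", 6.0, 0.11)]
for seller, buyer, kwh, price in match(sellers, buyers):
    print(f"{seller} -> {buyer}: {kwh:.1f} kWh at {price:.3f}/kWh")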
In comparison to energy-concept crypto projects, there is a richer variety of targets in the data sector.
Listed below are the data sector projects I am currently following, which have been listed on at least one major CEX, such as Binance, OKX, or Coinbase, arranged by fully diluted valuation (FDV) from low to high:
1. Streamr ($DATA)
Streamr’s value proposition is to build a decentralized real-time data network where users can freely trade and share data while retaining full control over their own information. Through its data marketplace, Streamr aims to enable data producers to sell data streams directly to interested consumers, eliminating the need for intermediaries, thus reducing costs and increasing efficiency.

Source: https://streamr.network/hub/projects

In real-world applications, Streamr has collaborated with DIMO, a Web3 vehicle hardware project, to collect data such as temperature and air pressure through DIMO hardware sensors installed in vehicles. This data is then transmitted as weather data streams to organizations that need it.
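Conceptually, a sensor publisher and a downstream data buyer interact with such a stream roughly as sketched below. This is a local Python mock, not Streamr’s actual SDK (real usage goes through Streamr’s client libraries and on-network stream IDs); it only illustrates the publish/subscribe data-stream model described above.

# Conceptual publish/subscribe sketch in the spirit of the DIMO -> Streamr ->
# consumer flow described above. This is a local mock, NOT Streamr's SDK.

import time
from collections import defaultdict
from typing import Callable

class MockStreamNetwork:
    def __init__(self):
        self.subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, stream_id: str, handler: Callable[[dict], None]) -> None:
        self.subscribers[stream_id].append(handler)

    def publish(self, stream_id: str, message: dict) -> None:
        for handler in self.subscribers[stream_id]:
            handler(message)

network = MockStreamNetwork()
STREAM_ID = "vehicle/weather-telemetry"  # hypothetical stream name

# A weather-data buyer subscribes to the stream.
network.subscribe(STREAM_ID, lambda msg: print("received:", msg))

# A vehicle-mounted sensor publishes a reading.
network.publish(STREAM_ID, {
    "ts": int(time.time()),
    "temperature_c": 21.4,
    "pressure_hpa": 1013.2,
    "vehicle_id": "dimo-demo-001",  # illustrative identifier
})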
Unlike other data projects, Streamr focuses more on IoT and hardware sensor data. Besides the DIMO vehicle data, other notable sources include real-time traffic data streams in Helsinki. Thanks in part to this focus, Streamr’s token, $DATA, experienced a significant surge, doubling in value in a single day during the peak of the DePIN narrative last December.
Currently, Streamr’s circulating market cap is $44 million, with a fully diluted market cap of $58 million.
2. Covalent ($CQT)
Unlike other data projects, Covalent focuses on providing blockchain data. The Covalent network reads data from blockchain nodes via RPC, processes and organizes it, and creates an efficient query database. This allows Covalent users to quickly retrieve the information they need without performing complex queries directly on blockchain nodes. Such services are referred to as “blockchain data indexing.”
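In practice, consuming such an indexing service usually means a single HTTP request instead of walking raw blocks over RPC. The sketch below shows the general shape of such a query; the endpoint path, parameters, and response fields are illustrative assumptions rather than Covalent’s exact API, so consult the official documentation before relying on them.

# Illustrative query against a blockchain data indexing API. The endpoint path,
# parameters, and response shape are assumptions for illustration; check the
# provider's official API documentation for the real interface.

import requests

API_KEY = "YOUR_API_KEY"                      # placeholder
BASE_URL = "https://api.example-indexer.com"  # hypothetical indexer endpoint

def get_token_balances(chain: str, address: str) -> list[dict]:
    """Fetch pre-indexed token balances for an address in a single request."""
    url = f"{BASE_URL}/v1/{chain}/address/{address}/balances"
    resp = requests.get(url, params={"key": API_KEY}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("items", [])

# Example: balances of a wallet on Ethereum mainnet, without touching an RPC node.
for item in get_token_balances("eth-mainnet", "0x0000000000000000000000000000000000000000"):
    print(item.get("symbol"), item.get("balance"))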
Covalent primarily serves enterprise customers, including various DeFi protocols, and many centralized crypto companies such as Consensys (the parent company of MetaMask), CoinGecko (a well-known crypto asset tracking site), Rotki (a tax tool), and Rainbow (a crypto wallet). Additionally, traditional financial industry giants like Fidelity and Ernst & Young are also among Covalent’s clients. According to Covalent’s official disclosures, the project’s revenue from data services has already surpassed that of the leading project in the same field, The Graph.
The Web3 industry, with its integrated, transparent, authentic, and real-time on-chain data, is poised to become a high-quality data source for specialized AI scenarios and specific “small AI models.” Covalent, as a data provider, has already started offering data for various AI scenarios and has introduced verifiable structured data tailored for AI applications.

Source: Solutions on Covalent

For instance, Covalent provides data for the on-chain smart trading platform SmartWhales, which uses AI to identify profitable trading patterns and addresses. Entendre Finance leverages Covalent’s structured data, processed by AI technology for real-time insights, anomaly detection, and predictive analytics.
Currently, the main application scenarios for Covalent’s on-chain data services are predominantly in the financial field. However, as Web3 products and data types continue to diversify, the use cases for on-chain data are expected to expand further.
The circulating market cap of Covalent is $150 million, with a fully diluted market cap of $235 million, offering a noticeable valuation advantage compared to The Graph, a leading project in the blockchain data indexing sector.
3. Hivemapper ($HONEY)
Among all data types, video data typically commands the highest price. Hivemapper can provide AI companies with both video and map information. Hivemapper is a decentralized global mapping project that aims to create a detailed, dynamic, and accessible map system through blockchain technology and community contributions. Participants capture map data using dashcams and add it to the open-source Hivemapper data network, earning $HONEY tokens as rewards for their contributions. To enhance network effects and reduce interaction costs, Hivemapper is built on Solana.
Hivemapper was founded in 2015 with the original vision of creating maps using drones. However, this approach proved difficult to scale, leading the company to shift towards using dashcams and smartphones to capture geographic data, thereby reducing the cost of global map creation.
Compared to street view and mapping software like Google Maps, Hivemapper leverages an incentive network and crowdsourcing model to more efficiently expand map coverage, maintain the freshness of real-world map data, and enhance video quality.
Before the surge in AI demand for data, Hivemapper’s main customers included the autonomous driving departments of automotive companies, navigation service providers, governments, insurance companies, and real estate firms. Today, Hivemapper can provide extensive road and environmental data to AI and large models through APIs. By continuously updating image and road feature data streams, AI and ML models will be better equipped to translate this data into enhanced capabilities, enabling them to perform tasks related to geographic location and visual judgment more effectively.

Source: Hivemapper Blog

Currently, the circulating market cap of $HONEY, the native token of Hivemapper, is $120 million, with a fully diluted market cap of $496 million.
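For a sense of how an AI team might consume this kind of crowdsourced map data, here is a small hypothetical sketch that pulls recent imagery metadata for a bounding box and keeps only fresh, high-quality frames for a training set. The endpoint and fields are invented for illustration and are not Hivemapper’s actual API.

# Hypothetical sketch of consuming crowdsourced map imagery for ML training.
# The endpoint, parameters, and response fields are invented for illustration
# and are NOT Hivemapper's actual API.

from datetime import datetime, timedelta, timezone
import requests

API_URL = "https://api.example-maps.com/v1/imagery"  # hypothetical endpoint

def fetch_fresh_frames(bbox: tuple[float, float, float, float],
                       max_age_days: int = 30,
                       min_quality: float = 0.8) -> list[dict]:
    """Return recent, high-quality dashcam frames inside a bounding box."""
    resp = requests.get(API_URL, params={"bbox": ",".join(map(str, bbox))}, timeout=30)
    resp.raise_for_status()
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
    frames = []
    for frame in resp.json().get("frames", []):
        # Assumes ISO-8601 timestamps that include timezone information.
        captured = datetime.fromisoformat(frame["captured_at"])
        if captured >= cutoff and frame.get("quality_score", 0) >= min_quality:
            frames.append(frame)
    return frames

# Example: a bounding box roughly covering central Helsinki (lon/lat order).
fresh = fetch_fresh_frames((24.93, 60.16, 24.96, 60.18))
print(f"{len(fresh)} fresh frames selected for the training set")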
Besides the aforementioned projects, other notable projects in the data sector include:
1. The Graph ($GRT): With a circulating market cap of $3.2 billion and a fully diluted valuation (FDV) of $3.7 billion, The Graph provides blockchain data indexing services similar to Covalent.
2. Ocean Protocol ($OCEAN): Ocean Protocol has a circulating market cap of $670 million and an FDV of $1.45 billion. The project aims to facilitate the exchange and monetization of data and data-related services through its open-source protocol. Ocean Protocol connects data consumers with data providers, ensuring trust, transparency, and traceability in data sharing. The project is set to merge with Fetch.ai and SingularityNET, with the token converting to $ASI.
The Reappearance of the GPT Moment and the Advent of General Artificial Intelligence
In my view, the “AI sector” in the crypto industry truly began in 2023, the year ChatGPT shocked the world. The rapid surge of crypto AI projects was largely driven by the “wave of enthusiasm” following the explosive growth of the AI industry.
Despite the continuous upgrades in capabilities with models like GPT-4 and GPT-4 Turbo, the impressive video creation abilities demonstrated by Sora, and the rapid development of large language models beyond OpenAI, it is undeniable that AI’s technological advancements are delivering diminishing cognitive shock to the public. People are gradually adopting AI tools, and large-scale job replacement has yet to materialize.
Will we witness another “GPT moment” in the future, a leap in development that shocks the public and makes them realize that their lives and work will be fundamentally changed?
This moment could be the arrival of general artificial intelligence (AGI).
AGI, or artificial general intelligence, refers to machines that possess human-like general cognitive abilities, capable of solving a wide range of complex problems, rather than being limited to specific tasks. AGI systems have high levels of abstract thinking, extensive background knowledge, comprehensive common-sense reasoning, causal understanding, and cross-disciplinary transfer learning abilities. AGI performs at the level of the best humans across various fields and, in terms of overall capability, completely surpasses even the most outstanding human groups.
In fact, whether depicted in science fiction novels, games, films, or through the public’s expectations following the rapid rise of GPT, society has long anticipated the emergence of AGI that surpasses human cognitive levels. In other words, GPT itself is a precursor to AGI, a harbinger of general artificial intelligence.
The reason GPT has such a profound industrial impact and psychological shock is that its deployment and performance have far exceeded public expectations. People did not anticipate that an AI system capable of passing the Turing test would arrive so quickly and with such impressive capabilities.
In fact, artificial general intelligence (AGI) may once again create a “GPT moment” within the next 1-2 years: just as people are becoming accustomed to using GPT as an assistant, they may soon discover that AI has evolved beyond merely being an assistant. It could independently tackle highly creative and challenging tasks, including solving problems that have stumped top human scientists for decades.
On April 8th of this year, Elon Musk was interviewed by Nicolai Tangen, the Chief Investment Officer of Norway’s sovereign wealth fund, and he discussed the timeline for the emergence of AGI. 
Musk stated, “If we define AGI as being smarter than the smartest humans, I think it is very likely to appear by 2025.”
According to Elon Musk’s prediction, it would take at most another year and a half for AGI to arrive. However, he added a condition: “provided that electricity and hardware can keep up.”
The benefits of AGI’s arrival are obvious.
It means that human productivity will make a significant leap forward, and many scientific problems that have stumped us for decades will be resolved. If we define “the smartest humans” as Nobel Prize winners, it means that, provided we have enough energy, computing power, and data, we could have countless tireless “Nobel laureates” working around the clock to tackle the most challenging scientific problems.
To be fair, Nobel Prize winners are not one-in-hundreds-of-millions rarities. Their abilities and intellect are often at the level of top university professors; probability and luck simply led them to choose the right direction, persist, and achieve results, while equally capable peers might have won Nobel Prizes in a parallel universe of scientific research. Unfortunately, there are still not enough top researchers involved in scientific breakthroughs, so the pace of “exploring all the correct directions in scientific research” remains very slow.
With AGI, and given sufficient energy and computing power, we could have an unlimited number of “Nobel laureate-level” AGIs conducting in-depth exploration in any potential direction for scientific breakthroughs. The speed of technological advancement would increase exponentially. This acceleration would lead to a hundredfold increase in resources we currently consider expensive and scarce over the next 10 to 20 years, such as food production, new materials, medicines, and high-quality education. The cost of acquiring these resources would decrease dramatically. We would be able to support a larger population with fewer resources, and per capita wealth would increase rapidly.

Global GDP Trend, Data Source: World Bank

This might sound somewhat sensational, so let’s consider two examples. These examples were also used in my previous research report on IO.NET:
In 2018, Nobel Laureate in Chemistry Frances Arnold said during her award ceremony, “Today we can for all practical purposes read, write, and edit any sequence of DNA, but we cannot compose it.” Fast forward five years to 2023: a team of researchers from Stanford University and Salesforce Research, an AI-focused startup, published a paper in “Nature Biotechnology.” Utilizing a large language model refined from GPT-3, they generated an entirely new catalog of 1 million proteins. Among these, they discovered two proteins with distinct structures, both endowed with antibacterial function, potentially paving the way for new bacterial resistance strategies beyond traditional antibiotics. This signifies a monumental leap in overcoming the hurdles of protein creation with AI’s assistance.
Before this, the artificial intelligence algorithm AlphaFold predicted the structures of nearly all 2.14 billion protein types on Earth within 18 months—a milestone that amplifies the achievements of structural biologists throughout history by several orders of magnitude.
The transformation is on the way, and the arrival of AGI will further accelerate this process.
However, the arrival of AGI also presents enormous challenges.
AGI will not only replace a large number of knowledge workers, but also those in physical service industries, which are currently considered to be “less impacted by AI.” As robotic technology matures and new materials lower production costs, the proportion of jobs replaced by machines and software will rapidly increase.
When this happens, two issues that once seemed very distant will quickly surface:
The employment and income challenges of a large unemployed population
How to distinguish between AI and humans in a world where AI is ubiquitous
Worldcoin and Worldchain are attempting to provide solutions by implementing a universal basic income (UBI) system to ensure basic income for the public, and using iris-based biometrics to distinguish between humans and AI.
In fact, UBI is not just a theoretical concept; it has been tested in real-world practice. Countries such as Finland and England have conducted UBI experiments, while political parties in Canada, Spain, and India are actively proposing and promoting similar initiatives.
The advantage of using a biometric identification and blockchain model for UBI distribution lies in its global reach, providing broader coverage of the population. Furthermore, the user network expanded through income distribution can support other business models, such as financial services (DeFi), social networking, and task crowdsourcing, creating synergy within the network’s commercial ecosystem.
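A minimal sketch of how such a distribution might work is shown below: each verified human is represented by an anonymous identity commitment (for example, a hash derived from a biometric or ZK verification), and the claim logic pays one allowance per identity per epoch. This is a conceptual illustration only, not Worldcoin’s or Worldchain’s actual implementation.

# Conceptual sketch of proof-of-personhood gated UBI distribution. This is NOT
# Worldcoin's actual protocol; it only illustrates "one verified human, one
# claim per epoch" using anonymous identity commitments.

import hashlib

class UbiDistributor:
    def __init__(self, allowance_per_epoch: float):
        self.allowance = allowance_per_epoch
        self.claimed: dict[int, set[str]] = {}  # epoch -> identity commitments
        self.balances: dict[str, float] = {}    # wallet -> token balance

    def claim(self, epoch: int, identity_commitment: str, wallet: str) -> bool:
        """Pay the epoch allowance at most once per verified identity."""
        seen = self.claimed.setdefault(epoch, set())
        if identity_commitment in seen:
            return False  # this identity already claimed in this epoch
        seen.add(identity_commitment)
        self.balances[wallet] = self.balances.get(wallet, 0.0) + self.allowance
        return True

def commitment_from_proof(verified_proof: bytes) -> str:
    """Stand-in for the anonymous hash a biometric/ZK verification would yield."""
    return hashlib.sha256(verified_proof).hexdigest()

ubi = UbiDistributor(allowance_per_epoch=10.0)
alice = commitment_from_proof(b"alice-demo-proof")  # illustrative only
print(ubi.claim(epoch=1, identity_commitment=alice, wallet="0xAlice"))   # True
print(ubi.claim(epoch=1, identity_commitment=alice, wallet="0xAlice2"))  # False, duplicate claim
print(ubi.claim(epoch=2, identity_commitment=alice, wallet="0xAlice"))   # True, new epoch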
One of the notable projects addressing the impact of AGI’s arrival is Worldcoin ($WLD), with a circulating market cap of $1.03 billion and a fully diluted market cap of $47.2 billion.
Risks and Uncertainties in Crypto AI Narratives
Unlike many research reports previously released by Mint Ventures, this article contains a significant degree of subjectivity in its narrative forecasting and predictions. Readers should view the content of this article as a speculative discussion rather than a forecast of the future. The narrative forecasts mentioned above face numerous uncertainties that could lead to incorrect assumptions. These risks or influencing factors include but are not limited to:
Energy Risk: Rapid Decrease in Energy Consumption Due to GPU Upgrades
Despite the surging energy demand for AI, chip manufacturers like NVIDIA are continually upgrading their hardware to deliver higher computing power with lower energy consumption. For instance, in March 2024, NVIDIA released the new generation AI computing card GB200, which integrates two B200 GPUs and one Grace CPU. Its training performance is four times that of the previous mainstream AI GPU H100, and its inference performance is seven times that of the H100, while requiring only one-quarter of the energy consumption of the H100. Nonetheless, the appetite for AI-driven power continues to grow. With the decrease in unit energy consumption and the further expansion of AI application scenarios and demand, the total energy consumption might actually increase.
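Using only the comparison figures above, the quick calculation below shows why per-chip efficiency gains do not automatically reduce total consumption: performance per unit of energy improves by an order of magnitude, yet a sufficiently large expansion of workloads can still push aggregate demand higher. The workload growth multiplier at the end is an arbitrary illustration, not a forecast.

# Quick arithmetic on the GB200 vs. H100 figures cited above.

train_speedup = 4.0  # GB200 training performance vs. H100
infer_speedup = 7.0  # GB200 inference performance vs. H100
energy_ratio = 0.25  # GB200 energy use relative to H100

train_perf_per_watt = train_speedup / energy_ratio  # 16x
infer_perf_per_watt = infer_speedup / energy_ratio  # 28x

print(f"Training performance per watt: {train_perf_per_watt:.0f}x the H100")
print(f"Inference performance per watt: {infer_perf_per_watt:.0f}x the H100")

# If total AI workload grows faster than efficiency improves, aggregate energy
# use still rises. Example: a hypothetical 25x growth in training workload.
workload_growth = 25.0
print(f"Relative total training energy vs. today: {workload_growth / train_perf_per_watt:.2f}x")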
Data Risk: Project Q* and “Self-Generated Data”
There is a rumored project within OpenAI known as “Q*,” mentioned in internal communications to employees. According to Reuters, citing insiders at OpenAI, this could represent a significant breakthrough on OpenAI’s path to achieving superintelligence or artificial general intelligence (AGI). Q* is rumored to solve previously unseen mathematical problems through abstraction and generate its own data for training large models, without needing real-world data input. If this rumor is true, the bottleneck of large AI model training being constrained by the lack of high-quality data would be eliminated.
AGI Arrival: Concerns About OpenAI
Whether AGI will truly arrive by 2025, as Musk predicts, remains uncertain, but its arrival appears to be only a matter of time. Worldcoin, as a direct beneficiary of the AGI narrative, faces its biggest risk from OpenAI, given that Worldcoin is widely regarded as the “shadow token of OpenAI.”
In the early hours of May 14, at its spring launch event, OpenAI presented comprehensive task scores for GPT-4o alongside 19 other versions of large language models. In the chart, GPT-4o’s score of 1310 appears to tower over the rest, but in terms of total score it is only 4.5% higher than second-place GPT-4 Turbo, 4.9% higher than Google’s Gemini 1.5 Pro in fourth place, and 5.1% higher than Anthropic’s Claude 3 Opus in fifth place.

Since GPT-3.5 first stunned the world, only a little over a year has passed, and OpenAI’s competitors have already closed the gap significantly (although GPT-5, which is expected to launch this year, has not yet been released). Whether OpenAI can maintain its industry-leading position is becoming increasingly uncertain. If OpenAI’s lead and dominance are diluted or even surpassed, the narrative value of Worldcoin as OpenAI’s shadow token will diminish accordingly.
In addition to Worldcoin’s iris authentication solution, more and more competitors are entering the market. For instance, the palm scan ID project Humanity Protocol has recently completed a new funding round, raising $30 million at a $1 billion valuation. LayerZero Labs has also announced that it will operate on Humanity and join its validator node network, using ZK proofs to authenticate credentials.
Conclusion
In conclusion, while I have extrapolated potential future narratives for the crypto AI sector, it is important to recognize that it differs from native crypto sectors like DeFi. It is largely a product of the AI hype spilling over into the crypto world. Many of the current projects have not yet proven their business models, and many projects are more like AI-themed memes (e.g., $RNDR resembles an NVIDIA meme, Worldcoin resembles an OpenAI meme). Readers should approach this cautiously.
A New Solana-based AI + DePIN Project: A Brief Analysis of IO.NET Ahead of Its Upcoming Token Launch
By Alex Xu, Research Partner at Mint Ventures
Introduction
In our last report, we mentioned that compared with the previous two cycles, the current cryptocurrency bull run lacks new business models and asset narratives. Artificial Intelligence (AI) is one of the novel narratives in the Web3 space this cycle. This article delves into the hot AI project of the year, IO.NET, and organizes thoughts on the following two questions:
The necessity of AI + Web3 in the commercial landscape
The necessity and challenges of deploying a decentralized computing network
I will organize key information about IO.NET, a representative project in decentralized AI computing networks, including its product design, competitive landscape, and project background. I will also speculate on the project’s valuation metrics.
The insights in The Business Logic Behind The Convergence of AI and Web3 section draw inspiration from “The Real Merge” by Michael Rinko, a research analyst at Delphi Digital. This analysis assimilates and references ideas from his work, and The Real Merge is highly recommended for further reading.
Please note that this article reflects my current thinking and may evolve. The opinions here are subjective and there may be errors in facts, data, and logical reasoning. This is not financial advice, but feedback and discussions are welcomed.
The Business Logic Behind The Convergence of AI and Web3
2023: The “Annus Mirabilis” for AI
Reflecting on the annals of human development, it’s clear that technological breakthroughs catalyze profound transformations, from daily life to industrial landscapes and the march of civilization itself. In human history, two years, 1666 and 1905, are now celebrated as “Annus Mirabilis” years in the history of science.
The year 1666 earned its title due to Isaac Newton’s cascade of scientific breakthroughs. In a single year, he pioneered the branch of physics known as optics, founded the mathematical discipline of calculus, and derived the law of gravitation, a foundational law of modern natural science. Any one of these contributions would have been foundational to the scientific development of humanity over the next century, and together they significantly accelerated the overall progress of science.
The other landmark year is 1905, when a mere 26-year-old Einstein published four papers in quick succession in “Annalen der Physik,” covering the photoelectric effect, setting the stage for quantum mechanics; Brownian motion, providing a pivotal framework for stochastic process analysis; the theory of special relativity; and the mass-energy equivalence, encapsulated in the equation E=mc^2. Looking back, each of these papers is considered to surpass the average level of Nobel Prize-winning work in physics—a distinction Einstein himself received for his work on the photoelectric effect. These contributions collectively propelled humanity several strides forward in the journey of civilization.
The year 2023, recently behind us, is poised to be celebrated as another “Miracle Year,” thanks in large part to the emergence of ChatGPT. Viewing 2023 as a “Miracle Year” in the history of human technology isn’t just about acknowledging the strides made in natural language processing and generation by ChatGPT.
It’s also about recognizing a clear pattern in the advancement of large language models—the realization that by expanding model parameters and training datasets, we can achieve exponential enhancements in model performance. Moreover, it seems boundless in the short term, assuming computing power keeps pace. This capability extends far beyond language comprehension and conversation generation; it can be widely applied across various scientific fields. Taking the application of large language models in the biological sector as an example:  In 2018, Nobel Laureate in Chemistry, Frances Arnold, said during her award ceremony, “Today we can for all practical purposes read, write, and edit any sequence of DNA, but we cannot compose it. ” Fast forward five years to 2023, a team of researchers from Stanford University and Salesforce Research, an AI-focused startup, made a publication in “Nature Biotechnology.” Utilizing a large language model refined from GPT-3, they generated an entirely new catalog of 1 million proteins. Among these, they discovered two proteins with distinct structures, both endowed with antibacterial function, potentially paving the way for new bacterial resistance strategies beyond traditional antibiotics. This signifies a monumental leap in overcoming the hurdles of protein creation with AI’s assistance.Before this, the artificial intelligence algorithm AlphaFold predicted the structures of nearly all 2.14 billion protein types on Earth within 18 months—a milestone that amplifies the achievements of structural biologists throughout history by several magnitudes. The integration of AI models promises to transform industries drastically. From the hard-tech realms of biotech, material science, and drug discovery to the cultural spheres of law and the arts, a transformative wave is set to reshape these fields, with 2023 marking the beginning of it all. It’s widely acknowledged that the past century has witnessed an exponential rise in humanity’s ability to generate wealth. The swift advancement of AI technologies is expected to accelerate this process. Global Total GDP Trend, Data Source: World Bank Group Merging AI and Crypto To grasp the inherent need for the fusion of AI and crypto, it’s insightful to look at how their distinct features complement each other. The Symbiosis of AI and Crypto Features AI is distinguished by three main qualities: Stochasticity: AI is stochastic, with its content production mechanism being a difficult-to-replicate, enigmatic black box, making its outputs inherently stochastic.Resource Intensive: AI is a resource-intensive industry, requiring significant amounts of energy, chips, and computing power.Human-like Intelligence: AI is (soon to be) capable of passing the Turing test, making it increasingly difficult to distinguish between humans and AI. ※ On October 30, 2023, researchers from the University of California, San Diego, unveiled the Turing test scores for GPT-3.5 and GPT-4.0. The latter achieved a score of 41%, narrowly missing the pass mark of 50% by just 9 percentage points, with humans scoring 63% on the same test. The essence of this Turing test lies in how many participants perceive their chat partner to be human. A score above 50% indicates that a majority believes they are interacting with a human, not a machine, thereby deeming the AI to have successfully passed the Turing test as at least half of the people could not distinguish it from a human. 
As AI paves the way for groundbreaking advancements in human productivity, it simultaneously introduces profound challenges to our society, specifically: How to verify and control the stochasticity of AI, turning it into an advantage rather than a flawHow to bridge the vast requirements for energy and computing power that AI demandsHow to distinguish between humans and AI Crypto and blockchain technology could offer the ideal solution to the challenges posed by AI, characterized by three key attributes: Determinism: Operations are based on blockchain, code, and smart contracts, with clear rules and boundaries. Inputs lead to predictable outputs, ensuring a high level of determinism.Efficient Resource Allocation: The crypto economy has fostered a vast, global, and free market, enabling swift pricing, fundraising, and transfer of resources. The presence of tokens further accelerates market supply and demand alignment, rapidly achieving critical mass through incentivization.Trustlessness: With public ledgers and open-source code, anyone can easily verify operations, creating a “trustless” system. Furthermore, Zero-Knowledge (ZK) technology further ensures that privacy is maintained during these verification processes. To demonstrate the complementarity between AI and the crypto economy, let’s delve into three examples. Example A: Overcoming Stochasticity with AI Agents Powered by the Crypto Economy AI Agents are intelligent programs designed to perform tasks on behalf of humans according to their directives, with Fetch.AI being a notable example in this domain. Imagine we task our AI agent with executing a financial operation, such as “investing $1000 in BTC.” The AI agent could face two distinct scenarios: Scenario 1: The agent is required to interact with traditional financial entities (e.g., BlackRock) to buy BTC ETFs, encountering many compatibility issues with centralized organizations, including KYC procedures, document verification, login processes, and identity authentication, all of which are notably burdensome at present. Scenario 2: When operating within the native crypto economy, the process becomes simplified. The agent could directly carry out the transaction through Uniswap or a similar trading aggregator, employing your account to sign in and confirm the order, and consequently acquiring WBTC or other variants of wrapped BTC. This procedure is efficient and streamlined. Essentially, this is the function currently served by various Trading Bots, acting as basic AI agents with a focus on trading activities. With further development and integration of AI, these bots will fulfill more intricate trading objectives. For instance, they might monitor 100 smart money addresses on the blockchain, assess their trading strategies and success rates, allocate 10% of their funds to copy their trades over a week, halt operations if the returns are unfavorable, and deduce the potential reasons for these strategies. AI thrives within blockchain systems, fundamentally because the rules of the crypto economy are explicitly defined, and the system allows for permissionlessness. Operating under clear guidelines significantly reduces the risks tied to AI’s inherent stochasticity. For example, AI’s dominance over humans in chess and video games stems from the fact that these environments are closed sandboxes with straightforward rules. Conversely, advancements in autonomous driving have been more gradual. 
The open-world challenges are more complex, and our tolerance for AI’s unpredictable problem-solving in such scenarios is markedly lower. Example B: Resource Consolidation via Token Incentives The formidable global hash network backing BTC, boasting a current total hash rate of 576.70 EH/s, outstrips the cumulative computing power might of any country’s supercomputers. This growth is propelled by simple and fair incentives within the network. BTC Hashrate Trend Moreover, DePIN projects like Mobile, are exploring token incentives to cultivate a market on both the supply side and demand side to foster network effects. The forthcoming focus of this article, IO.NET, is a platform designed to aggregate AI computing power, hoping to unlock the latent potential of AI computing power through a token model. Example C: Leveraging Open Source and ZK Proof to Differentiate Humans from AI While Protecting Privacy Worldcoin, a Web3 project co-founded by OpenAI’s Sam Altman, employs a novel approach to identity verification. Utilizing a hardware device known as Orb, it leverages human iris biometrics to produce unique and anonymous hash values via Zero-Knowledge (ZK) technology, differentiating humans from AI. In early March 2024, the Web3 art project Drip started to implement Worldcoin ID to authenticate real humans and allocate rewards. Worldcoin has recently open-sourced its iris hardware, Orb, ensuring the security and privacy of biometric data. Overall, due to the determinism of code and cryptography, the resource circulation and fundraising advantages brought by permissionless and token-based mechanisms, alongside the trustless nature based on open-source code and public ledgers, the crypto economy has become a significant potential solution for the challenges that human society faces with AI. The most immediate and commercially demanding challenge is the extreme thirst for computational resources required by AI products, primarily driven by a substantial need for chips and computing power. This is also the main reason why distributed computing power projects have led the gains during this bull market cycle in the overall AI sector. The Business Imperative for Decentralized Computing AI requires substantial computational resources, necessary for both model training and inference tasks. It has been well-documented in the training of large language models that once the scale of data parameters is substantial, these models begin to exhibit unprecedented capabilities. The exponential improvements seen from one ChatGPT generation to the next are driven by an exponential growth in computational demands for model training. Research from DeepMind and Stanford University indicates that across various large language models, when handling different tasks—be it computation, Persian question answering, or natural language understanding—the models only approximate random guessing unless the training involves significantly scaled-up model parameters (and by extension, computational loads). Any task’s performance remains nearly random until computational efforts reach 10^22 FLOPs. Beyond this critical threshold, task performance improves dramatically across any language model. *FLOPs refer to floating-point operations per second, a measure of computing performance. 
Emergent Abilities of Large Language Models Emergent Abilities of Large Language Models The principle of “achieving miracles with great effort” in computing power, both in theory and verified in practice, inspired OpenAI’s founder, Sam Altman, to propose an ambitious plan to raise $7 trillion. This fund is intended to establish a chip factory that would exceed the current capabilities of TSMC by tenfold (estimated to cost $1.5 trillion), with the remaining funds allocated for chip production and model training. In addition to the computational demands of training AI models, the inference processes also require considerable computing power, albeit less than training. This ongoing need for chips and computational resources has become a standard reality for players in the AI field. In contrast to centralized AI computing providers like Amazon Web Services, Google Cloud Platform, and Microsoft’s Azure, decentralized AI computing offers several compelling value propositions: Accessibility: Gaining access to computing chips through services like AWS, GCP, or Azure typically requires weeks, and the most popular GPU models are frequently out of stock. Additionally, consumers are usually bound by lengthy, rigid contracts with these large corporations. Distributed computing platforms, on the other hand, provide flexible hardware options with enhanced accessibility.Cost Efficiency: By leveraging idle chips and incorporating token subsidies from network protocols for chip and computing power providers, decentralized computing networks can offer computing power at reduced costs.Censorship Resistance: The supply of cutting-edge chips is currently dominated by major technology companies, and with the United States government ramping up scrutiny of AI computing services, the ability to obtain computing power in a decentralized, flexible, and unrestricted manner is increasingly becoming a clear necessity. This is a core value proposition of web3-based computing platforms. If fossil fuels were the lifeblood of the Industrial Age, then computing power may well be the lifeblood of the new digital era ushered in by AI, making the supply of computing power an infrastructure for the AI age. Similarly to how stablecoins have emerged as a vigorous derivative of fiat currency in the Web3 epoch, might the distributed computing market evolve into a burgeoning segment within the fast-expanding AI computing market? This is still an emerging market, and much remains to be seen. However, several factors could potentially drive the narrative or market adoption of decentralized computing: Persistent GPU Supply Challenges: The ongoing supply constraints for GPUs might encourage developers to explore decentralized computing platforms.Regulatory Expansion: Accessing AI computing services from major cloud platforms involves thorough KYC processes and scrutiny. This could lead to greater adoption of decentralized computing platforms, particularly in areas facing restrictions or sanctions.Token Price Incentives: Increases in token prices during bull markets could enhance the value of subsidies offered to GPU providers by platforms, attracting more vendors to the market, increasing its scale, and lowering costs for consumers. 
At the same time, the challenges faced by decentralized computing platforms are also quite evident: Technical and Engineering ChallengesProof of Work Issues: The computations in deep learning models, due to the hierarchical structure where the output of each layer is used as the input for the next, verifying the validity of computations requires executing all prior work, which is neither simple nor efficient. To tackle this, decentralized computing platforms need to either develop new algorithms or employ approximate verification techniques that offer probabilistic assurance of results, rather than absolute determinism.Parallelization Challenges: Decentralized computing platforms draw upon a diverse array of chip suppliers, each typically offering limited computing power. Completing the training or inference tasks of an AI model by a single chip supplier quickly is nearly impossible. Therefore, tasks must be decomposed and distributed using parallelization to shorten the overall completion time. This approach, however, introduces several complications, including how tasks are broken down (particularly complex deep learning tasks), data dependencies, and the extra connectivity costs between devices.Privacy Protection Issues: How can one ensure that the data and models of the client are not disclosed to the recipient of the tasks? Regulatory Compliance ChallengesDecentralized computing platforms, due to their permissionless nature in the supply and demand markets, can appeal to certain customers as a key selling point. However, as AI regulatory frameworks evolve, these platforms may increasingly become targets of governmental scrutiny. Moreover, some GPU vendors are concerned about whether their leased computing resources are being used by sanctioned businesses or individuals. In summary, the primary users of decentralized computing platforms are mostly professional developers or small to medium-sized enterprises. Unlike cryptocurrency and NFT investors, these clients prioritize the stability and continuity of the services provided by the platforms, and pricing is not necessarily their foremost concern. Decentralized computing platforms have a long journey to go before they can win widespread acceptance from this discerning user base. Next, we will delve into the details and perform an analysis of IO.NET, a new decentralized computing power project in this cycle. We will also compare it with similar projects to estimate its potential market valuation after its launch. Decentralized AI Computing Platform: IO.NET Project Overview IO.NET is a decentralized computing network that has established a two-sided market around chips. On the supply side, there are globally distributed computing powers, primarily GPUs, but also CPUs and Apple’s integrated GPUs (iGPUs). The demand side consists of AI engineers seeking to complete AI model training or inference tasks. The official IO.NET website states their vision: Our MissionPutting together one million GPUs in a DePIN – decentralized physical infrastructure network. 

A New Solana-based AI + DePIN Project: A Brief Analysis of the Upcoming Token Launch of IO.NET

By Alex Xu, Research Partner at Mint Ventures
Introduction
In our last report, we noted that, compared with the previous two cycles, the current cryptocurrency bull run lacks sufficiently influential new business models and asset narratives. Artificial Intelligence (AI) is one of the novel narratives in the Web3 space this cycle. This article delves into the year’s hottest AI project, IO.NET, and organizes my thoughts on the following two questions:
- The necessity of AI+Web3 in the commercial landscape
- The necessity and challenges of deploying a decentralized computing network
I will then organize key information about IO.NET, a representative project among decentralized AI computing networks, covering its product design, competitive landscape, and project background, and will close with some speculation on the project’s valuation metrics.
The insights in the section The Business Logic Behind The Convergence of AI and Web3 draw inspiration from “The Real Merge” by Michael Rinko, a research analyst at Delphi Digital. This analysis assimilates and references ideas from his work, which is highly recommended for further reading.
Please note that this article reflects my current thinking and may evolve. The opinions here are subjective and there may be errors in facts, data, and logical reasoning. This is not financial advice, but feedback and discussions are welcomed.
The Business Logic Behind The Convergence of AI and Web3
2023: The “Annus Mirabilis” for AI
Reflecting on the annals of human development, it’s clear that technological breakthroughs catalyze profound transformations – from daily life to the industrial landscapes and the march of civilization itself.
In human history, there are two significant years, namely 1666 and 1905, which are now celebrated as the “Annus Mirabilis” in the history of science.
The year 1666 earned its title due to Isaac Newton’s cascade of scientific breakthroughs. In a single year, he pioneered the branch of physics known as optics, founded the mathematical discipline of calculus, and derived the law of universal gravitation, a cornerstone of modern natural science. Any one of these contributions would have been enough to shape scientific development over the following century, and together they dramatically accelerated the overall progress of science.
The other landmark year is 1905, when a mere 26-year-old Einstein published four papers in quick succession in “Annalen der Physik,” covering the photoelectric effect, which set the stage for quantum mechanics; Brownian motion, which provided a pivotal framework for stochastic process analysis; the theory of special relativity; and mass-energy equivalence, encapsulated in the equation E = mc². Looking back, each of these papers is considered to surpass the average level of Nobel Prize-winning work in physics (Einstein himself received the prize for his work on the photoelectric effect). These contributions collectively propelled humanity several strides forward in the journey of civilization.
The year 2023, recently behind us, is poised to be celebrated as another “Miracle Year,” thanks in large part to the emergence of ChatGPT.
Viewing 2023 as a “Miracle Year” in the history of human technology isn’t just about acknowledging the strides ChatGPT made in natural language processing and generation. It’s also about recognizing a clear pattern in the advancement of large language models: by expanding model parameters and training datasets, we can achieve exponential enhancements in model performance, and this scaling appears to face no near-term ceiling, provided computing power keeps pace.
This capability extends far beyond language comprehension and conversation generation; it can be widely applied across various scientific fields. Taking the application of large language models in the biological sector as an example: 
- In 2018, the Nobel Laureate in Chemistry, Frances Arnold, said during her award ceremony, “Today we can for all practical purposes read, write, and edit any sequence of DNA, but we cannot compose it.” Fast forward five years to 2023: a team of researchers from Stanford University and Salesforce Research, an AI-focused startup, published a paper in Nature Biotechnology describing how, using a large language model fine-tuned from GPT-3, they generated an entirely new catalog of 1 million proteins. Among these, they discovered two proteins with distinct structures, both endowed with antibacterial function, potentially paving the way for new strategies against bacterial resistance beyond traditional antibiotics. This signifies a monumental leap in overcoming the hurdles of protein creation with AI’s assistance.
- Before this, the artificial intelligence algorithm AlphaFold predicted the structures of virtually all of the roughly 214 million proteins known to science within 18 months, a result several orders of magnitude larger than everything structural biologists had achieved throughout history.
The integration of AI models promises to transform industries drastically. From the hard-tech realms of biotech, material science, and drug discovery to the cultural spheres of law and the arts, a transformative wave is set to reshape these fields, with 2023 marking the beginning of it all.
It’s widely acknowledged that the past century has witnessed an exponential rise in humanity’s ability to generate wealth. The swift advancement of AI technologies is expected to accelerate this process.

Global Total GDP Trend, Data Source: World Bank Group

Merging AI and Crypto
To grasp the inherent need for the fusion of AI and crypto, it’s insightful to look at how their distinct features complement each other.
The Symbiosis of AI and Crypto Features
AI is distinguished by three main qualities:
- Stochasticity: AI’s content-production mechanism is a difficult-to-replicate, enigmatic black box, making its outputs inherently stochastic.
- Resource Intensive: AI is a resource-intensive industry, requiring significant amounts of energy, chips, and computing power.
- Human-like Intelligence: AI is (soon to be) capable of passing the Turing test, making it increasingly difficult to distinguish between humans and AI.
※ On October 30, 2023, researchers from the University of California, San Diego, released Turing test scores for GPT-3.5 and GPT-4.0. GPT-4.0 achieved a score of 41%, falling just 9 percentage points short of the 50% pass mark, while humans scored 63% on the same test. The essence of this Turing test lies in what share of participants perceive their chat partner to be human: a score above 50% means a majority believed they were interacting with a human rather than a machine, which is taken as passing the test, since at least half of the participants could not distinguish the AI from a human.
As AI paves the way for groundbreaking advancements in human productivity, it simultaneously introduces profound challenges to our society, specifically:
- How to verify and control the stochasticity of AI, turning it into an advantage rather than a flaw
- How to bridge the vast requirements for energy and computing power that AI demands
- How to distinguish between humans and AI
Crypto and blockchain technology could offer the ideal solution to the challenges posed by AI, characterized by three key attributes:
- Determinism: Operations are based on blockchain, code, and smart contracts, with clear rules and boundaries. Inputs lead to predictable outputs, ensuring a high level of determinism.
- Efficient Resource Allocation: The crypto economy has fostered a vast, global, and free market, enabling swift pricing, fundraising, and transfer of resources. The presence of tokens further accelerates market supply and demand alignment, rapidly achieving critical mass through incentivization.
- Trustlessness: With public ledgers and open-source code, anyone can easily verify operations, creating a “trustless” system. Furthermore, Zero-Knowledge (ZK) technology ensures that privacy is maintained during these verification processes.
To demonstrate the complementarity between AI and the crypto economy, let’s delve into three examples.
Example A: Overcoming Stochasticity with AI Agents Powered by the Crypto Economy
AI Agents are intelligent programs designed to perform tasks on behalf of humans according to their directives, with Fetch.AI being a notable example in this domain. Imagine we task our AI agent with executing a financial operation, such as “investing $1000 in BTC.” The AI agent could face two distinct scenarios:
Scenario 1: The agent is required to interact with traditional financial entities (e.g., BlackRock) to buy BTC ETFs, encountering many compatibility issues with centralized organizations, including KYC procedures, document verification, login processes, and identity authentication, all of which are notably burdensome at present.
Scenario 2: When operating within the native crypto economy, the process becomes simplified. The agent could directly carry out the transaction through Uniswap or a similar trading aggregator, employing your account to sign in and confirm the order, and consequently acquiring WBTC or other variants of wrapped BTC. This procedure is efficient and streamlined. Essentially, this is the function currently served by various Trading Bots, acting as basic AI agents with a focus on trading activities. With further development and integration of AI, these bots will fulfill more intricate trading objectives. For instance, they might monitor 100 smart money addresses on the blockchain, assess their trading strategies and success rates, allocate 10% of their funds to copy their trades over a week, halt operations if the returns are unfavorable, and deduce the potential reasons for these strategies.
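To make the copy-trading idea above concrete, here is a minimal, illustrative Python sketch of such an agent loop. It is not the API of Fetch.AI, Uniswap, or any real trading bot; `get_recent_trades`, `strategy_return`, and `execute_swap` are hypothetical stubs standing in for an on-chain indexer and a DEX aggregator, and the thresholds are arbitrary assumptions.

```python
import random
from dataclasses import dataclass

@dataclass
class Trade:
    token: str
    side: str        # "buy" or "sell"
    size_usd: float

# --- Hypothetical stubs: a real agent would query an on-chain indexer
# --- and route orders through a DEX aggregator; here they return dummy data.
def get_recent_trades(address: str) -> list[Trade]:
    """Placeholder for fetching an address's recent swaps."""
    return [Trade("WBTC", random.choice(["buy", "sell"]), random.uniform(100, 1000))]

def strategy_return(address: str) -> float:
    """Placeholder for the observed 7-day return of an address's strategy."""
    return random.uniform(-0.10, 0.20)

def execute_swap(token: str, side: str, size_usd: float) -> None:
    """Placeholder for submitting a swap on-chain."""
    print(f"{side} ${size_usd:.2f} of {token}")

def copy_trade(smart_money: list[str], funds_usd: float,
               copy_fraction: float = 0.10, stop_loss: float = -0.05) -> None:
    """Follow the best-performing addresses; skip strategies with unfavorable returns."""
    budget = funds_usd * copy_fraction                     # e.g. 10% of funds
    ranked = sorted(smart_money, key=strategy_return, reverse=True)
    for address in ranked[:10]:                            # copy the top performers
        if strategy_return(address) < stop_loss:
            continue                                       # halt copying a losing strategy
        for trade in get_recent_trades(address):
            execute_swap(trade.token, trade.side, min(trade.size_usd, budget))

if __name__ == "__main__":
    copy_trade([f"0xaddr{i}" for i in range(100)], funds_usd=10_000)
```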
AI thrives within blockchain systems fundamentally because the rules of the crypto economy are explicitly defined and participation is permissionless. Operating under clear guidelines significantly reduces the risks tied to AI’s inherent stochasticity. For example, AI’s dominance over humans in chess and video games stems from the fact that these environments are closed sandboxes with straightforward rules. Conversely, progress in autonomous driving has been more gradual: open-world challenges are more complex, and our tolerance for AI’s unpredictable problem-solving in such scenarios is markedly lower.
Example B: Resource Consolidation via Token Incentives
The formidable global hash network backing BTC, with a current total hash rate of 576.70 EH/s, exceeds the combined computing power of any single country’s supercomputers. This growth is propelled by simple and fair incentives within the network.

BTC Hashrate Trend

Moreover, DePIN projects like Mobile are exploring token incentives to cultivate both the supply and demand sides of a market and thereby foster network effects. IO.NET, the focus of the rest of this article, is a platform designed to aggregate AI computing power, hoping to unlock its latent potential through a token model.
Example C: Leveraging Open Source and ZK Proof to Differentiate Humans from AI While Protecting Privacy
Worldcoin, a Web3 project co-founded by OpenAI’s Sam Altman, employs a novel approach to identity verification. Utilizing a hardware device known as Orb, it leverages human iris biometrics to produce unique and anonymous hash values via Zero-Knowledge (ZK) technology, differentiating humans from AI. In early March 2024, the Web3 art project Drip started to implement Worldcoin ID to authenticate real humans and allocate rewards.

Worldcoin has recently open-sourced its iris hardware, Orb, ensuring the security and privacy of biometric data.

Overall, due to the determinism of code and cryptography, the resource circulation and fundraising advantages brought by permissionless and token-based mechanisms, alongside the trustless nature based on open-source code and public ledgers, the crypto economy has become a significant potential solution for the challenges that human society faces with AI.
The most immediate and commercially demanding challenge is the extreme thirst for computational resources required by AI products, primarily driven by a substantial need for chips and computing power.
This is also the main reason why distributed computing power projects have led the gains during this bull market cycle in the overall AI sector.
The Business Imperative for Decentralized Computing
AI requires substantial computational resources, necessary for both model training and inference tasks.
It has been well documented in the training of large language models that once model parameters and training data reach sufficient scale, these models begin to exhibit unprecedented capabilities. The dramatic improvements from one ChatGPT generation to the next have been driven by exponential growth in the computational demands of model training.
Research from DeepMind and Stanford University indicates that across various large language models and tasks, whether computation, Persian question answering, or natural language understanding, model performance remains close to random guessing unless model parameters (and, by extension, training compute) are scaled up significantly. Performance stays near random until training compute reaches roughly 10^22 FLOPs; beyond this critical threshold, task performance improves dramatically across the models studied.
*FLOPs here refers to the total number of floating-point operations performed during training, a measure of total compute (distinct from FLOPS, floating-point operations per second, which measures computing speed).
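As a rough, back-of-the-envelope illustration of why that threshold is so demanding (this is not from the cited research), a widely used approximation puts total training compute at about 6 FLOPs per parameter per training token; the model sizes below are hypothetical.

```python
def training_flops(params: float, tokens: float) -> float:
    """Rough rule of thumb: total training compute ≈ 6 * parameters * tokens."""
    return 6 * params * tokens

# Hypothetical model sizes, used only to show when the ~1e22 FLOPs mark is crossed.
for params, tokens in [(1e9, 2e10), (1e10, 2e11), (1e11, 2e12)]:
    c = training_flops(params, tokens)
    print(f"{params:.0e} params, {tokens:.0e} tokens -> ~{c:.1e} FLOPs "
          f"({'above' if c >= 1e22 else 'below'} the 1e22 threshold)")
```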

Emergent Abilities of Large Language Models

Emergent Abilities of Large Language Models

The principle of “achieving miracles with great effort” in computing power, both in theory and verified in practice, inspired OpenAI’s founder, Sam Altman, to propose an ambitious plan to raise $7 trillion. This fund is intended to establish a chip factory that would exceed the current capabilities of TSMC by tenfold (estimated to cost $1.5 trillion), with the remaining funds allocated for chip production and model training.
In addition to the computational demands of training AI models, the inference processes also require considerable computing power, albeit less than training. This ongoing need for chips and computational resources has become a standard reality for players in the AI field.
In contrast to centralized AI computing providers like Amazon Web Services, Google Cloud Platform, and Microsoft’s Azure, decentralized AI computing offers several compelling value propositions:
- Accessibility: Gaining access to computing chips through services like AWS, GCP, or Azure typically requires weeks, and the most popular GPU models are frequently out of stock. Additionally, consumers are usually bound by lengthy, rigid contracts with these large corporations. Distributed computing platforms, on the other hand, provide flexible hardware options with enhanced accessibility.
- Cost Efficiency: By leveraging idle chips and incorporating token subsidies from network protocols for chip and computing power providers, decentralized computing networks can offer computing power at reduced costs.
- Censorship Resistance: The supply of cutting-edge chips is currently dominated by major technology companies, and with the United States government ramping up scrutiny of AI computing services, the ability to obtain computing power in a decentralized, flexible, and unrestricted manner is increasingly becoming a clear necessity. This is a core value proposition of web3-based computing platforms.
If fossil fuels were the lifeblood of the Industrial Age, then computing power may well be the lifeblood of the new digital era ushered in by AI, making its supply a core piece of infrastructure for the AI age. Just as stablecoins have emerged as a vigorous on-chain derivative of fiat currency in the Web3 era, might the distributed computing market evolve into a burgeoning segment of the fast-expanding AI computing market?
This is still an emerging market, and much remains to be seen. However, several factors could potentially drive the narrative or market adoption of decentralized computing:
- Persistent GPU Supply Challenges: The ongoing supply constraints for GPUs might encourage developers to explore decentralized computing platforms.
- Regulatory Expansion: Accessing AI computing services from major cloud platforms involves thorough KYC processes and scrutiny. This could lead to greater adoption of decentralized computing platforms, particularly in areas facing restrictions or sanctions.
- Token Price Incentives: Increases in token prices during bull markets could enhance the value of subsidies offered to GPU providers by platforms, attracting more vendors to the market, increasing its scale, and lowering costs for consumers.
At the same time, the challenges faced by decentralized computing platforms are also quite evident:
Technical and Engineering Challenges
- Proof of Work Issues: Because computations in deep learning models are hierarchical, with the output of each layer serving as the input to the next, verifying the validity of a computation requires re-executing all prior work, which is neither simple nor efficient. To tackle this, decentralized computing platforms need to either develop new algorithms or employ approximate verification techniques that offer probabilistic assurance of results rather than absolute determinism (a minimal sketch of such spot-check verification follows this list).
- Parallelization Challenges: Decentralized computing platforms draw upon a diverse array of chip suppliers, each typically offering limited computing power. A single chip supplier can rarely complete the training or inference of an AI model quickly on its own, so tasks must be decomposed and distributed in parallel to shorten the overall completion time. This approach, however, introduces several complications, including how tasks are broken down (particularly complex deep learning tasks), data dependencies, and the extra connectivity costs between devices.
- Privacy Protection Issues: How can one ensure that the client’s data and models are not disclosed to the recipient of the tasks?
Regulatory Compliance Challenges
Decentralized computing platforms, due to their permissionless nature in the supply and demand markets, can appeal to certain customers as a key selling point. However, as AI regulatory frameworks evolve, these platforms may increasingly become targets of governmental scrutiny. Moreover, some GPU vendors are concerned about whether their leased computing resources are being used by sanctioned businesses or individuals.
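To illustrate the probabilistic verification idea mentioned under “Proof of Work Issues,” here is a minimal NumPy sketch in which a verifier recomputes only a few randomly chosen layers of the work a supplier claims to have executed. The layer shapes, tolerance, and number of checks are arbitrary assumptions, and a real scheme would also need commitments to weights and inputs, sampling rules, and dispute resolution.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer_forward(x: np.ndarray, w: np.ndarray) -> np.ndarray:
    """One dense layer with a ReLU, standing in for an arbitrary model layer."""
    return np.maximum(x @ w, 0.0)

# Hypothetical 8-layer model; in practice the weights would be fixed by a commitment.
weights = [rng.standard_normal((64, 64)) for _ in range(8)]

def worker_run(x: np.ndarray) -> list[np.ndarray]:
    """The worker executes all layers and submits every intermediate activation."""
    acts = [x]
    for w in weights:
        acts.append(layer_forward(acts[-1], w))
    return acts

def spot_check(acts: list[np.ndarray], num_checks: int = 2, tol: float = 1e-6) -> bool:
    """The verifier recomputes only a few randomly chosen layers instead of all of them."""
    for i in rng.choice(len(weights), size=num_checks, replace=False):
        expected = layer_forward(acts[i], weights[i])
        if not np.allclose(expected, acts[i + 1], atol=tol):
            return False          # mismatch: reject, or escalate to a full recompute
    return True                   # passes with probabilistic, not absolute, assurance

if __name__ == "__main__":
    activations = worker_run(rng.standard_normal((4, 64)))
    print("verification passed:", spot_check(activations))
```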
In summary, the primary users of decentralized computing platforms are professional developers or small and medium-sized enterprises. Unlike cryptocurrency and NFT investors, these clients prioritize the stability and continuity of the service, and price is not necessarily their foremost concern. Decentralized computing platforms still have a long way to go before they can win widespread acceptance from this discerning user base.
Next, we will delve into the details and perform an analysis of IO.NET, a new decentralized computing power project in this cycle. We will also compare it with similar projects to estimate its potential market valuation after its launch.
Decentralized AI Computing Platform: IO.NET
Project Overview
IO.NET is a decentralized computing network that has established a two-sided market around chips. On the supply side, there are globally distributed computing powers, primarily GPUs, but also CPUs and Apple’s integrated GPUs (iGPUs). The demand side consists of AI engineers seeking to complete AI model training or inference tasks.
The official IO.NET website states their vision:
Our Mission: Putting together one million GPUs in a DePIN – a decentralized physical infrastructure network.
Compared to traditional cloud AI computing services, this platform highlights several key advantages:
- Flexible Configuration: AI engineers have the freedom to select and assemble the necessary chips into a “cluster” tailored to their specific computing tasks.
- Rapid Deployment: Unlike the lengthy approval and wait times associated with centralized providers such as AWS, deployment on this platform can be completed in just seconds, allowing for immediate task commencement.
- Cost Efficiency: The service costs are up to 90% lower than those offered by mainstream providers.
Furthermore, IO.NET plans to launch additional services in the future, such as an AI model store.
Product Mechanism and Business Metrics
Product Mechanisms and Deployment Experience
Similar to major platforms like Amazon Cloud, Google Cloud, and Alibaba Cloud, IO.NET offers a computing service known as IO Cloud. This service operates through a distributed, decentralized network of chips and supports the execution of Python-based machine-learning code for AI applications.
The basic business unit of IO Cloud is called a Cluster: a self-coordinating group of GPUs designed to efficiently handle computing tasks. AI engineers have the flexibility to customize clusters to meet their specific needs.
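IO.NET itself is configured through a web interface, so the snippet below is not its API; it is only a generic PyTorch sketch of the kind of Python machine-learning workload such a cluster is meant to run, using whatever GPUs the rented machines expose to the process.

```python
import torch
import torch.nn as nn

# Use every GPU the cluster exposes to this process; fall back to CPU locally.
device_count = torch.cuda.device_count()
device = torch.device("cuda" if device_count > 0 else "cpu")

model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))
if device_count > 1:
    model = nn.DataParallel(model)   # naive multi-GPU data parallelism
model = model.to(device)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Synthetic data stands in for a real dataset.
x = torch.randn(1024, 512, device=device)
y = torch.randint(0, 10, (1024,), device=device)

for step in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
print(f"final loss: {loss.item():.4f}")
```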
The user interface of IO.NET is highly user-friendly. If you’re looking to deploy your own chip cluster for AI computing tasks, simply navigate to the Clusters page on the platform, where you can effortlessly configure your desired chip cluster according to your requirements.

Cluster Page on IO.NET

First, you need to select your cluster type, with three options available:
- General: Provides a general environment, suitable for early stages of a project where specific resource requirements are not yet clear.
- Train: A cluster designed specifically for the training and fine-tuning of machine learning models. This option provides additional GPU resources, higher memory capacity, and/or faster network connections to accommodate these intensive computing tasks.
- Inference: A cluster designed for low-latency inference and high-load work. In the context of machine learning, inference refers to using trained models to predict or analyze new datasets and provide feedback. Therefore, this option focuses on optimizing latency and throughput to support real-time or near-real-time data processing needs.
Next, you need to choose a supplier for your cluster. IO.NET has partnerships with Render Network and the Filecoin miner network, allowing users to source chips either from IO.NET itself or from the other two networks when building their computing clusters, which effectively positions IO.NET as an aggregator (note: Filecoin services are temporarily offline). It’s worth noting that IO.NET currently has over 200,000 GPUs available online, while Render Network has over 3,700.
Following this, you’ll proceed to the hardware selection phase of your cluster. Presently, IO.NET lists only GPUs as the available hardware option, excluding CPUs or Apple’s iGPUs (M1, M2, etc.), with the GPUs primarily consisting of NVIDIA products.

Among the officially listed and available GPU hardware options, based on data I checked that day, the total number of online GPUs within the IO.NET network was 206,001. The GPU with the highest availability was the GeForce RTX 4090, with 45,250 units, followed by the GeForce RTX 3090 Ti, with 30,779 units.
Furthermore, there are 7,965 units of the A100-SXM4-80GB (each priced above $15,000) available online, a chip far better suited to demanding AI workloads such as machine learning, deep learning, and scientific computing.

The NVIDIA H100 80GB HBM3, which is designed from the ground up for AI (with a market price of over $40,000), delivers training performance that is 3.3 times greater and inference performance that is 4.5 times higher than the A100. Currently, there are 86 units available online.

Once the hardware type for the cluster has been chosen, users will need to specify further details such as the geographic location of the cluster, connectivity speed, the number of GPUs, and the duration.
Finally, IO.NET will calculate a detailed bill based on your selected options. As an illustration, consider the following cluster configuration:
- Cluster Type: General
- GPUs: 16 × A100-SXM4-80GB
- Connectivity tier: High Speed
- Geographic location: United States
- Duration: 1 week
The total bill for this configuration is $3311.6, with the hourly rental price per card being $1.232.

The hourly rental price for a single A100-SXM4-80GB on Amazon Web Services, Google Cloud, and Microsoft Azure is $5.12, $5.07, and $3.67 respectively (data sourced from Cloud GPU Comparison, actual prices may vary depending on contract details).
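As a quick sanity check on those figures, using only the numbers quoted above (the 168-hour week is the single assumption), the bill and the rough saving versus AWS can be reproduced in a few lines:

```python
gpus, hours = 16, 7 * 24            # 16 cards rented for one week
io_net_rate = 1.232                 # $ per GPU-hour quoted by IO.NET
aws_rate = 5.12                     # $ per A100-SXM4-80GB hour on AWS

io_net_bill = gpus * hours * io_net_rate
aws_bill = gpus * hours * aws_rate
print(f"IO.NET: ${io_net_bill:,.2f}")              # ≈ $3,311.6, matching the quoted bill
print(f"AWS:    ${aws_bill:,.2f}")                 # ≈ $13,762.56
print(f"saving: {1 - io_net_bill / aws_bill:.0%}") # ≈ 76% versus AWS on-demand pricing
```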
Consequently, when it comes to cost, IO.NET offers chip computing power at prices much lower than those of mainstream providers. Additionally, the flexibility in supply and procurement options makes IO.NET an attractive choice for many users.
Business Overview
Supply Side
As of April 4th, 2024, official figures reveal that IO.NET had a total GPU supply of 371,027 units and a CPU supply of 42,321 units on the supply side. In addition, Render Network, as a partner, had an additional 9,997 GPUs and 776 CPUs connected to the network’s supply.

Data Source: io.net

At the time of writing, 214,387 of the GPUs integrated with IO.NET were online, resulting in an online rate of 57.8%. The online rate for GPUs coming from Render Network was 45.1%.
What does this data on the supply side imply?
To provide a benchmark, let’s bring in Akash Network, a more seasoned decentralized computing project.
Akash Network launched its mainnet as early as 2020, initially focusing on decentralized services for CPUs and storage. It rolled out a testnet for GPU services in June 2023 and subsequently launched the mainnet for decentralized GPU computing power in September of the same year.

Akash Network GPU Capacity

According to official data from Akash, even though the supply side has grown continuously since the launch of its GPU network, the total number of GPUs connected to the network is still only 365.
When evaluating the volume of GPU supply, IO.NET vastly exceeds Akash Network, operating on a dramatically larger scale. IO.NET has established itself as the largest supply side in the decentralized GPU computing power sector.
Demand Side

From the demand side, IO.NET is still in the early stages of market cultivation, with a relatively small total volume of computation tasks being executed on its network. The majority of GPUs are online but idle, showing a workload percentage of 0%. Only four chip types—the A100 PCIe 80GB K8S, RTX A6000 K8S, RTX A4000 K8S, and H100 80GB HBM3—are actively engaged in processing tasks, and among these, only the A100 PCIe 80GB K8S is experiencing a workload above 20%.
The network’s official stress level reported for the day stood at 0%, indicating that a significant portion of the GPU supply is currently in an online but idle state.
Financially, IO.NET has accrued $586,029 in service fees to date, with $3,200 of that total generated on the most recent day.

Source: io.net

The financials concerning network settlement fees, both in terms of total and daily transaction volumes, align closely with those of Akash. However, it’s important to note that the bulk of Akash’s revenue is derived from its CPU offerings, with an inventory exceeding 20,000 CPUs.

Source: Akash Network Stats

Additionally, IO.NET has disclosed detailed data for AI inference tasks processed by the network. As of the latest report, the platform has successfully processed and validated over 230,000 inference tasks, though most of this volume stems from BC8.AI, a project sponsored by IO.NET.

Source: io.net

IO.NET’s supply side is expanding rapidly, driven by airdrop expectations and a community event known as “Ignition,” which together have quickly attracted a significant amount of AI computing power. On the demand side, however, expansion remains nascent, with little organic demand. Whether this sluggish demand stems from customer outreach that has not yet begun in earnest, or from an unstable service experience that has limited large-scale adoption, still requires further evaluation.
Given that the shortfall in AI computing power is unlikely to be closed quickly, many AI engineers and projects are exploring alternatives, which could increase interest in decentralized service providers. Moreover, IO.NET has not yet rolled out economic incentives or campaigns to stimulate demand; as the product experience continues to improve, the anticipated matching of supply and demand remains promising.
Team Background and Fundraising Overview
Team Profile
The core team of IO.NET initially focused on quantitative trading. Up until June 2022, they were engaged in creating institutional-level quantitative trading systems for equities and cryptocurrencies. Driven by the system backend’s demand for computing power, the team began exploring the potential of decentralized computing and ultimately focused on the specific issue of reducing the cost of GPU computing services.
Founder & CEO: Ahmad Shadid
Before founding IO.NET, Ahmad Shadid had worked in quantitative finance and financial engineering, and he is also a volunteer at the Ethereum Foundation.
CMO & Chief Strategy Officer: Garrison Yang
Garrison Yang officially joined IO.NET in March 2024. Before that, he was the VP of Strategy and Growth at Avalanche and is an alumnus of the University of California, Santa Barbara.
COO: Tory Green
Tory Green serves as the Chief Operating Officer of IO.NET. He was previously the COO of Hum Capital and the Director of Business Development and Strategy at Fox Mobile Group. He graduated from Stanford University.
IO.NET’s LinkedIn profile indicates that the team is headquartered in New York, USA, with a branch office in San Francisco, and employs over 50 staff members.
Funding Overview
IO.NET has only publicly announced one funding round—a Series A completed in March this year with a valuation of $1 billion, through which they successfully raised $30 million. This round was led by Hack VC, with participation from other investors including Multicoin Capital, Delphi Digital, Foresight Ventures, Animoca Brands, Continue Capital, Solana Ventures, Aptos, LongHash Ventures, OKX Ventures, Amber Group, SevenX Ventures, and ArkStream Capital.
Notably, the investment from the Aptos Foundation might have influenced the BC8.AI project’s decision to switch from using Solana for its settlement and accounting processes to the similarly high-performance Layer 1 blockchain, Aptos.
Valuation Estimation
According to previous statements by founder and CEO Ahmad Shadid, IO.NET is set to launch its token by the end of April 2024.
IO.NET has two benchmark projects that serve as references for valuation: Render Network and Akash Network, both of which are representative decentralized computing projects.
There are two principal methods to derive an estimate of IO.NET’s market cap: 
- The Price-to-Sales (P/S) ratio, which compares the FDV to revenue;
- The FDV-to-Chip ratio (M/C ratio), which compares the FDV to the number of chips in the network.
We will start by examining the potential valuation using the Price-to-Sales ratio:

Examining the price-to-sales ratio, Akash represents the conservative end of IO.NET’s estimated valuation spectrum, while Render provides a high-end benchmark, positing an FDV ranging from $1.67 billion to $5.93 billion.
However, IO.NET is the newer project, and its more compelling narrative, smaller initial market cap, and broader supply base together suggest that its FDV could well surpass that of Render Network.
Now let us turn to another valuation perspective: the “FDV-to-Chip Ratio.”
In the context of a market where demand for AI computing power exceeds supply, the most crucial element of decentralized AI computing networks is the scale of GPU supply. Therefore, we can use the “FDV-to-Chip Ratio,” which is the ratio of the project’s fully diluted value to the number of chips within the network, to infer the possible valuation range of IO.NET, providing readers with a reference.
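To make the two methods concrete, here is a minimal sketch of how such comparables-based estimates are computed. The benchmark inputs are placeholders chosen for illustration only; they are not the actual Render Network or Akash Network figures behind the ranges quoted below.

```python
def fdv_from_ps(benchmark_fdv: float, benchmark_revenue: float,
                target_revenue: float) -> float:
    """Price-to-Sales method: apply the benchmark's FDV/revenue multiple."""
    return target_revenue * (benchmark_fdv / benchmark_revenue)

def fdv_from_chip_ratio(benchmark_fdv: float, benchmark_chips: int,
                        target_chips: int) -> float:
    """FDV-to-Chip method: apply the benchmark's FDV-per-chip ratio."""
    return target_chips * (benchmark_fdv / benchmark_chips)

# Placeholder benchmark inputs (illustrative only, not the article's data).
print(f"P/S estimate:  ${fdv_from_ps(2e9, 6e5, 5.9e5):,.0f}")
print(f"Chip estimate: ${fdv_from_chip_ratio(2e9, 4_000, 200_000):,.0f}")
```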

Applying the FDV-to-chip ratio places IO.NET’s estimated valuation range between $20.6 billion and $197.5 billion, with Render Network setting the upper benchmark and Akash Network the lower.
Enthusiasts of the IO.NET project might see this as a highly optimistic estimation of market cap.
It is important to remember that the current vast number of chips online for IO.NET has been stimulated by airdrop expectations and incentive activities; how much of this supply remains online after the project officially launches still requires observation.
Overall, valuations derived from the price-to-sales ratio could offer more reliable insights.
IO.NET, built on Solana and sitting at the convergence of the AI and DePIN narratives, is on the cusp of its token launch. We look forward to seeing how its market cap performs once trading begins.
Reference
The Real Merge
Understanding the Intersection of Crypto and AI

An Overview of Ultiverse: The AI-powered Game Platform Backed By Top-tier Institutions

By Lawrence Lee, Researcher at Mint Ventures

Introduction
Mint Ventures has always kept a keen eye on Web3 gaming. Although the once-popular Play-to-Earn model of the previous bull market has fallen out of the conversation, punctuated by the dramatic, Ponzi-like collapses of Axie Infinity and StepN, at their peak these games attracted millions of daily active users, marking the crypto space’s first encounter with “Massive Adoption.” Unlike social products, another category with the potential for massive adoption, games naturally feature a richer and more complex economic ecosystem. This gives teams finer-grained control over various taxation strategies and, combined with the immersive experiences and impulse spending that polished design can induce, makes it more likely that a game economy can remain relatively balanced over a long period. Furthermore, Web3 gaming adeptly leverages crypto’s tokenization opportunities, which has led to widespread investor optimism about its future.
Following the era of Axie and StepN, Web3 gaming has enjoyed the following favorable conditions:
- The ongoing enhancement of infrastructure. In 2024, game developers are presented with an abundance of choices in blockchain technology. Whether opting for existing Layer1 solutions like Solana, Layer2 solutions like Arbitrum, one-click blockchain deployment services such as Avalanche Subnets and the OP Stack, or the trend toward modular blockchains for a custom design, developers now have access to a variety of efficient and cost-effective options. On the wallet front, both MPC and Account Abstraction (AA) wallet technologies have reached commercial viability, effectively addressing the issue of private key management for ordinary users.
- The gradual recovery of market conditions, notably spurred by the recent approval of Bitcoin ETFs, has ignited a robust rally across the crypto space, bringing crypto back into the public spotlight. This provides a significant boost to the potential user base for Web3 game projects.
Compared to traditional Web2 games, Web3 games maintain clear competitive edges:
- The lifecycle of a typical Web3 game, marked by “NFT sales – FT issuance – game launch,” provides various revenue streams through NFT royalties and FT transaction fees even before the game’s official launch. This model enables game development teams to start earning revenue from day one of their project, helping to offset the lengthy development process and significant costs involved. Post-launch, the ability to levy fees across different aspects of the game is far more flexible than in Web2 games. The prospect of earlier and more varied monetization draws development teams toward the Web3 space.
- Web3 gaming has yet to be dominated by powerful distributors, which means the high user acquisition costs common in Web2 gaming are not a concern here. Put simply, Web3 gaming isn’t as fiercely competitive, presenting an open field ripe for development by talented teams.
We have observed many skilled development teams venturing into Web3 gaming, creating games with strong narratives, engaging gameplay, and high production quality. This significant build-out on the supply side is another favorable condition for Web3 games.
Given these developments, Mint Ventures maintains its keen interest in Web3 gaming. Ultiverse is a project that Mint Ventures has been following continuously since the depths of the previous bear market. Despite the downturn, Ultiverse has continued to build, steadily enriching its product matrix and strengthening its storytelling, and it also boasts an excellent investor lineup. Recently, it has launched several airdrop campaigns aimed at attracting new users, which are worth keeping an eye on.
Disclosure: Mint Ventures participated in Ultiverse’s ElectricSheep NFT Builder Round and holds ElectricSheep NFTs.
Ultiverse Landscape
Ultiverse is a blockchain-based metaverse deeply integrated with AI. Its architecture is divided into four layers:

Protocol Layer: Ultiverse uses the Bodhi Protocol as the foundation of the entire ecosystem. According to its official documents, the Bodhi Protocol employs large language models and Stable Diffusion to generate a variety of in-game content. This AI-driven approach helps to provide comprehensive support across the ecosystem by producing more intelligent non-player characters (NPCs) and enriching narrative backgrounds.
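To make the idea of LLM-driven NPCs concrete, here is a minimal, purely illustrative sketch of how such a layer could be wired up. It is not Ultiverse’s Bodhi Protocol implementation; the client library, model name, prompt, and NPC schema are all placeholder assumptions.

```python
# Conceptual sketch only: shows how an LLM-backed NPC layer could work in
# principle. It does NOT reflect Ultiverse's Bodhi Protocol; the model name,
# prompt, and NPC schema below are placeholder assumptions.
from dataclasses import dataclass
from openai import OpenAI  # any chat-completion-capable LLM client would do

client = OpenAI()  # assumes an API key is set in the environment

@dataclass
class NPC:
    name: str
    backstory: str
    goal: str

def npc_reply(npc: NPC, player_line: str, history: list[dict]) -> str:
    """Generate an in-character NPC response conditioned on persona and chat history."""
    messages = [
        {"role": "system",
         "content": f"You are {npc.name}, an NPC in a sci-fi metaverse. "
                    f"Backstory: {npc.backstory} Current goal: {npc.goal} "
                    "Stay in character and keep replies under 60 words."},
        *history,
        {"role": "user", "content": player_line},
    ]
    resp = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    return resp.choices[0].message.content

merchant = NPC("Kestrel", "a scrap trader on the Terminus docks", "sell salvaged parts")
print(npc_reply(merchant, "Got anything that survives vacuum exposure?", history=[]))
```

The point of the sketch is simply that persona, goals, and dialogue history can be fed to a general-purpose model at runtime, which is what allows NPC interactions to vary from player to player rather than follow fixed scripts.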
Infrastructure: Ultiverse abstracts a set of gaming infrastructure components into SDKs for developers to use, including MPC wallets, AA wallets, a native marketplace, and a DID system. Moreover, its proprietary live-streaming platform, Ultiverse Live, has attracted 350,000 followers and close to 40 million likes on Binance Live, offering substantial visibility to its collaborators. Additionally, Ultiverse’s infrastructure features a dedicated task platform, Mission Runner.
Core Assets: Ultiverse’s prime assets presently encompass the Electric Sheep NFT, Meta GF NFT, and World Fragment. 
Electric Sheep NFT, the first asset of the Ultiverse ecosystem and one of its core assets, has a total supply of 7,000. Launched in July 2022 at 0.5 ETH, its floor price now hovers around 2 ETH, roughly a fourfold gain in ETH terms and an eightfold gain in USD, which has significantly strengthened its community foundation. Holders of Electric Sheep gain early access and boosted rewards across numerous Ultiverse ecosystem projects, in addition to being eligible for airdrops of Ultiverse’s governance token, $ULTC.

The Floor Price Trend of Electric Sheep NFT

The Meta GF NFT was issued at the end of 2022, positioned as an AI companion that supports customizable appearances, designed to join players in their explorations within the Ultiverse.
Meanwhile, the World Fragment NFT, pending release, is designated to be airdropped to Electric Sheep NFT holders. As detailed in official documents, players can merge fragments to forge unique worlds. Ownership of varied World NFTs can unlock distinct, AI-driven personal experiences in the game. This includes divergent storylines, characters, world objectives, and core conflicts, among other elements, suggesting that different NFTs pave the way for unique gameplay experiences.
Dapps, known as “Micro Worlds” within the Ultiverse ecosystem, fall into two categories: those built in partnership with Ultiverse and those developed in-house. Partner Dapps include the casual education simulation game Meta Merge and the upcoming racing game BAC on Blast. These partnerships go beyond traffic acquisition to encompass a deeper level of asset interoperability: for example, Meta Merge once airdropped its token $MMM to Electric Sheep holders, and Meta Merge NFT holders will also receive a portion of the upcoming $ULTC airdrop. The second category consists of Dapps developed in-house by Ultiverse, including Ulti-pilot, which went live at the end of February, as well as Terminus and Endless Loop.
Launched in March, Terminus is a pivotal platform within the Ultiverse ecosystem, built on the UE5 engine and available in both PC and VR versions. Official documentation describes Terminus as a distinctive virtual world where, unlike in typical games, all NPCs are powered by sophisticated AI. These NPCs can engage players in diverse, highly interactive dialogues whose outcomes can significantly influence the gameplay. Furthermore, the game’s characters are also AI-operated, allowing them to keep evolving even when the player is offline, which makes Terminus a perpetually active game universe.

Terminus Gameplay

Ulti-pilot, launched in February, serves as the gateway for users to experience the Ultiverse world. Users can dispatch their characters to explore it and earn substantial $Soul incentives, which will be redeemable for the governance token $ULTC in the future.
Endless Loop is an MMORPG within the Ultiverse, also built on the UE5 engine. It has not yet been launched.
The product structure of Ultiverse reveals that it aspires to be more than a game; it aims to evolve into an AI-driven metaverse platform. This ambition entails connecting with a vast user base while simultaneously engaging with many Web3 projects, thereby bringing its Meta-Fi concept to fruition.
Ultiverse’s roadmap focuses on the comprehensive rollout of its products in 2024, including launching both the PC and VR editions of Terminus and fostering collaborations with additional gaming partners. In the second half of 2024, the team aims to introduce a Game Launchpad and launch its own Rollup, further supporting its gaming collaborators and solidifying its vision of a Web3 gaming ecosystem.
Financing and Partners
Ultiverse has completed three rounds of financing, with an impressive lineup of investors:
On March 18, 2022, Ultiverse completed a $4.5 million seed round financing at a valuation of $50 million. This round was co-led by Binance Labs and Defiance Capital, with participation from Three Arrows Capital and SkyVision Capital.
Following closely on March 25, 2022, Binance Labs made an additional investment of $5 million in Ultiverse, executed through an equity purchase. Nicole Zhang, the director at Binance Labs, emphasized that this investment aims to ensure that Binance Labs retains a voice in the strategic direction of Ultiverse’s team moving forward.
On February 14, 2024, Ultiverse successfully closed a strategic funding round, securing $4 million at a valuation of $150 million. The round was led by IDG Capital, with notable contributions from many leading investors such as Animoca Brands, Polygon Ventures, MorningStar Ventures, Taiko, ZetaChain, Manta Network, and DWF Ventures. This round was also marked by the engagement of prominent NFT influencers and KOLs, including Dingaling, Grail.eth, Christian2022.eth, and 0xSun, among others.

Ultiverse’s Investors

Overall, Ultiverse boasts a remarkable investment pedigree, featuring top-tier VCs, market makers, exchanges, public blockchains, and influencers, all of which can support Ultiverse’s multifaceted development.
Frank, the founder of Ultiverse, graduated from Carnegie Mellon University and has a deep passion for gaming. The team of over 200 members has an equally impressive background, including veterans from esteemed gaming companies such as Gameloft, Blizzard, Ubisoft, and Tencent who contributed to the development and design of well-known games like Elden Ring, Assassin’s Creed, and Prince of Persia.
As its goal is to become a platform that connects players with games, Ultiverse has forged an extensive network of partnerships, collaborating broadly with various projects and communities within the gaming ecosystem.
An example is the “Finding Your Path Partners” campaign held by Terminus in March, a concerted effort by Ultiverse alongside a variety of collaborators, including public blockchains like Zetachain, other gaming ventures such as Ainchess, infrastructure service providers like Rpggo and Particle Network, gaming guilds including N9Club and GuildFi, and NFT communities like the Weirdo Ghost Gang.

UltiverseDAO

Moreover, Ultiverse recently revealed that it has established collaborations with more than 300 teams in the AI and gaming realms across both Web2 and Web3. This extensive partnership network includes virtually all the top Web3 gaming projects, advancing Ultiverse’s ambition to develop a comprehensive gaming platform.

Tokenomics
Ultiverse has unveiled the utilities and tokenomics of its governance token $ULTC:

In the Ultiverse ecosystem, $ULTC has three primary utilities: 
- Governance.
- Entry pass: Dapps looking to integrate with the Ultiverse ecosystem are required to pay in $ULTC, with a portion of these tokens allocated to users.
- Payment method: $ULTC serves as a means of payment for assets within the Ultiverse ecosystem.
The total supply of $ULTC is set at 10 billion tokens, with its allocation detailed as follows:

Ultiverse Token Structure Overview

A point of interest is that 8% of the total token supply will be allocated for airdrops, targeting holders of the Electric Sheep NFT and MetaGF NFT, $Soul token holders, and holders of other assets within the ecosystem.
The current floor price of Electric Sheep NFT is around 2 ETH, implying a collection market cap of approximately 14,000 ETH, or about $47.6 million. If we assume Electric Sheep holders receive 2% of the $ULTC supply and that the NFT’s entire market cap prices in this airdrop, the implied fully diluted valuation of $ULTC comes to roughly $2.38 billion. Such a valuation aligns closely with that of Portal, another Game+AI project recently listed on Binance, suggesting the pricing is broadly fair. Beyond the $ULTC airdrop, the Electric Sheep NFT carries several additional benefits within the Ultiverse ecosystem, including access to the previously mentioned World NFT.
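As a quick sanity check, the back-of-the-envelope valuation above can be reproduced in a few lines (a minimal sketch; the ETH price is an assumption chosen to be consistent with the article’s roughly $47.6 million figure):

```python
# Minimal sketch of the implied-FDV arithmetic. The ETH price (~$3,400) is an
# assumption consistent with the ~$47.6M collection market cap cited above.
NFT_SUPPLY = 7_000          # total Electric Sheep NFTs
FLOOR_PRICE_ETH = 2.0       # approximate floor price
ETH_PRICE_USD = 3_400       # assumed ETH price at the time of writing

nft_mcap_eth = NFT_SUPPLY * FLOOR_PRICE_ETH       # 14,000 ETH
nft_mcap_usd = nft_mcap_eth * ETH_PRICE_USD       # ~$47.6M

AIRDROP_SHARE = 0.02  # assume the NFTs fully price in a 2% $ULTC airdrop
implied_ultc_fdv = nft_mcap_usd / AIRDROP_SHARE   # ~$2.38B

print(f"NFT market cap: {nft_mcap_eth:,.0f} ETH (~${nft_mcap_usd / 1e6:,.1f}M)")
print(f"Implied $ULTC FDV: ~${implied_ultc_fdv / 1e9:,.2f}B")
```

Note that the result scales linearly with the assumed airdrop share: if Electric Sheep holders were allocated 4% instead of 2%, the implied FDV would halve to about $1.19 billion.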
$Soul represents the “points” system within the Ultiverse ecosystem, initially distributed as a reward to early holders who staked Electric Sheep NFTs. The team previously announced a redemption ratio of 100 $Soul to 1 $ULTC. After unveiling the tokenomics, Ultiverse launched a series of initiatives aimed at attracting new users, leveraging $Soul to further promote the Ultiverse brand. Among these, the Ulti-pilot campaign that began in late February will release 10 billion $Soul (equivalent to 100 million $ULTC, or 1% of the total supply) and has drawn significant user participation.
Conclusion
Ultiverse stands out as a project meriting attention for the following reasons:
- It occupies a favorable position in Web3 gaming, poised for breakout success in the bullish cycle.
- Throughout the bear market, Ultiverse’s team has consistently built upon its foundation, delivering not only engaging products but also cultivating a tightly-knit community.
- With the striking progress of artificial intelligence over the past year drawing global interest, Ultiverse’s strategic integration of AI throughout its platform anticipates a positive reception as the AI+gaming narrative gains prominence.
- The project benefits from a robust investment background.

Exploring The Dormant TON Community and Its Cross-ecosystem Connectivity

By Lydia Wu, Researcher at Mint Ventures

Decoupling the classical chain concept
The figures on DefiLlama and CoinMarketCap seem to conclusively show that TON is an “under-the-radar giant among chains” — consistently ranking within the top 20 in terms of market cap, yet its 24-hour trading volume lingers beyond the top 100. Despite reaching a new peak in Total Value Locked (TVL) at $53 million, it failed to enter the top 50 public chains; moreover, its market cap to TVL ratio hit an astonishing 293, which is 33 times that of Ethereum.

Source: DefiLlama, CoinMarketCap

Recent months have seen the Ethereum-defined concept of a chain come undone, as the advent of Rollups and modular approaches dilutes the traditional narrative around blockchain legitimacy. The market has accepted Blast, which launched as little more than a name without substantial chain functionality, and acknowledged Arweave’s AO, which leverages an existing network effect to operate as a de facto public chain.
This shift has accelerated the unlocking of TON’s potential despite its “less legitimate” positioning. The market’s evaluation of TON now extends beyond the TVL-centric narrative of classic chains toward metrics like daily and monthly active users, adding a new dimension to understanding its value.
Recently, Telegram founder Pavel Durov told the Financial Times that Telegram’s monthly active users have surpassed 900 million, that the company is valued at over 30 billion USD, and that it is contemplating an initial public offering. This stands in stark contrast to Meta’s 3 billion monthly active users and 1.3 trillion USD market capitalization, placing Telegram’s valuation at roughly 1/40th of Meta’s. Telegram’s user base is notably concentrated in Asia, Europe, South America, and the Middle East, characterized by a significant presence of retail investors and robust demand for peer-to-peer payments, making it an ideal audience to onboard into Web3. As Telegram continues to expand, if TON can draw in 30% of Telegram’s users within the next 3 to 5 years, its valuation could rise materially.

Source: DefiLlama, CoinMarketCap, Nansen, Token Terminal

New Paradigm of Web3 Creator Economy: Leveraging Telegram for Content Packaging and Value Transfer
Redefining Distribution: How Telegram Activates the Web2 Incremental Growth Market
Last year, I outlined the three stages of the Web3 creator economy, with Phase 1.0 embracing the straightforward “blockchain+” approach: centralized content creation combined with Web3 distribution mechanisms, predominantly through NFTs. This model remains mainstream, except that NFTs are increasingly used as speculative pre-token instruments rather than as creative mediums.
Throughout this year, my immersion in the TON ecosystem has gradually made me realize that, along the same attention-driven pathways, the roles NFTs play in wrapping content, vouchers, and governance are being seamlessly supplanted by a more intuitive “channel-advertising-payment” framework. The briefly celebrated Friend.tech was an intermediate form of this idea, wrapping group chats into assets, albeit with limited content scalability and an unsustainable economic model.
Compared to many NFT and SocialFi projects still struggling to find application scenarios for their assets, the TON ecosystem, built on top of Telegram, offers a friendlier product experience and value network for Web2 developers, creators, and users: on existing social networks, it further deconstructs the Web3 distribution model into a combination of traditional Web2 business practices, middleware wallets, and Web3 financial settlement. This approach not only simplifies onboarding for both creators and users but also diversifies creators’ revenue streams through mechanisms like ad revenue sharing.

Source: Lydia @Mint Ventures

From Content to Services, Trading Bots Facilitate the Transition of Web3 User Base
WeChat, a super app in the Web2 space, expanded from a content-centric to a service-oriented ecosystem by launching service accounts and mini-programs. Telegram has followed a similar path, introducing business accounts and service-oriented bots to solidify its position as a key gateway for mobile traffic; among these, Trading Bots have won the favor of crypto degens.
The Trading Bot sector has seen explosive growth since 2023, emerging as one of the few products with genuine demand and cash flow during the bear market. As the market shifts from bear to bull, the financial allure of these Bots has intensified, with leading protocols like Banana Gun and Unibot generating daily revenues exceeding $200,000. Though a minority are hosted on websites and Discord, the majority of Trading Bots are integrated through Telegram.

Source: https://dune.com/whale_hunter/dex-trading-bot-wars

At this stage, Trading Bots have little to do with TON itself; they primarily leverage Telegram’s Bot module to integrate with Ethereum and Solana. However, the popularity of Bots has greatly contributed to market education and user engagement within the TON ecosystem. As crypto enthusiasts grow more comfortable with interaction models like Bots, the barrier to exploring other, non-trading products within the TON ecosystem will diminish significantly. Supported by a fully on-chain identity system and payment infrastructure, TON is theoretically well positioned to develop a coherent system for content packaging and value transfer.
Exploring the Bazaar Mode: The Bots and Mini Apps of TON
The TON ecosystem currently hosts a range of products that, at first glance, seem to consist merely of Telegram Bot portals combined with HTML5 websites, coupled with lively Telegram groups flooding the screen, making one feel as if in a bustling, chaotic bazaar.
In the Meme project Notcoin, players tap on coins to earn points. They can also join teams led by celebrities, such as Telegram founder Durov, or gain traction by sending “red packets” to friends. Through its no-frills gameplay and viral marketing, Notcoin attracted 5 million players within just a week; its player base now exceeds 26 million, with 1 million followers on X. In the recently launched pre-market trading, the largest deal was a purchase of one billion points for 1,100 $TON, equivalent to approximately $4,521 at the time.

Farcaster, a favorite among crypto influencers, introduced Frames in January 2024, celebrated as “a groundbreaking innovation.” Yet the feature’s efficacy on the crowded mobile screen leaves something to be desired. Users can perform basic Swap and Mint operations, but for more complex interactions, or even the simplest mini-games, squeezing everything into a frame that takes up less than one-third of a smartphone screen is a significant visual challenge.
By contrast, what is rarely mentioned is that the integration of Telegram and TON has already achieved an almost seamless transition from chat boxes to semi-native applications, with the response speed of calling up applications via Bots even faster than WeChat mini-programs. 

In June 2015, Telegram launched its Bot feature with limitations such as the inability to customize the interface and the absence of direct client-server communication. Built on this functionality, Trading Bots act more as intermediary interfaces than independent apps, facing constraints in response speed and difficulty handling multiple simultaneous interactions.
In April 2022, Telegram rolled out Mini Apps, granting developers complete control over user interfaces and enabling direct client-to-server communication. Mini Apps provide a friendlier user experience and enhanced composability, with seamless wallet integration and other infrastructure capabilities, making them well suited for deploying Web3 products. Mini Apps have the potential to supplant all mobile websites.
After the launch of Mini Apps, Bots have not been relegated to obsolescence; instead, they play a “relay room” role. They serve as the primary gateway for user interactions, seamlessly connecting a series of Mini Apps.

Source: TON x Fans

Source: Lydia

Deploying a Bot or Mini App is relatively easy. Users can swiftly configure their Bot through a Q&A format on the Telegram @BotFather channel. Additionally, they can explore a virtual dining experience with a Mini App by visiting @DurgerKingBot and experiencing the setup firsthand.
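To give a sense of how little code the Bot-to-Mini-App bridge requires, here is a minimal illustrative sketch that calls the public Telegram Bot API over HTTP to send a message whose button opens a Mini App. The bot token would come from @BotFather; the chat id and Mini App URL below are placeholders, and error handling is kept to a minimum.

```python
# Minimal sketch: send a message whose inline button opens a Mini App (web_app).
# Assumptions: BOT_TOKEN was issued by @BotFather, CHAT_ID is a private chat the
# bot can message, and MINIAPP_URL is a placeholder HTTPS front-end you host.
import os
import requests

BOT_TOKEN = os.environ["BOT_TOKEN"]          # e.g. "123456:ABC-DEF..."
CHAT_ID = os.environ["CHAT_ID"]              # numeric id of a private chat
MINIAPP_URL = "https://example.com/miniapp"  # placeholder Mini App URL

payload = {
    "chat_id": CHAT_ID,
    "text": "Tap below to open the Mini App:",
    "reply_markup": {
        "inline_keyboard": [[
            {"text": "Open Mini App", "web_app": {"url": MINIAPP_URL}}
        ]]
    },
}

resp = requests.post(
    f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage",
    json=payload,
    timeout=10,
)
resp.raise_for_status()
print("sent:", resp.json()["ok"])
```

The Mini App itself is simply a web page served over HTTPS that Telegram opens inside its client, which is what makes the transition from chat box to semi-native application feel nearly seamless.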
Unique On-Chain Experience: A Spotlight on the TON Ecosystem
Lite Gaming
In the current landscape, where blockchain capacity far exceeds demand, the choice of platform, whether Ethereum Layer2, Solana, or Binance Smart Chain, makes little difference to a game’s playability. Yet games with strong social attributes, like chess, card games, or other party games, offer vastly different experiences depending on whether players can form teams via one-click sharing on Telegram or must wait for random matches with strangers.
Originally an MMORPG on Facebook, Tap Fantasy attracted over 700,000 players after venturing into Web3 on the Binance Smart Chain and Solana. By August 2023, Tap Fantasy became the premier IDO project on TonUP, the first launchpad within the TON ecosystem, with its token $MC selling out in just half an hour. In November 2023, a TON-based version of Tap Fantasy was rolled out by Pluto, a Web3 gaming incubator; it surpassed 600,000 players in three months, with on-chain players exceeding 16,000. A robust in-game economy pushed the $MC token from a 1:5 to a 1:1 exchange rate against TON.
Catizen, a new game developed by Pluto, merges AI technology with the metaverse concept to offer a unique cat-raising experience. Its beta version was released on March 7, 2024, quickly attracting a community of over 160,000 players and 13,000 blockchain participants in merely five days. Catizen also partnered with $FISH, a leading meme within the TON ecosystem, announcing an upcoming airdrop to $FISH token holders following the conclusion of its beta testing phase.

Source: Tap Fantasy TON version dashboard

Source: Catizen dashboard

Social Inscriptions and Memes
As a novel approach to asset distribution, the expansion of the Bitcoin inscription ecosystem to multiple chains is a trending experiment. TON’s inscription ecosystem has cleverly incorporated Telegram’s front-end and native wallet, streamlining user interactions while implementing safeguards against spambots.
- $NANO: The first TON-20 inscription, it brought the TON ecosystem 20 million interactions and 36,000 unique minting addresses.
- $GRAM: Named after the native token of the original Telegram Open Network, which was halted by the SEC, it pioneered deployment, minting, and transfers via Telegram Mini Apps.
- $TONOT: Breaking $NANO’s records with 61,000 unique minting addresses and 57,000 holders, TONOT facilitates transitions between inscriptions, NFTs, and tokens. Its roadmap extends to in-game currencies, decentralized identity (DID), and staking mechanisms, among other features.
Meme assets were long scarce on the lesser-known TON blockchain, leaving a gap in crypto users’ awareness of and interaction with TON, until Notcoin captured widespread attention.
- $NOT: Notcoin plans to airdrop $NOT tokens in late March or early April. Pre-market trading for $NOT is currently active on Getgems, an NFT marketplace within the TON ecosystem.
- $REDO: Inspired by a sketch Durov made during a protest, $REDO has risen to become the meme with the highest market cap within the TON ecosystem.
- $FISH: The TON ecosystem’s first social meme, Ton Fish has amassed a community exceeding 18,000 holders.
- $TPET: Emerging from the Ton Fish ecosystem, $TPET’s fair launch runs until March 26. It is positioned as the key token for the upcoming game Ton Pet: Tik Ton, offering $FISH and NFT holders a chance to participate in an airdrop.
Multichain Liquidity Launchpad
XTON is the first launchpad to introduce multichain liquidity within the TON ecosystem, featuring team members from the TON Foundation. XTON is scheduled to finalize its mainnet launch and token sale in Q1 2024 and to kick off its first project in Q2 2024. In line with XTON’s vision, the TON ecosystem is poised to bridge the gap and facilitate bilateral traffic integration between the Web2 social networking giant and the Web3 EVM-compatible world.
Moving Toward a Future of Interconnectivity
Since March, with the announcement of Telegram using TON to process advertising revenues, Binance launching a USDⓈ-M TON perpetual contract, and Telegram seeking an IPO, the TON token, previously dormant, has swiftly achieved a notable leap in both price and on-chain activity.

Source: https://www.tonstat.com/
Reflecting on TON’s journey since its 2018 inception, from its launch by Telegram through the handover to community stewardship, the establishment of the first cross-chain bridge, and the continued evolution of its infrastructure, the ecosystem’s resilience and dynamism stand out. With a 2024 focus on stablecoins, cross-chain bridges, and expansion in the Asian market, there is an eagerness to see TON evolve into a genuinely open network fostering interconnectivity across regions, ecosystems, and applications, potentially offering each participant a glimpse of the long-promised future of blockchain technology.
*Data referenced in this article is updated to March 13, 2024.
References
- Transforming Telegram to Web3 with Toncoin – TOKEN2049 Singapore 2023
- Telegram Game ‘Notcoin’ Launches Pre-Market Trading Ahead of Airdrop
- Practical Guide to Developing Telegram Bots and Mini-Apps
- TON’s roadmap
- An In-depth Analysis of Inscriptions within TON Ecosystem

Preparing for Primary Wave: My Periodic Strategy on This Bull Market Cycle

By Alex Xu, Research Partner at Mint Ventures

Key Takeaways
Last week, BTC touched an all-time high against the USD, signaling our entrance into the official stage of this bull market. Unlike the initial rebound and warm-up from the depths of the bear market, the official bull market phase brings amplified sentiment and even more pronounced volatility.
Each bull market’s official start is characterized by a set of common features, including:
- A shift from $BTC-led gains to altcoins spearheading the surge, resulting in a decline in Bitcoin’s market share.
- An increase in the velocity and magnitude of gains across various cryptocurrencies.
- A surge in popularity on social media and search engines, leading to a swift rise in public interest.
In this article, I explore the potential differences between this cycle and previous ones, offering my analysis and strategies for navigating ahead.
Please note that the insights above reflect my current thinking and may evolve. The perspectives are subjective and there may be factual inaccuracies or biases. This is not financial advice, but feedback and discussions are welcomed.
Let’s dive into the heart of this analysis.
The Catalysts for Crypto Bull Markets and Alpha Opportunities
Catalysts for the Bull Market
With Bitcoin’s valuation reaching significant levels, a retrospective of the last three cycles reveals that bull markets are typically ignited by a confluence of factors, including:
- The anticipated halving of Bitcoin, a critical adjustment to its supply and demand dynamics, on the horizon for April in this cycle.
- Loosening monetary policies or the expectation thereof, with a market consensus that the zenith of interest rates is behind us and a strong anticipation for rate cuts in the upcoming quarter.
- The relaxation of regulatory policies. In this cycle, notable developments include updates to U.S. accounting standards that allow crypto assets to be accounted for at fair value on the balance sheets of public companies, and the SEC’s legal setback against Grayscale, which paved the way for the approval of ETFs.
- Innovations in asset and business models.
The current bull market has already seen the emergence of the first three factors.
Identifying Alpha Opportunities in Bull Markets
Historically, the most substantial gains in each bull market cycle have been captured by the newcomers or those who experienced their first major breakout during the cycle. For example, the 2017 bull market was dominated by the ICO craze, with platforms like Neo and Qtum, which facilitated smart contracts, leading the charge. Fast forward to 2021, and the spotlight shifted to Defi, Gamefi, Metaverse, and NFTs, with 2020 heralded as the year of DeFi, and 2021 marking the rise of NFTs and GameFi.
Yet, as we navigate through the current bull market, no new asset class or business model has achieved the transformative impact akin to that of smart contract platforms or DeFi in previous cycles.
As for the current landscape of DeFi, GameFi, NFTs, and DePIN, innovation appears stagnant in both product and narrative, regardless of whether the projects are new or established. Most advancements are merely iterations or refinements of existing functionality, leading to the perception that these are “old concepts” being recycled.
The current cycle has witnessed the emergence of two particularly novel categories within the crypto ecosystem:
- Bitcoin Ecosystem Innovations: This category includes inscription assets, exemplified by $ORDI and Node Monkeys, and projects that leverage Bitcoin’s Layer 2 solutions.
- Web3 + AI: This includes both decentralized computing projects from the previous cycle, like Akash and Render Network, and novel AI-centric initiatives such as Bittensor (TAO), which have surged to prominence in this cycle.
It’s crucial to underline, however, that AI does not inherently derive from within the blockchain. The surge of interest in Web3+AI projects is largely a spillover from the AI boom, particularly triggered by developments in ChatGPT in 2023. This positions AI-related projects as a “partially new narrative” within this cycle.
Speculations and Strategies for the Current Bull Market
Potentially Overestimated Alphas
In many recommended investment portfolios, I’ve seen a common inclination to include GameFi, DePIN, and DeFi-related altcoins in the asset pool. The prevailing logic is that, given their smaller market capitalization and higher volatility, these cryptocurrencies are expected to significantly outperform $BTC and $ETH in the bull market’s peak phases (after BTC reaches new highs), achieving Alpha returns.
However, as previously pointed out, “the most substantial gains in each bull market cycle have been captured by the newcomers or those who experienced their first major breakout during the cycle.” Given that DeFi, GameFi, NFT, and DePIN do not fit the characteristic of being “new assets or new business models” of this cycle, and considering they’re embarking on their second round, expecting them to replicate their inaugural cycle’s price performance is optimistic. It’s essential to understand that an asset class tends to enjoy its most substantial valuation bubble only during its first appearance in a cycle.
In their debut bull market, new business models or asset classes face the test of “disproval,” a hurdle that is tough to clear amid the bullish euphoria. Conversely, in their second bull market, they face the necessity to “prove their mettle,” demonstrating that their growth potential is still untapped and their room for imagination substantial. This is equally challenging, as reigniting faith in previously told tales isn’t straightforward, especially when investors are still wary of being trapped at the high peaks of the previous bull market.
Some might argue that the Layer1 track was the “brightest star” in both the 2017 and 2021 bull markets, posing a counterexample.
This is not the case.
The demand for L1 solutions in the 2021 bull market experienced exponential growth, propelled by the meteoric rise of several product categories, including DeFi, NFT, and GameFi. This growth spurt resulted in a swift expansion of the market for both users and developers, creating an unparalleled demand for blockchain capacity. This not only elevated Ethereum’s market valuation but also catalyzed a boom among alternative L1 platforms due to overflow demand from Ethereum, making 2021 the watershed year for these Alternative L1s.
Can this current cycle replicate the previous cycle’s explosive growth in decentralized applications and asset classes, leading to further demand for L1?
At this juncture, we may not witness this replication. Hence, the conditions that allowed L1 platforms to enjoy a meteoric rise in their last cycle seem absent now, suggesting a need to moderate expectations for Alternative L1 platforms in this bull market’s context.
$BTC and $ETH : The Better Choices
In this current bull market phase, the most significant propulsion seems to be the capital inflow triggered by the facilitation of ETF approval, alongside optimistic expectations for long-term inflows. Consequently, the main beneficiaries of this cycle are primarily BTC and potentially ETH (as a likely candidate for ETF listing). Taking into account the perspectives on GameFi, DePIN, DeFi, and L1s discussed earlier, achieving Alpha in this bull market seems more challenging, indicating that a strategic investment in BTC and ETH could yield a more favorable risk-reward balance compared to the last cycle.
When deliberating between $BTC and $ETH , both likely to gain from ETF endorsements, which emerges as the superior target?
From my perspective, $ETH might edge out in the short term. This is attributed to the market having already adjusted $BTC prices in anticipation of its ETF approval, with little else to drive its value post-April’s halving. $ETH , conversely, stands at a comparative low against $BTC , and with rising speculation about its ETF prospects, $ETH appears to offer better short-term potential than BTC.
Looking towards the future, $BTC could emerge as the more favorable investment choice. $ETH increasingly mirrors the characteristics of a technology stock, with its valuation closely tied to its role in providing blockchain capacity, similar to a Web3 cloud computing venture. This sector is marked by intense competition, with $ETH under continual threat of losing narrative appeal and market share to other blockchain capacity offerings (including L1s, Rollups, and DA projects) and a variety of novel technological solutions. Missteps in Ethereum’s technological development or delays in product updates could prompt investors to withdraw their support.
In contrast, $BTC ’s status as “digital gold” is progressively being cemented. The steady growth of its market valuation, coupled with the facilitation provided by ETFs, solidifies its position. The consensus on $BTC as a hedge against fiat inflation is slowly gaining endorsement, extending from financial institutions and publicly traded companies to smaller nations.
The once-popular argument that “$ETH ’s potential as a store of value could surpass $BTC ” is increasingly becoming a thing of the past.
Crafting a Bull Market Strategy: A Comprehensive Overview
Acknowledging the improved risk-reward proposition of favoring $BTC + $ETH in this cycle over the last doesn’t negate the value of diversifying with other altcoins. It simply suggests a more deliberate approach to determining their share in the investment portfolio.
The cornerstone of my current strategy involves:
Elevating the allocation for $BTC and $ETH.
Exercising restraint in investments within established sectors like DeFi, GameFi, DePIN, and NFTs.
Identifying and leveraging new tracks for seeking Alpha, including:
Memecoins: Positioned as the best speculative vehicles, with each cycle presenting renewed concepts, and known for generating remarkable wealth narratives, making them the easiest asset category to understand and the easiest to trigger widespread popularity.
AI-related Projects: Emerging as a fresh commercial frontier within Web3, gaining traction from non-crypto groups.
$BTC Ecosystem: Particularly the inscription assets, and to a lesser extent, Bitcoin L2 solutions. The former is favored for introducing a novel asset class in this cycle, whereas the latter is perceived as a conceptual iteration of Ethereum’s Rollups, essentially old wine in a new bottle.
Market Cycles Persist with an Accelerated Timeline
Furthermore, in the context of market cycles, my analysis diverges from the traditional pattern observed in past bull markets, where the year after a halving marked the primary ascent. Instead, I posit that the pinnacle of this cycle’s bull run will be in 2024, rather than 2025.
Historically, the Bitcoin halving events occurred in 2012, 2016, and 2020, with the forthcoming cycle slated for 2024.
Last year, Hithink Finance conducted a comparison of the returns of major financial assets over the last decade, summarized as follows:

Generally, Bitcoin adheres to a “three-year rising, one-year falling” principle, correlating with its price increasing in the year leading up to the halving, the halving year itself, and the year following, before undergoing a decline.
In the first cycle of Bitcoin’s halving, the halving year of 2012 saw a 186% price surge, followed by a monumental 5372% increase in 2013. The 2016 cycle followed a comparable pattern, with the $BTC price trend aligning with the principle of “moderate gains before the halving, followed by a substantial surge in the subsequent year.”
Nonetheless, this established pattern started to shift in the most recent cycle. Notably, the year preceding the halving, 2019, registered a notable rise of 93.4%, outpacing the 40.9% growth observed in 2015. The halving year of 2020 posted a 273% gain, surpassing the 62.3% increase recorded in the post-halving year, 2021.
The current cycle prominently showcases a shift towards an earlier “bull phase.” In 2023, the year before the halving, BTC achieved a 147.3% increase, outperforming the pre-halving year of the previous cycle (2019). As we venture into the first quarter of 2024, $BTC has already secured nearly a 60% increase.
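To make the “earlier bull phase” point concrete, the short TypeScript sketch below simply compounds the annual returns quoted in this section into cumulative figures. It is a back-of-the-envelope illustration using only the numbers cited here, not a price model.

```typescript
// Compound a sequence of annual returns (in percent) into a cumulative return.
// The figures are the ones quoted in this section; they are illustrative, not a full dataset.
function compound(annualReturnsPct: number[]): number {
  const growth = annualReturnsPct.reduce((acc, r) => acc * (1 + r / 100), 1);
  return (growth - 1) * 100; // cumulative return, in percent
}

const previousCycle = [93.4, 273, 62.3]; // 2019 (pre-halving), 2020 (halving), 2021 (post-halving)
const currentCycle = [147.3, 60];        // 2023 (pre-halving), ~60% as of Q1 2024

console.log(`2019-2021 cumulative: ${compound(previousCycle).toFixed(0)}%`);
console.log(`2023-Q1 2024 so far:  ${compound(currentCycle).toFixed(0)}%`);
```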
I believe that it’s highly probable that 2024 will be the year of the main bull run for this cycle. Waiting for a boom in 2025 may lead to missed opportunities; thus, strategically increasing your investment now seems to be the more prudent approach. The year 2025, in contrast, might be apt for scaling down investments and harvesting gains.
Lastly, I extend my best wishes for a prosperous and rewarding journey through this bull market to all.

Gelato: A Veteran in Web3 Developer Services Embarks on a RaaS Journey

👉Due to length constraints, only a part of the report is provided here. Please click the link for the complete content

https://mintventures.fund/pdf/Gelato-A-Veteran-in-Web3-Developer-Services-Embarks-on-a-RaaS-Journey

Author: Lawrence Lee, Researcher at Mint Ventures

Investment Thesis
Gelato has been deeply involved in the developer services field for many years and has developed a comprehensive suite of tools and services for developers. It is expected to achieve a breakthrough in business by integrating these offerings with its newly minted Rollup-as-a-Service (RaaS) platform, launched at the end of 2023.
The RaaS projects are currently in a phase of vigorous token issuance, with notable projects such as Altlayer, Dymension, and Saga having recently launched their tokens. In addition, the sector includes well-funded competitors such as Conduit and Caldera. Given the influx of attention and funding, the RaaS landscape is anticipated to remain a focal point of market interest and activity in the foreseeable future.
Risk Factors
Challenges in Revenue Generation: The business models of Gelato’s dual-core services, smart contract automation and Rollup as a Service (RaaS), present challenges in generating sustainable revenue streams.
Competitive Pressures: In the smart contract automation sector, Gelato faces formidable competition from Chainlink. In the RaaS domain, rivals such as Altlayer, Conduit, Caldera, and Dymension significantly challenge Gelato’s market position, as its competitive advantages are not sufficiently strong.
Limited Token Utility
Overview of Gelato
Gelato’s business covers nearly all aspects of developer services, such as account abstraction (AA) wallet services, multichain payment services, a Relay service that helps developers better onboard users, a Verifiable Random Function (VRF) used by NFT and game projects, and more. Among them, the two most important businesses are Automate (smart contract automation) and RaaS (Rollup as a Service).
Automate
We published a research report in December 2021 about Gelato, which interested parties can refer to for further information.
At that time, Gelato’s primary business strategy centered on “automated smart contracts.” This process involves the conditional automation of operations within smart contracts, specifically triggering operation B when condition A is met. The products introduced by Gelato included the following three features:
AMM Limit Orders: This feature automates trade execution when a token’s price hits a specified threshold. Gelato’s pioneering limit order service, Sorbet Finance, has been directly integrated into the platforms of major decentralized exchanges (Dexes) such as PancakeSwap, QuickSwap (the largest Dex on Polygon), and SpookySwap (the largest Dex on Fantom).
Loan Liquidation Protection: Designed to protect loans from liquidation by automatically managing the Loan-to-Value (LTV) ratio, this feature swaps collateral for debt and repays the debt when the LTV ratio reaches a critical level. Gelato introduced this feature through a consumer-oriented product, Cono Finance. This feature also received a grant from Aave and was integrated into Instadapp.
G-UNI, a Position Management Tool for Uniswap V3: This tool adjusts the liquidity provision (LP) market-making range on Uniswap V3 based on token price movements, optimizing the position management process.
In 2022, the strategic decision to spin off G-UNI as Arrakis Finance and plan for its token issuance marked a pivotal evolution in Gelato’s offering.
Beyond the three core functionalities previously outlined, Gelato’s Automate platform extends its capabilities to a vast array of use cases, becoming an indispensable tool for numerous DeFi protocols. Notably, it facilitates automatic yield harvesting within Yield Farming protocols and ensures timely updates to oracles, among other applications.
Gelato plans to upgrade its automation services to “Web3 Function” in June 2024. This strategic upgrade will broaden the scope of trigger conditions available to developers, empowering them to initiate on-chain transactions in response to a diverse array of off-chain data sources, including APIs and subgraphs. These advanced trigger conditions will be securely stored on IPFS before being seamlessly submitted to Gelato for execution.
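To make the “condition A triggers operation B” pattern concrete, the sketch below shows a bare-bones off-chain keeper in TypeScript (ethers v6). It is not Gelato’s SDK or its Web3 Functions interface; the vault contract, its ABI, and the 80% LTV threshold are hypothetical, echoing the loan liquidation protection use case described above.

```typescript
// A bare-bones off-chain keeper: check condition A on-chain, submit operation B when it holds.
// NOT Gelato's SDK or Web3 Functions interface; the vault contract, its ABI, and the 80% LTV
// threshold are hypothetical, for illustration only.
import { ethers } from "ethers";

const provider = new ethers.JsonRpcProvider("https://rpc.example.org"); // placeholder RPC endpoint
const signer = new ethers.Wallet("<KEEPER_PRIVATE_KEY>", provider);     // placeholder key

// Hypothetical lending vault exposing an LTV read and a deleverage action.
const vault = new ethers.Contract(
  "0x0000000000000000000000000000000000000000", // placeholder address
  [
    "function ltv(address user) view returns (uint256)", // LTV in basis points
    "function deleverage(address user)",
  ],
  signer
);

async function checkAndExecute(user: string): Promise<void> {
  const ltvBps: bigint = await vault.ltv(user); // condition A: read on-chain state
  if (ltvBps > 8000n) {                         // e.g. LTV above 80%
    const tx = await vault.deleverage(user);    // operation B: submit the transaction
    await tx.wait();
  }
}

// Naive polling loop; an automation network replaces this with its own executor
// infrastructure and, per the article, off-chain triggers such as APIs and subgraphs.
setInterval(() => {
  void checkAndExecute("0x0000000000000000000000000000000000000001");
}, 12_000);
```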
RaaS
As the year 2023 concluded, Gelato made a significant stride by launching its Rollup-as-a-Service (RaaS) offering. This innovative service is designed to guide developers through the selection of the optimal technology stack, thereby streamlining the Rollup deployment process.
Gelato’s RaaS offering has already integrated a multitude of infrastructure service providers:
Execution Layer Integration: Gelato has incorporated leading solutions such as the OP Stack, Polygon CDK, and Arbitrum Orbit.
Data Availability Layer Integration: Gelato has partnered with Ethereum, Celestia, and Avail.
Cross-Chain Solutions: Gelato has integrated with LayerZero and Connext.
Oracle Services: Gelato has integrated oracle services from RedStone, Pyth, and API3.
Indexers: Gelato has integrated with The Graph and Goldsky.
Fiat Payment and Other Services: Gelato also extends its offerings to include fiat payment solutions with MoonPay and Monerium, KYC services through Fractal ID, and wallet services via Safe.
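As a rough illustration of what a RaaS platform abstracts away, the hypothetical configuration object below enumerates the kind of stack choices just listed. The field names and shape are assumptions for illustration; they are not Gelato’s actual deployment schema.

```typescript
// Purely illustrative: the kind of stack selection a RaaS platform abstracts away.
// Field names and this shape are assumptions; they do not reflect Gelato's actual
// deployment schema. The options mirror the integrations listed above.
interface RollupDeployment {
  name: string;
  executionFramework: "op-stack" | "polygon-cdk" | "arbitrum-orbit";
  dataAvailability: "ethereum" | "celestia" | "avail";
  bridging: Array<"layerzero" | "connext">;
  oracles: Array<"redstone" | "pyth" | "api3">;
  indexers: Array<"the-graph" | "goldsky">;
}

const exampleAppchain: RollupDeployment = {
  name: "example-appchain",
  executionFramework: "op-stack",
  dataAvailability: "celestia",
  bridging: ["layerzero"],
  oracles: ["pyth"],
  indexers: ["goldsky"],
};

console.log(JSON.stringify(exampleAppchain, null, 2));
```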

Business Analysis
Automate
In the world of Web3, scenarios requiring the automatic execution of smart contracts are widespread, such as periodic reinvestment of earnings, regular salary payments, liquidity rebalancing, and more. For developers, designing and executing a complete set of monitoring, computation, and operational programs is both labor-intensive and time-consuming. Automation service providers can help developers avoid “reinventing the wheel”. For providers like Gelato, the marginal cost of serving new users is minimal: executing limit orders on Uniswap is no different from doing so on QuickSwap, which not only fosters economic synergies between Gelato and decentralized exchanges but also solidifies the business rationale underpinning such collaborations.
However, a potential challenge lies in the relatively low technical barriers to entry for the services Gelato provides, leading to a ceiling on the value developers are prepared to pay. This dilemma mirrors the experiences of Web2 automation platforms like IFTTT, which, despite offering valuable tools, struggle to convert free users into paying customers.
According to insights from IOSG, Gelato commands an impressive 80% share of the smart contract automation market. Achieving dominance in a niche market that also involves the Web3 infrastructure leader Chainlink is no small feat. Unfortunately, a high market share has not translated into stable revenue streams: the product finds itself in a state of being “well-liked but not paid for,” presenting hurdles to effective commercialization.
From a competitive standpoint, Gelato’s early market entry and current leadership are significant advantages. However, in the medium to long term, Chainlink possesses stronger brand recognition, superior developer engagement channels, more substantial financial resources, and the synergy of cross-selling with its array of services. For Gelato to sustain its competitive edge over Chainlink will not be easy.
RaaS
With the rapid development of Ethereum Layer2 solutions, the scalability issues that Ethereum once faced seem to have been largely addressed through Rollups. Especially with the upcoming Dencun upgrade, the cost of Rollups is expected to decrease significantly, laying the groundwork for potential widespread commercial adoption.
The embrace of Ethereum’s Layer2 solutions and the broader adoption of Rollup technology are expected to persist into the future. In the process of constructing Rollups, developers still face a series of issues and trade-offs to consider. These include selecting a Rollup solution that aligns with their project’s unique requirements, the intricacies of building and managing a Sequencer, mitigating MEV issues, and choosing appropriate oracles and indexing services. RaaS platforms, serving as one-stop service providers and offering a suite of toolkits, clearly have a relatively stable demand in this context.
Despite its relatively recent emergence, the RaaS field is characterized by a highly competitive environment. Gelato’s competitors in the RaaS field are as follows:

Drawing from an analysis of the current landscape, possible ways for RaaS providers to generate revenue or capture value have been identified:
Hosting sequencers and engaging in MEV extraction at the execution layer emerge as the most direct and promising revenue sources.
Becoming the settlement layer for Rollups or Appchains.
Instead of charging fees on user transactions, RaaS providers can explore revenue generation by offering a suite of integrated infrastructure services, such as wallets and explorers, and by engaging in technical consulting services.
RaaS providers can institute subscription fees for access to their services.
Furthermore, the Restaked rollup, which Altlayer is exploring in collaboration with Eigenlayer, is designed to utilize $ALT more as economic bandwidth, coupled with Restaking mechanisms, to capture value for the token. However, this method of value capture is not closely related to the RaaS services they provide.
Overall, due to the limited number of launched RaaS projects, viable business models of revenue generation remain uncertain. Yet, an analysis of the revenue and cost structure of existing Rollups illustrates the challenges RaaS providers face in revenue generation.
In the competitive landscape of RaaS providers, given the primary target audience consisting of developers and project builders, the ability to attract and retain developer interest becomes a pivotal factor. Despite the presence of unique technological features across different RaaS platforms, the scope and nature of the services offered are inherently influenced by the underlying framework on which they operate. This dependency on the core framework results in a level of service homogeneity among RaaS providers.
Given the relatively uniform service offerings within the RaaS space, a project’s ecosystem influence may be the deciding factor in its success.
For RaaS platforms with strong network effects, the business development capabilities are critical determinants of their long-term success and scalability.
In such a niche market that appears ripe with opportunities but may, in reality, be approaching saturation, Gelato does not necessarily have an advantage in terms of influence or business development prowess compared to its competitors. Rather, Gelato’s true strength is rooted in the team’s longstanding commitment to serving the developer community, allowing it to offer a more comprehensive suite of development tools.
Team, Fundraising, and Partners
Gelato’s co-founders, Hilmar Orth (X: @hilmarxo) and Luis Schliesske (X: @gitpusha), are both esteemed developers. They initially architected the core functionalities that underpin Gelato’s innovative products. They have been close friends since their university days and have worked together ever since. Before founding Gelato, they co-founded a startup aimed at pioneering new business models for large European enterprises through the strategic use of smart contracts. Their prowess and innovation were further showcased through their active participation and notable successes in a series of high-profile hackathons, including ETHParis, ETHBerlin, ETHCapeTown, and the Kyber DeFi Hackathon. These achievements paved the way for securing grants from Gnosis and MetaCartel, which were crucial in the establishment of the Gelato Network.
Gelato has conducted four rounds of fundraising, including three private rounds and one public round:
In September 2020, Gelato embarked on its fundraising journey with a seed round that culminated in $1.2 million, supported by investors including IOSG, Galaxy Digital, D1 VC, The LAO, Ming Ng, MetaCartel, and Christopher Jentzsch. The valuation of $GEL, Gelato’s native token, was pegged at $0.019 during this round.
In September 2021, Gelato announced a fundraising of $11 million from investors such as Dragonfly, ParaFi, IDEO, Nascent, and Stani Kulechov (the founder of Aave). The cost of $GEL for this round rose to $0.2971.
Also in September 2021, Gelato conducted a public sale that raised $5 million, with $GEL priced equally at $0.2971.
In December 2023, Gelato completed a bridge round led by IOSG. The specific details regarding the amount raised and the financing method remain undisclosed.
In addition to the fundraising rounds, Gelato received grants from Gnosis and MetaCartel at the inception of the project.
Partnerships have been a cornerstone of Gelato’s strategy, underpinning its RaaS offering and establishing it as a key player in the developer services industry. The project has numerous partners, which have been listed in the previous part.
Furthermore, a testament to Gelato’s innovation was its recognition as one of the winners of Most Valuable Builders III on BNB Chain in 2021.
Valuation
Whether in the realm of smart contract automation or RaaS, there exists a significant gap in accessing detailed revenue metrics for projects operating in these areas. This scarcity of precise financial data makes the process of accurately valuing these projects challenging. In this context, our analysis focuses on presenting the circulating market cap and the fully diluted market cap of various projects that are in direct competition with Gelato for reference.

👉Due to length constraints, only a part of the report is provided here. Please click the link for the complete content

https://mintventures.fund/pdf/Gelato-A-Veteran-in-Web3-Developer-Services-Embarks-on-a-RaaS-Journey

Covalent Network: The Hidden Gem of Decentralized Infrastructure

👉Due to length constraints, only a part of the report is provided here. Please click the link for the complete content

https://mintventures.fund/pdf/Covalent-Network-The-Hidden-Gem-of-Decentralized-Infrastructure

Author: Alex Xu, Research Partner at Mint Ventures

Overview of Covalent
Business Positioning
Covalent offers Blockchain Indexer services, providing a comprehensive suite of blockchain data APIs that enable developers to conduct queries across multiple blockchains.
Messari’s 2023 DePin sector map included indexers as a part of the DePin ecosystem, categorized under the “Digital Resource Networks” segment. Both The Graph and Covalent are featured as representative projects in this map.

Source: Messari
Target Clients
Covalent focuses on serving Business-to-Business (B2B) clientele, a sector comprising a diverse range of DApps and DeFi protocols, as well as centralized crypto enterprises. Notable clients include Consensys (dashboard), CoinGecko (data aggregator), Rotki (tax tools), NFTX (NFT curation), and Rainbow (crypto wallet).
Business Logic
Product Mechanism
Covalent’s core product is the Unified API, which facilitates data transfer between two modules: the client and the server. Through the API, the server can control its system and respond to client requests. Users such as application developers or analytics firms leverage the Unified API to extract a wealth of blockchain data. In the current stage, data providers – predominantly Covalent itself, with plans to integrate third-party providers – retain data ownership. Although many companies have built server-side infrastructure to provide access to blockchain data, most self-built server-side solutions are limited to the RPC layer and often only access unprocessed blockchain data from the target chain.

Covalent’s Business Process

“Base blockchain data” refers to the information that can be directly queried from the blockchain via RPC. However, more sophisticated data analyses, such as complex query execution and relational and trend analyses of historical data, require higher-level data processing and indexer services. These are the types of services provided by Covalent and The Graph.
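The difference is easier to see side by side. In the sketch below, the first call is the standard eth_getBalance JSON-RPC method, which returns only the current, unprocessed balance; the second is a schematic indexer-style request in the spirit of a unified API, where the URL and response shape are assumptions for illustration rather than Covalent’s exact endpoints.

```typescript
// "Base blockchain data": a raw JSON-RPC call returns only current, unprocessed state.
// eth_getBalance is a standard Ethereum JSON-RPC method; the RPC URL is a placeholder.
async function rawBalance(address: string): Promise<bigint> {
  const res = await fetch("https://rpc.example.org", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: 1,
      method: "eth_getBalance",
      params: [address, "latest"],
    }),
  });
  const { result } = await res.json();
  return BigInt(result); // balance in wei, at the latest block only
}

// Indexer-style query: historical, multi-token, pre-processed data in one request.
// The URL and response shape are schematic assumptions, not Covalent's exact API.
async function indexedBalances(chain: string, address: string): Promise<unknown> {
  const res = await fetch(
    `https://indexer.example.org/v1/${chain}/address/${address}/balances`,
    { headers: { Authorization: "Bearer <API_KEY>" } } // placeholder credentials
  );
  return res.json(); // e.g. a token list with historical quotes already computed
}
```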
Leveraging the Unified API, Covalent has also introduced strategic toolkits to facilitate customer integration and product-side presentation.
GoldRush
GoldRush is an open-source, modular blockchain explorer and toolkit provided by Covalent, designed to be integrated into various Dapps and Web3 applications.
In the context of NFT marketplaces like Blur or Web3 gaming platforms, integrating Covalent’s GoldRush module can embed a more user-friendly interface directly within their applications.

Source: NFT Wallet Token List

Having outlined Covalent’s industry mechanism, let’s explore how it integrates with tokenomics.
Decentralized Design
Covalent’s decentralized network architecture is designed to accommodate multiple network participants, commonly referred to as “Operators.” Currently, two primary roles have been activated within this network: Block Specimen Producers (BSP) and Refiners. Covalent’s network boasts 15 active BSPs, including notable participants such as Chorus One, Woodstock, StakeWithUs, and 1kx.
The proof and data storage solution based on Moonbeam is an interim measure. Covalent plans to launch its Layer 1 blockchain for ledger purposes and will migrate the staking of $CQT to Ethereum. This migration process is expected to commence by the end of February 2024.
Let us explore the four operator roles in Covalent’s decentralized indexer network:
Block-Specimen Producer (BSP), which is currently active within Covalent’s network. BSPs are responsible for uploading raw blockchain data to storage instances. These operators have the flexibility to either run these storage instances locally or outsource the task to storage operators. The latter contributes to increasing data availability, especially when proofs are loaded through IPFS and stored locally. The staking stats for BSPs are as illustrated in the following graph, with the current staking APR exceeding 10%. 

Staking Stats

Refiner, another active role in Covalent’s ecosystem, can access Block Specimens from storage instances and transform this raw data into queryable data objects, known as Block Results. They also publish proofs of their verification work. The following graph is a list of stakers for Refiners:

The list of Stakers for Refiners

Query Operator: Before responding to API queries, Query Operators load the transformed data into local data warehouses.
Delegator: Each network operator, after fulfilling their assigned duties, receives compensation. This compensation is contingent upon the confirmation of each proof by the Delegators for a specified period. Before payment, a group of operators is randomly selected from the pool of network operators to serve as Delegators for the audited period.
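The four roles can be read as stages in a single data pipeline. The TypeScript sketch below is a schematic model of that flow; the type names and fields are illustrative assumptions, not Covalent’s actual data structures.

```typescript
// Schematic model of the pipeline described above. Type names and fields are
// illustrative assumptions only, not Covalent's actual data structures.
type OperatorRole = "BlockSpecimenProducer" | "Refiner" | "QueryOperator" | "Delegator";

interface BlockSpecimen {
  chainId: number;
  blockHeight: number;
  rawPayloadCid: string; // e.g. an IPFS CID pointing at the stored specimen
}

interface BlockResult {
  specimen: BlockSpecimen;
  transformedObjects: unknown[]; // queryable data objects produced by a Refiner
  proof: string;                 // proof of the Refiner's verification work
}

// Query Operators load Block Results into local warehouses to serve API queries;
// Delegators confirm proofs before compensation for the audited period is released.
function settle(role: OperatorRole, proofConfirmed: boolean): string {
  return proofConfirmed
    ? `${role}: compensation released`
    : `${role}: payment withheld pending proof confirmation`;
}

console.log(settle("Refiner", true));
```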
Currently, the operator roles are filled by members who are part of a whitelist established by the Covalent Foundation. The Covalent Foundation has plans to gradually open these roles to a broader pool of applicants in the future.

Other Business Operations
Ethereum Wayback Machine

Following the implementation of EIP-4844, a new data structure called “blob” will be introduced. Blob is primarily used for storing information that doesn’t need permanent retention on the blockchain and requires short-term propagation across the network.  Blob serves as a temporary storage mechanism on the Ethereum network. The decision regarding the duration and preservation of blob data is left to the discretion of individual nodes, leading to concerns about the long-term data availability for blob data.
Covalent’s Ethereum Wayback Machine is an open-source solution launched to address the issue of long-term data availability. It aims to provide decentralized, cryptographically secure access to historical data for users, positioning it as a form of Data Availability (DA) solution.
For more information, you can read The Ethereum Wayback Machine.
Ethereum Wayback Machine is currently under development and has not yet been officially launched.
It is worth noting that Ethstorage, another project targeting decentralized blob storage, finished a $7 million funding at a $100 million valuation in its seed round in July 2023.
Business Status
Number of Blockchains Supported by the Indexer Service
Covalent currently offers comprehensive historical transaction data for over 211 different blockchain networks.
User Base

Fee Collection Details
Covalent’s 2023 Revenue Performance and Comparative Analysis with The Graph
The Covalent team provided us with some revenue data as follows:
Covalent’s revenue from the indexer in 2023 amounted to $600,000. This figure is noteworthy as it represents the first year of Covalent’s formal commercial operations, with the revenue journey starting from a baseline of $0 in January 2023.
Covalent gained over 150 paying customers, predominantly consisting of various institutions and projects.
Covalent projects an ambitious 100% growth in revenue for the year 2024.
The Graph, another prominent player in this domain, has reported annualized revenue of over $100,000 based on the most recent three months.

Source: Query Fees Paid

The Covalent team plans to launch their detailed business metrics in the coming weeks, allowing users to access in-depth information on Covalent’s revenues.
Overview of Covalent’s Clientele
Covalent has established itself as a key player in the B2B sector of blockchain services. Its clientele spans various categories:
Wallets and Data Dashboards: Popular wallets and data dashboards, including Rainbow and Zerion, leverage Covalent’s API to aggregate historical balances and track the profits of DeFi and NFT assets. 
Data Aggregator: Platforms like CoinGecko use Covalent’s services to present detailed market data, including price trends, liquidity metrics, and investment returns.
Cross-Chain Projects: Cross-chain liquidity aggregators such as Li Finance rely on Covalent for accessing asset pricing information across different networks.
Cryptocurrency Taxation: Portfolio trackers like Rotki utilize it to extract cross-chain historical balances and pricing data for tax reporting purposes.
DeFi Platforms: Aave, Balancer, Paraswap, Curve, Lido, Frax, and Yearn use Covalent’s services to integrate user data from various chains.
Centralized Exchanges: To comply with tax regulations, exchanges need to extract users’ historical transaction data to generate reports.
Traditional Finance and Custodial Institutions: Fidelity, a global wealth management firm; EY (Ernst & Young), one of the Big Four accounting firms; and Jump Crypto, a notable name in the crypto space, are all among its clientele.
AI Training and Decision Making: Covalent provides crucial on-chain data to projects like Nomis.cc, a multi-chain identity and reputation protocol, and Network3, a distributed computing protocol. This data helps train and enhance decision-making processes for AI models, especially in large-language models.
Additionally, Consensys, the parent company of Metamask, Infura, and Linea, also forms part of Covalent’s diverse client base.
In a strategic expansion move, Covalent has started collaborating with RPC service providers like Chainstack, QuickNode, and Infura. This partnership enables Covalent to offer its indexer services through these providers’ channels, broadening its reach. Notably, the fees generated through Infura alone have already exceeded $100,000.
Team, Financing, and Partners
Founders and Team Composition
Covalent was founded and led by Ganesh Swami and Levi Aul.
Ganesh Swami, with a strong foundation in physics and more than a decade of experience in data analysis, brings a unique perspective to Covalent. He took his first company public on the New York Stock Exchange and has also climbed as a professional mountaineer, with expeditions including Mount Everest. Levi Aul established the first Bitcoin exchange in Canada and was part of the IBM team that developed CouchDB. The core team at Covalent, numbering between 40 and 60 members, comprises a diverse group of professionals, including network architects, data scientists, and software engineers.
Overall, the founders’ backgrounds and their entrepreneurial successes underscore a strong alignment with Covalent’s mission. 
Valuation
To assess Covalent’s valuation, we take a comparative approach, establishing a relative valuation against two key projects in the decentralized indexing service sector: The Graph and Pocket Network.
As a direct competitor, The Graph’s valuation is particularly relevant; comparing Covalent with The Graph is crucial for understanding Covalent’s competitive position and potential in the market. Pocket Network operates a decentralized RPC service and sits upstream in the same industry, so its valuation also provides a useful reference point.

A key observation in this comparison is the nearly 7-fold gap in FDV between the two, with The Graph valued far higher than Covalent. Considering that Covalent’s annualized revenue is approximately 6 times that of The Graph, Covalent’s valuation appears quite attractive.
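Taking the ratios quoted above at face value (The Graph’s FDV roughly 7x Covalent’s, Covalent’s annualized revenue roughly 6x The Graph’s), a quick back-of-the-envelope comparison of FDV-to-revenue multiples looks as follows. The figures are normalized placeholders, not market quotes.

```python
# Normalized units: set Covalent's FDV and revenue to 1.0 each.
covalent_fdv, covalent_rev = 1.0, 1.0
graph_fdv = 7.0 * covalent_fdv      # "nearly 7-fold gap in FDV"
graph_rev = covalent_rev / 6.0      # Covalent revenue ≈ 6x The Graph's

covalent_multiple = covalent_fdv / covalent_rev   # 1.0
graph_multiple = graph_fdv / graph_rev            # 42.0

print(f"The Graph trades at ~{graph_multiple / covalent_multiple:.0f}x "
      f"Covalent's FDV/revenue multiple")
```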
Covalent also carries a lower valuation than Pocket Network despite having higher revenue, though the two are closely matched overall, each valued in the range of several hundred million dollars. Notably, POKT has risen nearly 12x over the past three months, completing a remarkable round of “value discovery.”

In contrast, $CQT has seen a more modest increase of about 80% over the same three-month period. This growth appears more aligned with the broader market rebound than with any specific capital attraction or investor focus on Covalent.

The Risks
There are two primary risks facing Covalent worth noting:
- If larger centralized blockchain data service providers such as Alchemy, Infura, and QuickNode expand from RPC into the indexing service field, it could squeeze Covalent’s market share and pricing power. For instance, Alchemy completed the acquisition of the indexing platform Satsuma in September 2023.
- The indexer sector remains relatively niche and under-recognized among general investors, lacking significant attention. This could continue unless catalyzed by high-impact events or significant market shifts.
Reference
Special thanks to Leo and Bruce from the Pocket Network community for their valuable review and insightful comments on this report.
Messari: Covalent: A Unified API for Retrieving Blockchain Data
1KX: Indexing the universe of blockchains with Covalent
The Graph Dashboard: https://thegraph.com/explorer/network?chain=mainnet
Pocket Network Dashboard: https://poktscan.com/

👉Due to length constraints, only a part of the report is provided here. Please click the link for the complete content

https://mintventures.fund/pdf/Covalent-Network-The-Hidden-Gem-of-Decentralized-Infrastructure

Exploring Rollup Summer: Analyzing Narratives and Uncovering Investment opportunities

In this article, we will explore emerging trends in the Rollup market, its future trajectory, and potential investment opportunities. The discussion focuses on several key topics:
- What is Rollup Summer?
- Case Studies: ZKFair and Manta
- Projecting the Evolution of Rollup Summer
- Opportunities in the Secondary Market: Investing in the Rollup Summer Narrative
The insights offered herein represent a snapshot of my analysis, framed by the knowledge available at the time of writing. The analysis is mainly from a business standpoint, with less emphasis on the technical specifics of Rollup. There may be factual inaccuracies or biases. This article is intended for discussion purposes only, and feedback is welcomed.
What is Rollup Summer?
Echoing the Defi Summer of 2020 and Inscription Summer in 2023, the term ‘Rollup Summer’ has been coined to forecast a significant surge in Rollup technology adoption. This expected trend goes beyond merely an increase in the number of Rollup projects; it’s projected to include substantial growth in key business metrics such as Total Value Locked (TVL), user engagement, and the expansion of the Rollup ecosystem. While there are early indicators, as of now, this concept remains largely a theoretical forecast.
The hallmarks of Rollup Summer include:
- The initiation and launch of a myriad of new Rollup projects and application chains
- An influx of users, capital, and developers into the Rollup ecosystem, surpassing the rate of the previous year
- A synergy between business metrics and asset values, driving a swift increase in the overall market value of the Rollup sector and associated projects
Rollup Summer is anticipated to start towards the end of 2023 and early 2024, potentially evolving into a significant theme spanning half a year or even longer.
The driving force behind Rollup Summer is:
The Flywheel Effect Generated by New Rollups
Rollups are not a new concept: established players like Arbitrum and Optimism already demonstrate a significant market presence, and other ZKRollups such as Starknet and zkSync have been in operation for a considerable period.
The spotlight during Rollup Summer is expected to shine on the new Rollup projects of this cycle, driven by several key factors:
- The Concept of Modular Blockchains Gains Traction: The idea of modular blockchains has been widely embraced, streamlining the infrastructure for Rollup components. Developments like the OP Stack, Celestia (DA layer), decentralized sequencers, and various Rollup-as-a-Service (RaaS) offerings are enhancing the feasibility of developing and sustaining a Rollup on a modular foundation.
- Bold Strategies by New Rollups: With substantial budgets for initial launches and incentives, these new Rollups are adopting aggressive tactics in token distribution and incentives. This approach is generating heightened interest among users and investors.
- Significant Growth Potential: The relatively lower market value at the start of these new projects creates opportunities for substantial growth.
- A Favorable Market Climate: The current bullish market and optimistic sentiment are acting as catalysts, further propelling the interest and growth in the Rollup space.
In essence, the new wave of Rollups is characterized by a combination of novel projects, tokenization strategies, modular blockchain designs, and generous incentive programs. These elements collectively contribute to accelerated momentum in both initial business activities and token pricing dynamics.
The momentum of ‘Rollup Summer’ might be initially driven by token airdrop campaigns. These campaigns draw in user assets, elevating crucial business metrics like TVL, which in turn boosts the market cap of the projects. 
The next phase could involve token ecosystem airdrop schemes, directly subsidizing decentralized applications (dApps) within the ecosystem, indirectly benefiting the users as well. The result is a further influx of user assets, enhancing core metrics like TVL, active users, and gas fees. This step amplifies the project’s market cap and can lead to newer Rollups outperforming their older counterparts. The rapid success and visibility of these new Rollups can trigger market FOMO (Fear of Missing Out).
The Cancun Upgrade
The Ethereum Cancun upgrade, scheduled for February, is another pivotal factor contributing to the Rollup Summer phenomenon, particularly enhancing the prospects of established Rollups like Arbitrum and Optimism. The upgrade is expected to significantly lower Layer 1 costs for Rollups and expand their profit margins. It will also capture the entire market’s attention, spotlighting the Rollup arena and funneling both attention and capital toward the new Rollup projects.
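To see why blobs matter for Rollup margins, the sketch below compares the rough L1 cost of posting the same batch via calldata (16 gas per non-zero byte under current rules) versus via a blob (one blob carries roughly 128 KB and consumes 131,072 blob gas, priced by a separate fee market). The batch size and gas prices are arbitrary assumptions for illustration only; real costs fluctuate with demand.

```python
BATCH_BYTES = 120_000             # assumed rollup batch size
CALLDATA_GAS_PER_BYTE = 16        # cost of a non-zero calldata byte
GAS_PER_BLOB = 131_072            # blob gas consumed per blob (EIP-4844)

# Assumed prices, in gwei; purely illustrative.
exec_gas_price_gwei = 30
blob_gas_price_gwei = 1

calldata_cost_eth = BATCH_BYTES * CALLDATA_GAS_PER_BYTE * exec_gas_price_gwei * 1e-9
blob_cost_eth = GAS_PER_BLOB * blob_gas_price_gwei * 1e-9

print(f"calldata: ~{calldata_cost_eth:.4f} ETH, blob: ~{blob_cost_eth:.6f} ETH")
```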
Case Studies of New Rollups: ZKFair and Manta
In analyzing the operational frameworks and distinguishing features of new Rollups, ZKFair and Manta emerge as significant case studies, embodying the spirit of this Rollup Summer.
ZKFair
https://ZKFair.io/

ZKFair’s Key Features as a Rollup include:
- Built on Polygon CDK: ZKFair is built on the Polygon Chain Development Kit and employs Celestia for its Data Availability (DA) layer, which is managed by an in-house data committee. ZKFair is also compatible with the Ethereum Virtual Machine (EVM).
- USDC is used as the gas token.
- The native token of ZKFair, ZKF, is entirely allocated to the community. A notable 75% of ZKF tokens are distributed in four phases within 48 hours to participants in gas-consumption activities. In essence, participants take part in the token’s initial market sale by paying gas to the official sequencer, at an implied primary-market valuation of just $4 million.
ZKFair attracted over $110 million in funds to participate in the activity

ZKFair commits to returning 100% of its operational profits to the community. The distribution of profits is well-structured, with 75% allocated to ZKF LP (Liquidity Provider) stakers and 25% to the eligible dApp developers. The Profit here is defined as total fees minus operational costs, which include expenses for essential Rollup infrastructure like explorers and oracles.
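A simplified worked example of the two mechanisms above, with all amounts hypothetical: a participant’s share of a distribution phase is proportional to the gas they burned, and sequencer profit (fees minus operating costs) is split 75/25 between ZKF LP stakers and eligible dApp developers.

```python
# --- pro-rata airdrop within one phase (hypothetical figures) ---
phase_allocation_zkf = 2_000_000_000   # ZKF allocated to this phase (assumed)
total_gas_usdc = 30_000_000            # total gas burned by all users (assumed)
my_gas_usdc = 3_000                    # gas burned by one participant (assumed)
my_zkf = phase_allocation_zkf * my_gas_usdc / total_gas_usdc
print(f"ZKF received from this phase: {my_zkf:,.0f}")

# --- profit split (hypothetical figures) ---
total_fees_usdc = 500_000
operating_costs_usdc = 100_000         # explorers, oracles, other infrastructure
profit = total_fees_usdc - operating_costs_usdc
to_lp_stakers = 0.75 * profit
to_dapp_devs = 0.25 * profit
print(f"LP stakers: {to_lp_stakers:,.0f} USDC, dApp developers: {to_dapp_devs:,.0f} USDC")
```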
The distribution of $ZKF based on each participant’s share of total gas consumption during the event turned the sale into a competitive ‘battle of capital’: a large number of users bridged USDC to the ZKFair mainnet and actively engaged in the campaign. During the airdrop period, ZKFair experienced a meteoric rise in on-chain TVL, soaring from zero to approximately $140 million in just 2-3 days. This remarkable growth rivaled that of Starknet, a top ZKRollup project at the time. Even post-airdrop, ZKFair has managed to maintain a Total Value Locked (TVL) exceeding $70 million, surpassing Scroll, another earlier-launched ZKRollup project.
ZKFair’s operational strategy presents several clear advantages:
- Rapid Market Entry and Popularity Surge: ZKFair’s TVL soared from zero to over $100 million within just three days and has maintained a TVL of over $70 million. With over 334,000 addresses participating, ZKFair ranks among the fastest-growing Rollups in terms of user adoption.
- Immediate Token Utility and Value Addition: The model where sequencer fees are allocated directly to users and developers not only incentivizes participation but also enhances the intrinsic value of $ZKF.
ZKF’s Fully Diluted Valuation (FDV) is currently around $100 million. When compared horizontally with other Rollups, its market value is still relatively modest.

Manta
https://Manta.network/

Key Features of Manta Rollup:
- Manta utilizes Celestia for its DA layer and Polygon’s zkEVM for EVM compatibility.
- Manta offers an extensive range of products beyond the Rollup, particularly featuring a suite of services built on Zero-Knowledge (ZK) technology.

Manta’s Rollup was officially launched in September 2023, marking its entry into the competitive Rollup market. A significant increase in business activity was observed following the introduction of a ‘new paradigm’ event in mid-December. As per the latest official data, Manta’s TVL has approached an impressive $750 million.

The following graph depicting Manta’s TVL highlights a significant growth inflection following the “New Paradigm” event, indicating a strong market response to its offerings and strategies.

Moreover, Manta boasts a stellar lineup of investors, with its latest funding round valuing the project at $500 million.
Source: https://www.rootdata.com/

The New Paradigm event operates on a straightforward concept, building on and enhancing the model pioneered by Blast (from the Blur team). Here’s a detailed look at how it works and its expected impact:
- It incentivizes the transfer of funds across chains by offering Manta tokens as rewards. This strategy aims to drive a rapid increase in TVL.
- Users transferring assets cross-chain are guaranteed at least a baseline crypto-world interest rate, which combines the Ethereum PoS yield with the treasury-bond yields provided by stablecoin issuers. This approach minimizes the opportunity cost of capital for users.
- Airdrop incentives are designed to encourage users to engage with dApps on the Manta platform after the cross-chain transfer. This not only drives user activity but also bolsters developer confidence in the ecosystem.
New Paradigm will distribute 45 million Manta tokens, representing 4.5% of the total supply, to NFT holders. Based on the current secondary market data for NFTs (as of January 5, 2024), the estimated FDV of Manta post-token launch is projected to be between $1.5 and $2.5 billion.
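The FDV range above is inferred from NFT prices. The logic, with hypothetical inputs, is sketched below: each airdrop NFT entitles its holder to a fixed number of MANTA, so the NFT’s market price implies a price per token, and multiplying by the total supply (45 million is 4.5%, so total supply is 1 billion) gives an implied FDV. The NFT price and tokens-per-NFT figures are placeholders, not market data.

```python
TOTAL_SUPPLY = 45_000_000 / 0.045   # 45M tokens = 4.5% of supply -> 1,000,000,000

# Hypothetical inputs for illustration only.
nft_price_usd = 600                 # assumed secondary-market NFT price
manta_per_nft = 300                 # assumed MANTA claimable per NFT

implied_token_price = nft_price_usd / manta_per_nft
implied_fdv = implied_token_price * TOTAL_SUPPLY
print(f"implied MANTA price ≈ ${implied_token_price:.2f}, "
      f"implied FDV ≈ ${implied_fdv / 1e9:.1f}B")
```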

Beyond ZKFair and Manta, a plethora of similar new Rollups are making their mark, such as the earlier-launched Blast and Layer X, which merges elements from both ZKFair and Manta. 
Additionally, existing ZKRollup projects like Scroll and zkSync, which have yet to issue their tokens, are well-positioned to adopt frameworks similar to those of ZKFair and Manta. This adaptation could include implementing clearer token reward mechanisms, potentially attracting more users and capital to their ecosystems.
Projected Developments Following Rollup Summer
As we project the developments following Rollup Summer, we can map out the likely path and challenges for the Rollup market:
- New Rollup projects are expected to launch and quickly gain momentum, driven by strategies such as token airdrops and engagement campaigns. These methods will likely lead to optimistic market expectations, fueled by rapid increases in data and enhanced mechanisms, which are already in progress.
- As these Rollup projects launch their tokens or get listed on exchanges, a significant wealth effect is anticipated, similar to what was observed in ZKF’s 25-30x growth. This phenomenon is likely to attract more participants and capital into the Rollup space.
- The focus of token incentives is expected to evolve from primarily boosting TVL to nurturing active ecosystems, increasing gas fee consumption, and introducing new projects. The market’s attention will shift from TVL to the long-term sustainability of ecosystem growth, where more complex challenges may arise.
- An influx of new Rollups debuting simultaneously could lead to a dilution of capital and interest. This saturation might decrease the wealth effect associated with new Rollup tokens, potentially resulting in a decline in the quality of projects and the emergence of fraudulent schemes or rugs, thereby undermining market confidence.
- As the initial excitement of Rollup Summer cools down, competition within the Rollup arena is expected to settle into a more stable equilibrium. However, the strategies developed during this period for initial business activation, particularly in token distribution, might become a blueprint for future Rollup projects.
This reveals that the primary challenge for new Rollups and Rollup Summer, following their initial successes, is the incentivization of ongoing ecosystem development post-initial token distribution. The focus shifts to developing a robust and engaging ecosystem within these Rollups that can retain users and capital while continuing to create a wealth effect. This includes a range of offerings from Memecoins and DeFi protocols to additional airdrops and elaborate Ponzi schemes.
Opportunities in the Secondary Market: Investing in the Rollup Summer Narrative
What secondary targets stand to gain from Rollup Summer’s evolution? Emerging from Rollup Summer, potential sub-narratives include:
Speculation 1: Celestia as the “Ethereum” of the Rollup Era
New Rollup projects, including ZKFair and Manta, are increasingly adopting Celestia’s DA product. This growing adoption is a strong indicator of its transition from concept to reality. The successful application of Celestia’s DA layer in these leading projects is likely to encourage more emerging Rollups to adopt a similar approach. This success may lead to a trend where adopting Celestia’s DA layer becomes standard practice for new Rollups, positioning Celestia as a foundational consensus layer and a key piece of infrastructure in the Rollup era.
This leads to the speculation that Celestia will become the “Ethereum” of the Rollup era. Whether this narrative is eventually realized may not matter in the near term: Celestia’s current market cap of $2.3 billion, compared with Ethereum’s $270 billion, already suggests vast growth potential.

sourcehttps://www.coingecko.com/
Speculation 2: Interest-Bearing Assets as the Emerging Standard
The shift to interest-bearing assets, particularly interest-bearing stablecoins like sDAI, in place of mainstream stablecoins has traditionally been gradual. In the airdrop campaigns of Rollups like Blast and Manta, however, interest-bearing ETH and stablecoins have become almost standard offerings. If more Rollups adopt interest-bearing assets as part of their strategies, it could significantly contribute to expanding the user base and establishing new investment habits within the crypto community. Rollups that integrate yield-generating assets into their ecosystems can directly benefit the issuers of these assets. The partnerships between Blast and entities like Lido and MakerDAO, and between Manta and Stakestone and Mountain Protocol, exemplify how such collaborations can be mutually beneficial.
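To illustrate how an interest-bearing stablecoin such as sDAI differs from a plain stablecoin, the sketch below models an ERC-4626-style vault in which a fixed share balance redeems for a growing amount of the underlying as yield accrues. The rate and period are arbitrary assumptions, and the accrual formula is a simplification, not the actual DSR mechanics.

```python
def redeemable_underlying(shares: float, initial_share_price: float,
                          apr: float, days: int) -> float:
    """Value of a fixed share balance after `days` of simplified daily accrual."""
    daily = apr / 365
    share_price = initial_share_price * (1 + daily) ** days  # share price drifts up
    return shares * share_price

shares = 10_000  # sDAI-style shares held; the balance itself never changes
print(f"redeemable DAI after 90 days ≈ "
      f"{redeemable_underlying(shares, 1.00, 0.05, 90):,.2f}")
```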
Speculation 3: Rollups Overtaking Layer1 Market Share
The user experience between Rollups and traditional L1 blockchains has become nearly indistinguishable. The current cycle in the blockchain space might mirror the previous Layer1 public chain wave but with a focus on sovereign Rollups. In their competitive landscape, Rollups are poised to contest directly with Layer1s for user base and capital. Additionally, rapidly emerging native dapps on these Rollups offer significant opportunities. For instance, Manta’s lending project, LayerBank, with a deposit volume exceeding $300 million, demonstrates the potential for Rollups to host top-tier projects, rivaling those on established platforms like Arbitrum.
While there is optimism about the swift advancement and potential dominance of Rollups in the blockchain market, it is essential to remain mindful of the risks. 
Challenges and Risks in the Rollup Summer Narrative
The principal challenge for new Rollups is smoothly transitioning from the initial launch phase, which often involves attracting capital via token airdrop incentives, to a sustainable growth phase. The latter requires more complex strategies to ensure long-term user engagement and capital retention. Several Layer1 blockchains in the previous cycle attempted similar strategies, such as launching large-scale incentive funds to attract top DeFi projects on Ethereum. These DeFi platforms then redistributed these incentives to their users, initially spurring significant business growth, but the impact often diminished over time as similar strategies became widespread across various public chains.
Even established Rollups like Arbitrum and Optimism continue to use their tokens as subsidies for ecosystem projects, which in turn allocate these to end-users to maintain activity and capital in their platforms. For instance, Arbitrum’s recent round of incentives exceeded 70 million $ARB tokens, valued at over $140 million.
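The implied token price behind that incentive figure is a one-line calculation; both inputs are the round numbers quoted above.

```python
arb_tokens = 70_000_000
usd_value = 140_000_000
print(f"implied ARB price ≈ ${usd_value / arb_tokens:.2f}")  # ≈ $2.00
```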
Considering that Rollup Summer is still in its initial, embryonic stage, the future is shrouded in uncertainty. Crypto investors must pay attention to the evolving strategies and development of new Rollup projects. 
Evaluating Vitalik’s Proposals on Ethereum Staking: An Analysis of Their Potential Effects
By Hank Han, Researcher at Mint Ventures
Introduction
The landscape of Ethereum staking, along with its derivatives, has been a focal point in the crypto community for the last couple of years. Key developments like the Beacon Chain, The Merge, and Shapella, alongside innovations in Liquid Staking Tokens (LST), Derivative Staking Tokens (DST), Restaking, and LST-fi, have marked the rapid evolution of Ethereum staking. This surge is largely attributed to fundamental shifts in Ethereum’s staking model. It’s crucial to explore how Ethereum’s staking framework will continue to evolve and its consequential effects on the ecosystem, including various stakeholders and staking derivatives.
Vitalik Buterin, in his article published on October 7 titled “Protocol and Staking Pool Changes That Could Improve Decentralization and Reduce Consensus Overhead,” puts forth a series of optimization proposals for the existing Ethereum staking mechanism. These suggestions offer a reference path for further reducing centralization and minimizing consensus overhead in Ethereum. Some of these ideas could significantly revamp the staking mechanism while aligning with the primary trends in Ethereum’s development. Therefore, we will interpret this article and analyze the potential impacts of these proposals on the staking paradigm and its broader implications for the Ethereum ecosystem.
Overview of Vitalik’s Insights
The Status of Two-Tiered Staking
Vitalik Buterin describes the current Ethereum staking landscape as predominantly two-tiered, with two classes of participants:
- Node operators: individuals or entities actively running Ethereum nodes.
- Delegators: participants who stake some quantity of ETH in any other way beyond running a node.
The prevalent method for staking in this environment is through staking pools offering Liquid Staking Tokens (LSTs), notable examples being Lido and Rocket Pool.
Existing Challenges
This emergent two-tiered staking has brought two main flaws:
- Centralization risk in node operators. After delegators finish staking $ETH, service providers like Lido assume the responsibility of node selection, inherently carrying the risk of centralization. For instance, in a DAO-voting mechanism where Lido dictates node operators, there’s a tendency for operators to accumulate significant holdings of $LDO tokens to enhance their market share. Similarly, Rocket Pool’s model, which allows anyone to become a node operator by submitting an 8 ETH deposit, favors financially robust operators who can effectively “purchase” market share.
- Needless consensus layer burden. The current staking model imposes a significant load on Ethereum’s consensus layer, which is tasked with aggregating and verifying about 800,000 signatures per epoch. Achieving Single Slot Finality (SSF) would demand the same volume of signatures to be processed per slot, effectively condensing the time frame to 1/32nd of its original duration. This intensifies the hardware requirements for nodes.
Under the current two-tiered staking structure, most of the verification work is carried out by node operators. Although there is a large number of validators, the diversity of these validators is limited. Consequently, increasing node numbers doesn’t necessarily decentralize the network but rather amplifies the consensus layer overhead.
A potential solution could be to reduce the number of validating nodes (and thus the number of signatures needed), which might initially seem to favor centralization. However, accompanying strategies to mitigate centralization risks are discussed in subsequent sections.
Glossary
Slot: The time allotted for a validator to propose a new block in the proof-of-stake system. In Ethereum, a slot is approximately 12 seconds. In each slot, the network randomly selects one validator as the block proposer, who is responsible for creating a new block and broadcasting it to other nodes in the network. Additionally, a committee of validators is randomly chosen for each slot, and their votes determine the validity of the proposed block. Importantly, not every validator participates in the validation process for each slot; only those selected for the committee are engaged in active validation. Achieving consensus on the slot’s state requires the affirmation of two-thirds of the committee’s votes. This selective participation of validators in different slots is a strategic design to optimize network efficiency and manage load.
Epoch: A period of 32 slots. In Ethereum, an epoch is approximately 6.4 minutes. Within any given epoch, a validator is limited to joining a single committee. Throughout the epoch, all active validators on the network are obliged to submit evidence of their ongoing active status. The first slot of each epoch (under normal circumstances) is also known as the checkpoint.
Finality: In a distributed network, a transaction has “finality” when it becomes part of a block and cannot be reverted unless an attacker commits to losing a large amount of staked ETH, causing a blockchain rollback. Ethereum manages finality through “checkpoint” blocks. If a pair of checkpoints (the first slots of adjacent epochs) attracts votes representing at least two-thirds of the total staked ETH, the checkpoints are upgraded: the more recent of the two (the “target”) becomes “justified,” while the earlier one, which was already justified as the target of the previous epoch, advances to “finalized” status. On average, finality for a typical transaction occurs in about 2.5 epochs, or around 16 minutes. This duration is calculated based on the transaction’s placement in the middle of an epoch and the time taken for subsequent checkpoints to become justified and then finalized. Ideally, justification of an epoch’s checkpoint occurs at its 22nd slot, leading to an average transaction finality time of approximately 14 minutes.
Single Slot Finality (SSF): The property that blocks can be proposed and finalized within the same slot. The current time to finality has turned out to be too long: most users do not want to wait 15 minutes for finality, and it is inconvenient for applications that might want high transaction throughput. Having a delay between a block’s proposal and finalization also creates an opportunity for short reorgs that an attacker could use to censor certain blocks or extract MEV. The mechanism that deals with upgrading blocks in stages is also quite complex and has been patched several times to close security vulnerabilities, making it one of the parts of the Ethereum codebase where subtle bugs are more likely to arise. These issues could all be eliminated by reducing the time to finality to a single slot.
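The timing figures in the glossary follow directly from the slot and epoch constants. The sketch below reproduces them: an average finality of roughly 2.5 epochs for a transaction landing mid-epoch, and one decomposition (half an epoch remaining, plus a full epoch, plus justification around slot 22) that reproduces the ~14-minute ideal-case figure quoted above.

```python
SECONDS_PER_SLOT = 12
SLOTS_PER_EPOCH = 32
epoch_seconds = SLOTS_PER_EPOCH * SECONDS_PER_SLOT        # 384 s ≈ 6.4 minutes

avg_finality_seconds = 2.5 * epoch_seconds                # tx lands mid-epoch, waits ~2.5 epochs
print(f"epoch ≈ {epoch_seconds / 60:.1f} min, average finality ≈ {avg_finality_seconds / 60:.0f} min")

# One decomposition that reproduces the ~14-minute ideal-case figure above.
ideal_seconds = (SLOTS_PER_EPOCH / 2 + SLOTS_PER_EPOCH + 22) * SECONDS_PER_SLOT
print(f"ideal-case finality ≈ {ideal_seconds / 60:.1f} min")
```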
As part of Ethereum’s long-term roadmap, particularly in The Merge branch, SSF is a crucial milestone. However, SSF is still in the research phase and is not expected to ship for several years, likely after other substantial upgrades such as Verkle trees and Danksharding.
Proposed Solutions by Vitalik
Vitalik Buterin suggests that the current role of delegators in Ethereum staking is not as impactful as intended. He advocates for empowering delegators with more rights and responsibilities to address existing challenges. The two main strategies proposed are ‘Expanding Delegate Selection Powers’ and ‘Consensus Participation’.
Expanding Delegate Selection Powers
Expanding delegate selection powers aims to provide delegators with greater autonomy in choosing staking service providers and node operators, thereby letting them play a more active role in the staking process. Delegate selection already exists in a limited form today, in the sense that rETH or stETH holders can withdraw their ETH and switch to a different pool, but they lack direct influence over node operator selection and face restrictions in withdrawal flexibility. Vitalik proposed three ways to expand delegate selection powers:
- Better voting tools within pools. This involves developing more sophisticated voting systems within staking pools, allowing users to directly influence the selection of node operators. The practice does not exist today: in Rocket Pool, anyone can become a node operator, while in Lido, node operator selection is controlled by LDO token holders, although Lido has a proposal for LDO + stETH dual governance.
- More competition between pools. Vitalik suggests increasing competition among staking pools, offering delegators a broader spectrum of choices. However, smaller staking pools face challenges in competing with dominant players like Lido, as their LSTs often lack liquidity, trust, and dApp compatibility. To counter these issues, Vitalik proposes measures like capping slashing penalties at a smaller amount, enabling more flexible withdrawals to enhance LST liquidity and trust, and introducing a unified LST token standard for seamless dApp integration across different staking pools’ LSTs.
- Enshrined Delegation. This means that delegation functionalities can be executed directly on the Ethereum mainnet, which would involve protocol-level specifications requiring delegators to select a node operator at the time of staking.
What Is Slashing?
Ethereum’s protocol design incentivizes validators to reach consensus on the prerequisite of staking a certain amount of ETH. If any validator is found to have behaved dishonestly, a significant part of their staked ETH is burned. There are mainly two types of misconduct that lead to slashing: proposing two different blocks for the same slot, and double voting by attesting to two candidates for the same block.
Why Capping the Slash Amount Can Reduce Risks for Delegators
In the current two-tiered staking structure, delegators stake their ETH but don’t directly control validator actions, which are managed by node operators. Thus, when a node operator acts maliciously, it’s the delegators who bear the indirect consequences of slashing. Projects like Rocket Pool require node operators to stake ETH as a security measure, addressing the principal-agent dilemma. Capping the slashing amount at the Ethereum protocol level to a threshold that can be covered by the node operator’s share would significantly lower the risk for delegators.
This change would allow staking service providers more flexibility in permitting delegators to withdraw their funds at any time, without the need to maintain a high level of liquidity for potential slashings.
Consensus Participation
The idea of “Consensus Participation” aims to engage delegators more directly in Ethereum’s consensus process, without adding extra overhead to the Ethereum consensus layer. Vitalik acknowledges that many delegators prefer to stake their ETH passively, primarily through Liquid Staking Tokens (LSTs). However, he also believes that some delegators might be interested in playing a more active role in the consensus process. This active participation can contribute to a more decentralized and robust network. Vitalik suggests two potential pathways for consensus participation: a two-tiered staking solution enshrined in the protocol, or features implemented within staking pools.
Enshrined in Protocol
At the protocol level, validators can be divided into two categories, a higher-complexity slashable tier and a lower-complexity tier, which aims to optimize network performance and enhance decentralization.
Higher-complexity Slashable Tier: These validators handle the main verification and computational tasks on Ethereum and are required to remain online at all times. Validators in this tier would need to stake a significantly higher amount of ETH (e.g., 2048 ETH) and would be subject to the risk of slashing. The total number of higher-complexity slashable tier validators in the network would be capped at 10,000.
Lower-complexity Tier: These validators face no cap on numbers and have no minimum staking requirement. They are exempt from slashing, and their participation in the consensus process is required only during specific slots. Lower-complexity validators, also referred to as ‘small-stakers’ in Vitalik’s post, are primarily drawn from two groups: delegators contributing their ETH to higher-complexity validators, and independent participants who opt to become validators without relying on staking services.
Operational Modes for Lower-Complexity Validators:
- In each slot, 10,000 small-stakers are randomly chosen, and they can sign off on what they think is the head of that slot.
- A delegator can send a transaction declaring to the network that they are online and willing to serve as a small-staker for the next hour. They are responsible for voting on the block header they support, and at the end of their duty they must sign off, indicating the completion of their participation.
- A delegator can send a transaction declaring to the network that they are online and willing to serve as a small-staker for the next hour. For each epoch, 10 random delegators are chosen as inclusion list providers, and 10,000 more are chosen as voters. These small-stakers do not need to manually sign off, and their online status expires naturally over time.
These three modes share a common goal: they protect against a 51% majority of node operators and enhance Ethereum’s resistance to censorship. The first and second focus on preventing a majority from engaging in finality reversion. The third focuses more directly on censorship, empowering small-stakers to take on additional responsibilities.
Prerequisite for Lightweight Participation: The availability of an ultra-light client for lower-complexity tier validators is essential, enabling them to complete validation tasks via smartphones or web browsers.
This involves research into Ethereum’s client architecture, including the integration of technologies like Verkle Trees and statelessness, aimed at lowering the entry barrier for validators.
Implemented as Staking Pool Features
Implementing consensus participation as staking pool features means enabling delegators to participate actively in the consensus process through upgrades within staking pools. The core idea is to incorporate joint signatures from delegators and validators in the consensus voting process to reflect the collective will of the delegator group. Vitalik has proposed three methods to facilitate this integration:
- Each staking pool that wants to become a validator is allowed to specify two staking keys: a persistent staking key ‘P’ and an Ethereum address which, when called, outputs a quick staking key ‘Q’. Nodes track the fork choice of messages signed by P and messages signed by Q. If the two agree, verification succeeds; conversely, if the two disagree, nodes do not accept any block as finalized. Staking pools are responsible for randomly selecting delegators as the Q-key holders for the current slot.
- Validators randomly generate a staking public key ‘P+Q’ for each slot, which means the signature required for a slot’s vote is a joint computation by both validators and delegators. Given that a different key is randomly generated for each slot, accountability in the event of slashing poses a significant challenge; addressing this requires careful design to ensure traceability and responsibility.
- Instead of delegators directly holding the Q-key, it could be embedded within a smart contract. This approach allows for more complex and variable triggering conditions, and the staking pool can introduce richer and more dynamic voting logic.
Summary
Vitalik Buterin says that if the proposed solutions are done right, tweaks to the PoS staking design could kill two birds with one stone, reducing staking centralization and minimizing the consensus layer overhead:
- Give people who do not have the resources or capability to solo-stake today an opportunity to participate in staking that keeps more power in their hands: both the power to select node operators and the power to actively participate in consensus in some way that is lighter but still meaningful. Vitalik notes that not all participants would take either or both options, but any that do would significantly improve the current PoS landscape.
- Reduce the number of signatures that the Ethereum consensus layer needs to process in each slot, even in a single-slot-finality regime, to a smaller number like about 10,000. This would also aid decentralization by making it much easier for everyone to run a validating node.
Many of these solutions, including better voting tools within pools, more competition between pools, and in-protocol enshrinement, operate at different layers of abstraction. However, they share the common goal of addressing the issues of centralization in staking and consensus layer overhead. Vitalik underscores the importance of meticulous planning and assessment in implementing these solutions, and minimal viable enshrinement (minimizing both protocol complexity and the level of change to protocol economics while still achieving the desired goal) is generally the optimal choice.
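The consensus-overhead numbers in this section can be tied together with some quick arithmetic: today’s roughly 800,000 signatures per epoch translate to about 25,000 per slot, whereas capping the heavy tier at 10,000 validators (each staking up to 2,048 ETH) moves the per-slot signature load toward the ~10,000 target mentioned in the summary.

```python
SLOTS_PER_EPOCH = 32
signatures_per_epoch_today = 800_000
print(f"per-slot load today ≈ {signatures_per_epoch_today / SLOTS_PER_EPOCH:,.0f} signatures")

heavy_tier_cap = 10_000          # proposed cap on higher-complexity validators
max_eth_per_validator = 2_048
print(f"heavy-tier signatures per slot ≤ {heavy_tier_cap:,}")
print(f"max ETH securable by the heavy tier ≈ "
      f"{heavy_tier_cap * max_eth_per_validator / 1e6:.2f}M ETH")
```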
Analysis of Potential Impacts on the Staking Landscape
Overview of the Staking Landscape
The Ethereum staking ecosystem, as classified by @StakingRewards, comprises the Validator Layer, Staking Layer, Staking Bridge, DeFi Infrastructure, and Structured Products. The internal logic and individual value propositions of each layer can be outlined as follows:
- Validator Layer: provides essential hardware resources for the staking layer and solo stakers, as represented by node operators like P2P and Stakefish, and includes Distributed Validator Technology (DVT) service providers such as SSV and Obol. This layer addresses the hardware and technical needs of the Staking Layer.
- Staking Layer: acts as an intermediary between delegators and node operators, facilitating the consensus validation process on Ethereum. It is driven by staking service providers like Lido and Rocket Pool, as well as the innovative EigenLayer, which introduced the concept of Restaking. This layer packages the indirect participation of delegators in the PoS mechanism into a more accessible financial product. By doing so, it lowers the entry barriers for participation in staking and increases the availability of staking shares within the Ethereum ecosystem.
- Staking Bridge: refers to the Liquid Staking Tokens (LSTs) issued by the Staking Layer. LSTs serve as a bridge for users to engage with various DeFi protocols. Staking service providers facilitate LST-ETH trading pairs on platforms like Curve, offering liquidity to delegators. This allows delegators to exit their staking positions prematurely if necessary, thus reducing the opportunity cost associated with staking.
- DeFi Infrastructure and Structured Products: This layer focuses on leveraging the value-storage and income-generating capabilities of LSTs to develop derivative products and services, creating more application scenarios for LSTs, enriching the DeFi ecosystem, and attracting users to participate in staking.
In the staking ecosystem, the Staking Layer plays a pivotal role: it not only expands the availability of staking shares within Ethereum but also channels liquidity into the DeFi system through LSTs. Given its central role, any modifications or advancements within the Staking Layer have the potential to exert substantial influence across the entire staking ecosystem. Therefore, our analysis will focus on examining the impact of Vitalik Buterin’s proposed solutions on various projects within the Staking Layer. In the context of this discussion, the term “staking landscape” will be used specifically for the Staking Layer.
Potential Impacts of the Proposed Solutions on the Staking Landscape
Each of Vitalik Buterin’s proposed solutions, while distinct in its implementation, is likely to influence the dynamics of the Ethereum staking landscape. This section will delve into the possible effects of these solutions and evaluate their practicality.
Expanding Delegate Selection Powers
Here is a closer examination of the potential impacts of Vitalik’s three proposals for expanding delegators’ power in selecting node operators:
Better voting tools within pools: This would involve refining the voting processes within staking pools, empowering pool users to directly choose their node operators.
Potential Impact: Optimizing voting systems could lead to increased decentralization within individual staking service providers.
Despite these changes, the overall market centralization in the staking landscape may not diminish because users often prefer established, top-tier staking pools due to trust and reliability factors. By shifting some control over the selection of node operators from staking service providers to delegators, this approach could potentially dilute the value captured by the governance tokens originally held by these providers.Analysis of Adoption ProbabilityLow Implementation Cost: This solution requires no changes to the Ethereum consensus layer, only modifications to the internal mechanisms of staking service providers. Lack of Incentives for Existing Staking Providers: This proposal requires current staking service providers to voluntarily implement changes, incurring significant costs, including development expenses and the potential reduction in the utility and value of their governance tokens.Summary: This approach partially addresses the issue of centralization in staking but fails to solve the problem of consensus overhead. The overall effectiveness might be moderate. While implementation costs are relatively low, existing staking service providers lack the motivation to adopt this change, making its likelihood of adoption quite low. However, this could open opportunities for innovation and competition among new entrants in the staking services market.More Competition Between Pools: This involves intensifying the competition among staking pools, providing delegators with a wider range of choices. Currently, the key differentiators among various staking pools in attracting users are the liquidity, trust, and dApp compatibility of their LSTs. Vitalik proposes reducing the amount of slashing penalties and introducing a unified LST standard to minimize these differences, thereby intensifying the competition among staking service providers.Potential Impact: Enhanced competition could diminish the disparities among staking service providers, possibly reducing the dominance of major players like Lido. This shift could lead to decreased centralization in the staking ecosystem. A more competitive landscape could result in the flourishing of the LSTfi ecosystem, as dApps may extend support to LSTs from a larger array of staking pools. Service providers may start to compete on different aspects, such as the staking returns of their LSTs, focusing on strategies to maximize MEV.Analysis of Adoption Probability:Moderate Implementation Cost: Technically, the costs are not substantial, as this does not necessitate changes to the Ethereum consensus layer. The key lies in developing a new LST standard and consensus among staking service providers to lower slashing penalties. During the process, significant migration costs could arise, requiring existing LST holders to transition to the new unified standard.Lack of Incentives for Existing Staking Providers: This proposal requires current staking service providers to voluntarily implement changes. They might be reluctant to adopt this approach due to the development costs, risks of LST migration, and potential market share erosion.Summary: This solution could effectively reduce centralization in the staking landscape but does not tackle the consensus overhead issue. Despite the moderate cost, the lack of strong incentives for existing service providers could hinder adoption. 
Similar to the previous solution, this scenario might open doors for new staking providers to enter the market, using the proposed changes as a unique competitive edge.Enshrined Delegation: This means that delegation functionalities can be directly executed on the Ethereum Mainnet, which would involve protocol-level specifications that require delegators to select a node operator at the time of staking.Potential Impact: With the backing of the Ethereum protocol layer, the security and legitimacy of the transition process in delegation would be enhanced. However, this integration could add to Ethereum’s consensus overhead, as the delegation process at the protocol level introduces an extra verification workload.Analysis of Adoption Feasibility:High Implementation Cost: This would require an upgrade to the Ethereum consensus layer to support new delegation functionalities natively.Possible Deviation from Ethereum’s Principles: This mechanism increased the consensus overhead and might inadvertently edge toward a Delegated Proof of Stake (DPoS) system, which could diverge from the initial design ethos and goals of Ethereum. Vitalik Buterin might be cautious of such an outcome.Summary: Enshrined Delegation, although promising in terms of reducing centralization, will increase the consensus overhead. Given the high costs and potential deviation from Ethereum’s foundational principles, the likelihood of this solution being adopted is extremely low. Consensus participation The core concept of Consensus Participation is to engage more validators, particularly those in the low-complexity tier, in Ethereum’s consensus process. This can be achieved either through native integration within the Ethereum network or via third-party projects. Enshrined in Protocol According to Vitalik’s concept, validators on the Ethereum network would be categorized into high-complexity and low-complexity tiers. High-complexity tier validators would have a higher staking threshold, potentially set at 2048 ETH, and their numbers would be capped at 10,000. They would be required to be continuously online, handling the primary verification and computational tasks essential for network stability and security. Low-complexity tier validators would operate lightweight clients and participate in consensus processes during specific times. Their tasks would be less demanding, focusing mainly on activities like voting. Note: Vitalik Buterin’s reference to a 2048 ETH staking requirement in his seminal article carries substantial practical implications for the future evolution of Ethereum’s staking mechanism. This figure, as elaborated in “Paths toward single-slot finality” and his citation of EIP-7251, is not merely theoretical but has considerable operational significance. Setting a staking threshold of 2048 ETH is strategically designed to optimize the number of validators, achieving a balanced network state. This approach is pivotal in reducing the consensus overhead for Ethereum, thereby facilitating the transition toward Single Slot Finality (SSF). In his article “Protocol and Staking Pool Changes That Could Improve Decentralization and Reduce Consensus Overhead,” Vitalik proposes a pragmatic path forward: initially adopting EIP-7251 as an interim measure. This step would entail elevating the maximum validator balance to 2048 ETH while maintaining the existing minimum of 32 ETH. Eventually, the 2048 ETH would become the standard staking requirement, enabling validators to autonomously choose their tier. 
In light of these considerations, the 2048 ETH figure emerges as a critical reference point in our analysis, offering insightful guidance on the potential structuring of Ethereum’s validator tiers. Potential Impact: Enhanced Decentralization and Reduced Consensus Overhead: The proposed native integration offers a streamlined, cost-effective avenue for a large number of delegators and regular users to engage in Ethereum’s consensus process. This inclusivity significantly bolsters the network’s decentralization. Capping high-complexity tier validators at 10,000 and setting a staking requirement of 2048 ETH simplifies the consensus mechanism. It reduces the overall complexity and the volume of aggregated signatures needed per slot, thereby easing the overhead on Ethereum’s consensus system.Increased Value and Penetration of Staking Service Provider and DVT: With the higher responsibilities and continual online presence required from high-complexity tier validators, the operational demands, particularly in terms of hardware, are elevated. This change underscores the importance of security technologies like DVT. The 2048 ETH staking threshold may encourage users who were previously solo staking to opt for a delegator role. This shift could amplify the market presence and adoption of staking service providers and technologies like DVT.Market Limitations for Staking Providers: In Vitalik’s model, low-complexity tier validators participate in the consensus by running light clients independently. Consequently, the ETH staked by these delegators does not contribute to the TVL in staking services. Users who opt to become low-complexity tier validators can do so without the intermediation of staking service providers. By running their ultra-light nodes, they eliminate the need to entrust their stakes to service providers and incur associated fees. As a result of these dynamics, the TVL that staking service providers can capture is likely to reach a maximum threshold, theoretically capped at around 20.48 million ETH.Analysis of Growth Prospects for Staking Service ProvidersShort to Medium-Term Growth Potential with Limitations: Post-EIP-1559 and the Merge, Ethereum’s total supply stabilizes around 120 million, with about 28 million ETH currently staked. This equates to a staking rate of approximately 23.29%, indicating some potential for growth in the staking sector. The increasing wait times for validators to join or exit and the declining staking rewards suggest that the growth in ETH staking is approaching a saturation point. Without a substantial boost in MEV earnings, driven by increased on-chain transactions, the quantity of staked ETH may stabilize, offering limited incentives for further growth.Long-Term Stagnation for Staking Providers and DVT Projects: Staking service providers like Lido and DVT projects such as SSV primarily generate revenue by taking a cut from the staking yields they manage. With a potential upper limit of 20.48 million ETH set for delegators’ funds under the new proposed structure, this cap would be lower than the current staked amount of 28 million ETH. The future growth and revenue potential of these service providers are closely tied to the increase in MEV income. If MEV earnings do not rise significantly (resulting in no substantial increase in the staking ratio), the absolute revenue size within the staking ecosystem may not only cease to grow but could potentially decrease. 
Analysis of Adoption Probability:Extremely High Implementation Cost: This process involves modifications to Ethereum’s consensus layer.The introduction of a tiered validator structure can align with Ethereum’s strategic long-term objectives and could be integrated into the network. As noted by Vitalik Buterin in “Endgame,” block sizes will increase gradually (due to the state bloat), which may eventually lead to a scenario where only a few dozen or a few hundred nodes can afford to run a fully participating node. Ethereum needs to find another lightweight way to allow more people to participate in consensus, making such a chain acceptably trustless and censorship-resistant. For features like SSF, collaboration between diverse types of validators is essential. The tiered approach, with different validators bearing varying responsibilities, supports this goal.The concept of a tiered validator system is a recurring theme in Ethereum’s roadmaps and blogs. Ongoing research and development projects are focused on creating conditions for low-complexity tier validators, such as through the development of lightweight client solutions.In major upgrades like PBS (Proposer/Builder Separation) and Danksharding, a similar philosophy of tiered validators and division of labor is evident: assigning more demanding tasks (such as storing blobs and constructing blocks) to specialized nodes to ensure efficiency, while enabling a larger number of lightweight nodes to participate in the consensus process to ensure decentralization.In the strategic framework presented in Vitalik Buterin’s “Endgame,” a key concept emerges: the SNARK-ification of the Ethereum verification process. This refers to the implementation of lightweight clients, a cornerstone in facilitating the participation of low-complexity tier validators in the Ethereum consensus mechanism. Within the broader Ethereum roadmap, there are notable research initiatives, such as Stateless Ethereum and The Verge, which are focused on this objective.  Summary: This approach can simultaneously address the issues of centralization in staking and consensus overhead. However, the implementation cost is extremely high, as it requires changes to the PoS rules at the Ethereum consensus layer. Despite this, it aligns with Ethereum’s long-term developmental interests, and the Ethereum roadmap has already shown some preparatory work in this direction. While it may be adopted in the longer term, the likelihood of short-term implementation is relatively low. Implemented as Staking Pool Features Vitalik also proposed an implementation that relies solely on staking pools, without direct modifications to Ethereum’s protocol layers. This method involves the division of a validator’s private key into two components, P and Q, which are allocated to the validators and the user, respectively. The consensus process is then facilitated through the joint signatures of both P and Q keys. Potential Impact: This method may moderately mitigate centralization within staking services, although its overall effectiveness remains uncertain. The intricacies involved in this process, particularly for users, could lead to lower engagement due to its complexity. Given that the solution primarily entails internal modifications within staking service providers, its influence on the broader staking landscape may be restricted.Analysis of Adoption ProbabilityModerate Implementation Cost: This solution does not require significant changes to the Ethereum consensus layer. 
However, it does necessitate fairly complex upgrades by existing staking service providers, including the management and joint signing of split keys and establishing user-friendly consensus participation mechanisms.Challenges for Existing Service Providers: Existing staking services may encounter considerable costs and complexities in implementing these changes. The required alterations in key management and user experience design could be resource-intensive, without a clear path to increased returns.This method could add complexities to Ethereum’s consensus mechanism, including processes like matching messages signed by both P and Q keys, which might inadvertently contribute to increased overhead.Summary: While this approach offers a novel way to address centralization in staking services, its effectiveness and scope of impact are not assured. The potential costs and complexities associated with this method may deter existing staking service providers from adopting it. New staking service providers might utilize this feature as a unique selling point to distinguish themselves in the market. Summary Vitalik Buterin has not explicitly favored any particular solution in his discussions. However, an analysis of the potential impacts of each proposal, in conjunction with insights from his previous articles and the Ethereum development roadmap, allows us to speculate on possible future directions. Analysis of “Expanding Delegate Selection Powers” ProposalsIssue of Incomplete Resolution: The solutions under this category mainly target the centralization of staking. However, their capacity to effectively resolve this issue is uncertain. The existing two-tiered structure, involving delegators and staking pools, inherently resembles a Delegated Proof of Stake (DPoS) system. The proposed enhancements within this framework don’t fundamentally transform this structure and might even amplify DPoS-like features. In particular, the idea of native integration of delegation could potentially increase the overhead on Ethereum’s consensus mechanism.Conflicting Interests of Existing Providers: Existing staking service providers might find these proposals counterproductive to their interests. The suggested improvements in pool voting mechanisms and increased competition between pools require the cooperation of these providers, who may not be incentivized to support changes that could dilute their market dominance.Opportunities for New Projects: The new staking service providers could leverage these proposals to offer more decentralized staking alternatives in the market, positioning themselves as innovative challengers to established providers.Analysis of “Consensus Participation” ProposalsNative Support as a Long-Term Solution: The proposal for native support within Ethereum is poised to address both the centralization of staking and the overload of Ethereum’s consensus mechanism. Indications from Ethereum’s development roadmap suggest that groundwork for a tiered validator structure is already in progress. Despite the complexities and challenges associated with its implementation, particularly in the short term, the proposal for native integration within Ethereum presents a highly feasible long-term solution.Compared to the proposals under “Expanding Delegate Selection Powers,” the third-party integration approach could more effectively mitigate the issue of centralization in staking. 
However, like the previous proposals, it falls short in addressing the consensus overhead problem in Ethereum and existing staking service providers may lack sufficient motivation to adopt this approach. However, this proposal opens opportunities for new staking service providers to differentiate themselves in the market by leveraging this feature. Conclusion Vitalik Buterin’s discourse and writings reflect a fundamental philosophy for Ethereum: a commitment to neutrality and minimalism. Ethereum, while capable of integrating numerous advanced features such as Account Abstraction, Liquid Staking Services, and Stealth Address, often refrain from directly incorporating these functionalities into its protocol layer. Instead, it adopts a strategy that encourages the development of these features through third-party projects. This approach has allowed these external entities to address Ethereum’s challenges effectively, carving out unique market niches and contributing to the ecosystem’s diversity and resilience. However, as Ethereum undergoes continuous evolution, the landscape for these third-party projects is also in flux. Adapting to the changing dynamics of Ethereum is not merely a test of these projects’ adaptability; it’s also an opportunity for strategic foresight and market positioning. Being attuned to Ethereum’s trajectory and proactively anticipating future developments is key to long-term success in this space. In our analysis, we have endeavored to interpret the potential future challenges and uncertainties facing projects within the current staking landscape, drawing upon Vitalik Buterin’s visions and insights. While Vitalik has proposed various potential “endgames” for Ethereum, the future of this blockchain platform is inherently unpredictable, subject to shifting market demands and ongoing technological innovation. In this ever-evolving environment, the projects that thrive will likely be those that not only adapt to immediate changes but also strategically position themselves for future scenarios, leveraging their foresight and adaptability to stay ahead in the long-term race. Reference <Protocol and staking pool changes that could improve decentralization and reduce consensus overhead><Should Ethereum be okay with enshrining more things in the protocol?><Paths toward single-slot finality>Ethereum Roadmap: Single slot finality<A Proof of Stake overview><Can we find Goldilocks? Musings on “two-tiered” staking, a native Liquid Staking Token design.><Endgame><The Beacon Chain Ethereum 2.0 explainer you need to read first>FAQ on EIP-7251; Increasing the MAX_EFFECTIVE_BALANCE – HackMD

Evaluating Vitalik’s Proposals on Ethereum Staking: An Analysis of Their Potential Effects

By Hank Han, Researcher at Mint Ventures

Introduction
The landscape of Ethereum staking, along with its derivatives, has been a focal point in the crypto community for the last couple of years. Key developments like the Beacon Chain, The Merge, and Shapella, alongside innovations in Liquid Staking Tokens (LST), Derivative Staking Tokens (DST), Restaking, and LST-fi have marked the rapid evolution of Ethereum staking. This surge is largely attributed to fundamental shifts in Ethereum’s staking model. It’s crucial to explore how Ethereum’s staking framework will continue to evolve and its consequential effects on the ecosystem, including various stakeholders and staking derivatives.
Vitalik Buterin, in his article published on October 7 titled “Protocol and Staking Pool Changes That Could Improve Decentralization and Reduce Consensus Overhead,” puts forth a series of optimization proposals for the existing Ethereum staking mechanism. These suggestions offer a reference path for further reducing centralization and minimizing consensus overhead in Ethereum. Some of these ideas could significantly revamp the staking mechanism while aligning with the primary trends in Ethereum’s development. Therefore, we will interpret this article and analyze the potential impacts of these proposals on the staking paradigm and its broader implications in the Ethereum ecosystem.
Overview of Vitalik’s Insights
The Status of Two-Tiered Staking
Vitalik Buterin describes the current Ethereum staking landscape as predominantly two-tiered, where there are two classes of participants:
Node operators: individuals or entities actively running Ethereum nodes.
Delegators: participants who stake some quantity of ETH in any way other than running a node themselves.
The prevalent method for staking in this environment is through staking pools offering Liquid Staking Tokens (LSTs), notable examples being Lido and Rocket Pool.
Existing Challenges
This emergent two-tiered staking model comes with two main flaws:
Centralization risk in node operators. After delegators stake their $ETH, service providers like Lido assume the responsibility of node selection, which inherently carries a risk of centralization. For instance, in a DAO-voting mechanism where Lido dictates node operators, there is a tendency for operators to accumulate significant holdings of $LDO tokens to enhance their market share. Similarly, Rocket Pool's model, which allows anyone to become a node operator by submitting an 8 ETH deposit, favors financially robust operators who can effectively "purchase" market share.
Needless consensus-layer burden. The current staking model imposes a significant load on Ethereum's consensus layer, which is tasked with aggregating and verifying about 800,000 signatures per epoch. Achieving Single Slot Finality (SSF) would demand the same volume of signatures be processed per slot, effectively condensing the time frame to 1/32nd of its current duration and intensifying the hardware requirements for nodes. In the current two-tiered staking structure, most of the verification work is carried out by node operators. Although there is a large number of validators, the diversity of these validators is limited. Consequently, increasing node numbers does not necessarily decentralize the network but rather amplifies the consensus-layer overhead. A potential solution could be to reduce the number of validating nodes (and thus the number of signatures needed), which might initially seem to favor centralization; accompanying strategies to mitigate centralization risks are discussed in subsequent sections.
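To make the scale of this overhead concrete, here is a rough, back-of-the-envelope sketch using the approximate figures cited above (about 800,000 signatures per epoch, 32 slots per epoch) together with the ~10,000-signer cap discussed later in this article; the numbers are illustrative assumptions, not live network data.

```python
# Back-of-the-envelope sketch of the consensus-layer signature load described above.
# Figures are the article's approximations, not live network data.

SIGNATURES_PER_EPOCH = 800_000   # roughly one attestation per active validator per epoch
SLOTS_PER_EPOCH = 32

# Today, attestations are spread across the epoch: each slot only has to
# aggregate and verify roughly 1/32 of the total.
current_per_slot = SIGNATURES_PER_EPOCH // SLOTS_PER_EPOCH

# Under Single Slot Finality, every validator would need to be counted in
# every slot, so the full load lands on each slot.
ssf_per_slot = SIGNATURES_PER_EPOCH

# Vitalik's target of ~10,000 signers per slot (the capped high-complexity tier).
target_per_slot = 10_000

print(f"current load per slot  : ~{current_per_slot:,} signatures")
print(f"naive SSF load per slot: ~{ssf_per_slot:,} signatures")
print(f"proposed cap per slot  : ~{target_per_slot:,} signatures "
      f"({ssf_per_slot / target_per_slot:.0f}x reduction vs naive SSF)")
```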

Glossary
Slot: This refers to the time required for a new block to be proposed by a validator in the proof-of-stake system. In Ethereum, a slot is approximately 12 seconds. In each slot, the network randomly selects one validator as the block proposer, who is responsible for creating a new block and broadcasting it to other nodes in the network. Additionally, a committee of validators is also randomly chosen for each slot. Their votes determine the validity of the proposed block. Importantly, not every validator participates in the validation process for each slot. Only those selected for the committee are engaged in active validation. Achieving consensus on the slot’s state requires the affirmation of two-thirds of the committee’s votes. This selective participation of validators in different slots is a strategic design to optimize network efficiency and manage load.
Epoch: A span of 32 slots. In Ethereum, an epoch is approximately 6.4 minutes. Within any given epoch, a validator is limited to joining a single committee. Throughout the epoch, all active validators on the network are obliged to submit evidence of their ongoing active status. The first slot of each epoch (under normal circumstances) is also known as the checkpoint.
Finality: In a distributed network, a transaction has "finality" when it becomes part of a block that cannot be reverted unless an attacker is willing to lose a large amount of staked ETH by forcing a blockchain rollback. Ethereum manages finality through "checkpoint" blocks. If a pair of checkpoints (the first slots of adjacent epochs) attracts votes representing at least two-thirds of the total staked ETH, the checkpoints are upgraded: the more recent of the two (the "target") becomes "justified", while the earlier one, which was already justified as the previous epoch's target, advances to "finalized". On average, finality for a typical transaction occurs in about 2.5 epochs, or around 16 minutes. This duration assumes the transaction lands in the middle of an epoch and accounts for the time taken for subsequent checkpoints to become justified and then finalized. Ideally, justification of an epoch's checkpoint occurs at its 22nd slot, bringing average transaction finality down to approximately 14 minutes.
Single Slot Finality (SSF): This refers to blocks being proposed and finalized within the same slot. The current time to finality has turned out to be too long: most users do not want to wait roughly 15 minutes for finality, and the delay is inconvenient for applications that need high transaction throughput. Having a gap between a block's proposal and its finalization also creates an opportunity for short reorgs that an attacker could use to censor certain blocks or extract MEV. The mechanism that upgrades blocks in stages is also quite complex and has been patched several times to close security vulnerabilities, making it one of the parts of the Ethereum codebase where subtle bugs are more likely to arise. These issues could all be eliminated by reducing the time to finality to a single slot. As part of Ethereum's long-term roadmap, particularly in The Merge branch, SSF is a crucial milestone. However, SSF is still in the research phase and is not expected to ship for several years, likely after other substantial upgrades such as Verkle trees and Danksharding.
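As a quick sanity check on the timings quoted in this glossary, the following sketch reproduces the arithmetic from the stated parameters (12-second slots, 32-slot epochs, roughly 2.5 epochs to average finality). Treat the values as the article's figures rather than a specification reference.

```python
# Quick arithmetic behind the timing figures in the glossary above.
SLOT_SECONDS = 12
SLOTS_PER_EPOCH = 32

epoch_seconds = SLOT_SECONDS * SLOTS_PER_EPOCH     # 384 s
epoch_minutes = epoch_seconds / 60                 # ~6.4 min

# A transaction included mid-epoch typically waits ~2.5 epochs for the
# surrounding checkpoints to be justified and then finalized.
avg_finality_minutes = 2.5 * epoch_minutes         # ~16 min

# Optimistic case cited in the text: the transaction sits at slot 16 of its epoch,
# and justification of the second following checkpoint lands around slot 22.
best_case_slots = (SLOTS_PER_EPOCH - 16) + SLOTS_PER_EPOCH + 22   # 70 slots
best_case_minutes = best_case_slots * SLOT_SECONDS / 60           # ~14 min

print(f"epoch length        : {epoch_minutes:.1f} min")
print(f"average finality    : {avg_finality_minutes:.0f} min")
print(f"optimistic finality : {best_case_minutes:.0f} min")
print(f"under SSF           : {SLOT_SECONDS} s (one slot)")
```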
Proposed Solutions by Vitalik
Vitalik Buterin suggests that the current role of delegators in Ethereum staking is not as impactful as intended. He advocates for empowering delegators with more rights and responsibilities to address existing challenges. The two main strategies proposed are ‘Expanding Delegate Selection Powers’ and ‘Consensus Participation’.
Expanding Delegate Selection Powers
Expanding delegate selection powers aims to provide delegators with greater autonomy in choosing staking service providers and node operators, thereby playing a more active role in the staking process. Delegate selection already exists in a limited form today, in the sense that rETH or stETH holders can withdraw their ETH and switch to a different pool, but lack direct influence over node operator selection and face restrictions in withdrawal flexibility.
Vitalik proposed three ways to expand delegate selection powers:
Better voting tools within pools. This involves developing more sophisticated voting systems within staking pools, allowing users to directly influence the selection of node operators. This practice does not exist today: in Rocket Pool, anyone can become a node operator, while in Lido, node operator selection is controlled by LDO token holders, although Lido has a proposal for LDO + stETH dual governance. A toy sketch of such stake-weighted voting follows after this list.
More competition between pools. Vitalik suggests increasing the competitive landscape among staking pools, offering delegators a broader spectrum of choices. However, smaller staking pools face challenges in competing with dominant players like Lido, as their LSTs often lack liquidity, trust, and dApp compatibility. To counter these issues, Vitalik proposes measures like capping slashing penalties at a smaller amount, enabling more flexible withdrawals to enhance LST liquidity and trust, and introducing a unified LST token standard for seamless dApp integration across different staking pools' LSTs.
Enshrined delegation. This means that delegation functionalities would be executed directly on the Ethereum Mainnet, involving protocol-level specifications that require delegators to select a node operator at the time of staking.
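To illustrate the first idea, here is a hypothetical sketch of stake-weighted voting for node operators inside a pool. The operator names, balances, and the weighting rule are assumptions made purely for illustration; no existing pool is known to expose exactly this interface.

```python
# Hypothetical sketch of "better voting tools within pools": delegators vote for
# node operators, weighted by their staked balance, and the pool allocates new
# stake to the winners. All names and numbers are illustrative assumptions.
from collections import defaultdict

def tally_operator_votes(votes):
    """votes: iterable of (delegator_staked_balance_eth, chosen_operator)."""
    weights = defaultdict(float)
    for balance, operator in votes:
        weights[operator] += balance            # stake-weighted voting
    return sorted(weights.items(), key=lambda kv: kv[1], reverse=True)

example_votes = [
    (32.0, "operator_a"),
    (5.5, "operator_b"),
    (100.0, "operator_a"),
    (64.0, "operator_c"),
]
print(tally_operator_votes(example_votes))
# [('operator_a', 132.0), ('operator_c', 64.0), ('operator_b', 5.5)]
```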

What is Slashing
Ethereum's protocol design requires validators to stake a certain amount of ETH as a precondition for participating in consensus, and it enforces honest behavior with penalties: if a validator is found to have acted dishonestly, a significant part of their staked ETH is burned. Two main types of misconduct lead to slashing: proposing two different blocks for the same slot, and double voting, i.e., signing two contradictory attestations.
Why Capping the Slashing Amount Can Reduce Risks for Delegators
In the current two-tiered staking structure, delegators stake their ETH but don’t directly control validator actions, which are managed by node operators. Thus, when a node operator acts maliciously, it’s the delegators who bear the indirect consequences of slashing. Projects like Rocket Pool require node operators to stake ETH as a security measure, addressing the principal-agent dilemma. Capping the slashing amount at the Ethereum protocol level to a threshold that can be covered by the node operator’s share would significantly lower the risk for delegators. This change would allow staking service providers more flexibility in permitting delegators to withdraw their funds at any time, without the need to maintain a high level of liquidity for potential slashings.
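The toy sketch below illustrates the mechanics of that argument: the operator's bond absorbs a penalty first, so if the protocol caps the penalty at or below the bond, delegators bear no loss. The 8 ETH bond echoes the Rocket Pool example above, while the penalty and cap figures are purely hypothetical.

```python
# Illustrative sketch of why a protocol-level slashing cap reduces delegator risk.
# Bond, penalty and cap figures are hypothetical assumptions for illustration only.

def delegator_loss(penalty_eth, operator_bond_eth, slash_cap_eth=None):
    """Portion of a slashing penalty that spills over onto delegators."""
    if slash_cap_eth is not None:
        penalty_eth = min(penalty_eth, slash_cap_eth)  # protocol-level cap on the penalty
    # The operator's own bond absorbs the loss first; only the excess hits delegators.
    return max(0.0, penalty_eth - operator_bond_eth)

OPERATOR_BOND = 8.0   # ETH posted by the node operator (Rocket Pool-style bond)
PENALTY = 16.0        # hypothetical uncapped penalty for a slashed validator

print(delegator_loss(PENALTY, OPERATOR_BOND))                     # 8.0 -> delegators absorb the excess
print(delegator_loss(PENALTY, OPERATOR_BOND, slash_cap_eth=8.0))  # 0.0 -> the bond covers a capped penalty
```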
Consensus participation
The idea of “Consensus Participation” aims to engage delegators more directly in Ethereum’s consensus process, without adding extra overhead to the Ethereum consensus layer. Vitalik acknowledges that many delegators prefer to stake their ETH passively, primarily through Liquid Staking Tokens (LSTs). However, he also believes that some delegators might be interested in playing a more active role in the consensus process. This active participation can contribute to a more decentralized and robust network. Vitalik suggests two potential pathways for consensus participation: enshrined two-tiered staking solution in protocol, or implemented as staking pool features.
Enshrined in protocol
At the protocol level, validators can be divided into two categories: higher-complexity slashable tier and lower-complexity tier, which aims to optimize network performance and enhance decentralization.
Higher-complexity slashable tier: These validators handle the main verification and computational tasks on Ethereum and are required to remain online at all times. Validators in this tier would need to stake a significantly higher amount of ETH (e.g. 2048 ETH) and would be subject to the risk of slashing. The total number of higher-complexity slashable tier validators would be capped at 10,000.
Lower-complexity tier: These validators face no cap on numbers and have no minimum staking requirement. They are exempt from slashing, and their participation in the consensus process is required only during specific slots. Lower-complexity validators, also referred to as "small-stakers" in Vitalik's post, are primarily drawn from two groups: delegators contributing their ETH to higher-complexity validators, and independent participants who opt to become validators without relying on staking services.
Operational modes for lower-complexity validators (a toy sketch of the first mode follows after this list):
In each slot, 10,000 small-stakers are randomly chosen, and they can sign off on what they think is the head of that slot.
A delegator can send a transaction declaring to the network that they are online and willing to serve as a small-staker for the next hour. They are responsible for voting on the block header they support, and at the end of their duty they must sign off, indicating the completion of their participation.
A delegator can send a transaction declaring to the network that they are online and willing to serve as a small-staker for the next hour. For each epoch, 10 random delegators are chosen as inclusion-list providers, and 10,000 more are chosen as voters. These small-stakers do not need to manually sign off, and their online status expires naturally over time.
These three modes share a common goal: they prevent a 51% majority of node operators from abusing its position and enhance Ethereum's resistance to censorship. The first and second focus on preventing a majority from engaging in finality reversion; the third focuses more directly on censorship, empowering small-stakers to take on additional responsibilities.
Prerequisite for lightweight participation: The availability of an ultra-light client for lower-complexity tier validators is essential, enabling them to complete validation tasks via smartphones or web browsers. This involves research into Ethereum's client architecture, including the integration of technologies like Verkle Trees and statelessness, aimed at lowering the entry barrier for validators.
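As a concrete illustration of the first operational mode, here is a toy sketch of drawing a per-slot committee of small-stakers from everyone who has declared themselves online. The committee size, participant names, and slot-seeded randomness are assumptions for illustration only; the actual protocol would rely on in-protocol randomness such as RANDAO rather than a local pseudo-random generator.

```python
# Toy sketch of the first mode above: each slot, a random committee of
# "small-stakers" is drawn from the currently online participants.
import random

COMMITTEE_SIZE = 10_000

def draw_committee(online_small_stakers, slot, committee_size=COMMITTEE_SIZE):
    """Sample a per-slot committee of small-stakers.

    Seeding with the slot number keeps this example deterministic; the real
    protocol would use in-protocol randomness (e.g. RANDAO), not a local PRNG.
    """
    rng = random.Random(slot)
    pool = sorted(online_small_stakers)   # stable ordering for reproducibility
    k = min(committee_size, len(pool))
    return rng.sample(pool, k)

# 50,000 participants have declared themselves online; draw the committee for one slot.
online = {f"staker_{i}" for i in range(50_000)}
committee = draw_committee(online, slot=7_500_000)
print(len(committee), committee[:3])
```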

Implemented as Staking Pool Features 
Implemented as staking pool features refers to enabling delegators to actively participate in the consensus process through upgrades within staking pools. The core idea is to incorporate joint signatures from delegators and validators in the consensus voting process to reflect the collective will of the delegator group. Vitalik has proposed three methods to facilitate this integration:
Each staking pool that wants to become a validator is allowed to specify two staking keys: a persistent staking key 'P' and an Ethereum address which, when called, outputs a quick staking key 'Q'. Nodes track the fork choice of messages signed by P and messages signed by Q. If the two agree, verification succeeds; conversely, if the two disagree, nodes do not accept any block as finalized. Staking pools are responsible for randomly selecting delegators as the Q-key holders for the current slot. (A minimal sketch of this agreement check follows below.)
Validators randomly generate a staking public key 'P+Q' for each slot, which means the signature required for a slot's vote is a joint computation effort of both validators and delegators. Given that a different key is randomly generated for each slot, accountability in the event of slashing poses a significant challenge, and addressing this requires careful design to ensure traceability and responsibility.
Instead of delegators directly holding the Q-key, it could be embedded within a smart contract. This approach allows for more complex and variable triggering conditions, and the staking pool can introduce richer and more dynamic voting logic.
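Below is a minimal sketch of the agreement rule in the first method, assuming toy data structures and mocked signature handling; it is meant to show the finalization condition only, not to resemble a client implementation.

```python
# Minimal sketch of the first method: a block is only treated as finalized when
# the P-signed and Q-signed fork-choice votes point at the same head.
# Data structures are assumptions; signature verification is deliberately omitted.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SignedHeadVote:
    head_block_root: str   # block the signer believes is the head for this slot
    signer_key: str        # "P" (persistent pool key) or "Q" (delegator-held quick key)

def finalized_head(p_vote: SignedHeadVote, q_vote: SignedHeadVote) -> Optional[str]:
    """Accept a head for finalization only when the P-signed and Q-signed votes agree."""
    if p_vote.head_block_root == q_vote.head_block_root:
        return p_vote.head_block_root
    return None   # disagreement: no block is accepted as finalized for this slot

print(finalized_head(SignedHeadVote("0xabc", "P"), SignedHeadVote("0xabc", "Q")))  # 0xabc
print(finalized_head(SignedHeadVote("0xabc", "P"), SignedHeadVote("0xdef", "Q")))  # None
```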
Summary
Vitalik Buterin says that if the proposed solutions are done right, tweaks to the PoS staking design could kill two birds with one stone: reducing staking centralization and minimizing the consensus-layer overhead.
Give people who do not have the resources or capability to solo-stake today an opportunity to participate in staking that keeps more power in their hands: both the power to select node operators and the power to actively participate in consensus in some way that is lighter but still meaningful. Vitalik notes that not all participants would take either or both options, but any who do would significantly improve the current PoS landscape.
Reduce the number of signatures that the Ethereum consensus layer needs to process in each slot, even in a single-slot-finality regime, to a smaller number like about 10,000. This would also aid decentralization by making it much easier for everyone to run a validating node.
Many of these solutions, including better voting tools within pools, more competition between pools, and in-protocol enshrinement, operate at different layers of abstraction, yet they share the common goal of addressing centralization in staking and the consensus-layer overhead. Vitalik underscores the importance of meticulous planning and assessment in implementing these solutions, and notes that minimal viable enshrinement, which minimizes both protocol complexity and the degree of change to protocol economics while still achieving the desired goal, is generally the optimal choice.
Analysis of Potential Impacts on Staking Landscape
Overview of Staking Landscape
The Ethereum staking ecosystem, as classified by @StakingRewards, comprises the Validator Layer, Staking Layer, Staking Bridge, DeFi Infrastructure, and Structured Products. The internal logic and individual value propositions of each layer can be outlined as follows:
Validator Layer: provides essential hardware resources for the staking layer and for solo stakers. It is represented by node operators like P2P and Stakefish, and includes Distributed Validator Technology (DVT) service providers such as SSV and Obol. This layer addresses the hardware and technical needs of the Staking Layer.
Staking Layer: acts as an intermediary between delegators and node operators, facilitating the consensus validation process on Ethereum. It is driven by staking service providers like Lido and Rocket Pool, as well as EigenLayer, which introduced the concept of Restaking. This layer packages the indirect participation of delegators in the PoS mechanism into a more accessible financial product. By doing so, it lowers the entry barriers for participation in staking and increases the availability of staking shares within the Ethereum ecosystem.
Staking Bridge: refers to the Liquid Staking Tokens (LSTs) issued by the Staking Layer. LSTs serve as a bridge for users to engage with various DeFi protocols. Staking service providers facilitate LST-ETH trading pairs on platforms like Curve, offering liquidity to delegators. This allows delegators to exit their staking positions prematurely if necessary, thus reducing the opportunity cost associated with staking.
DeFi Infrastructure and Structured Products: this layer focuses on leveraging the value-storage and income-generating capabilities of LSTs to develop derivative products and services, creating more application scenarios for LSTs, enriching the DeFi ecosystem, and attracting users to participate in staking.

In the staking ecosystem, the Staking Layer plays a pivotal role: it not only expands the availability of staking shares within Ethereum but also channels liquidity into the DeFi system through LSTs. Given its central role, any modifications or advancements within the Staking Layer have the potential to exert substantial influence across the entire staking ecosystem. Therefore, our analysis will focus on examining the impact of Vitalik Buterin's proposed solutions on various projects within the Staking Layer. In the context of this discussion, the term "staking landscape" refers specifically to the Staking Layer.
Potential Impacts of the Proposed Solutions on the Staking Landscape
Each of Vitalik Buterin’s proposed solutions, while distinct in their implementation, is likely to influence the dynamics of the Ethereum staking landscape. This section will delve into the possible effects of these solutions and evaluate their practicality.
Expanding Delegate Selection Powers
Here is a closer examination of the potential impacts of Vitalik’s three proposals for expanding delegators’ power in selecting node operators:
Better voting tools within pools: This would involve refining the voting processes within staking pools, empowering pool users to directly choose their node operators.
Potential Impact: Optimizing voting systems could lead to increased decentralization within individual staking service providers. Despite these changes, the overall market centralization in the staking landscape may not diminish, because users often prefer established, top-tier staking pools due to trust and reliability factors. By shifting some control over the selection of node operators from staking service providers to delegators, this approach could also dilute the value captured by the governance tokens held by these providers.
Analysis of Adoption Probability: Low Implementation Cost: this solution requires no changes to the Ethereum consensus layer, only modifications to the internal mechanisms of staking service providers. Lack of Incentives for Existing Staking Providers: this proposal requires current staking service providers to voluntarily implement changes, incurring significant costs, including development expenses and a potential reduction in the utility and value of their governance tokens.
Summary: This approach partially addresses the issue of centralization in staking but fails to solve the problem of consensus overhead, so its overall effectiveness would likely be moderate. While implementation costs are relatively low, existing staking service providers lack the motivation to adopt this change, making its likelihood of adoption quite low. However, it could open opportunities for innovation and competition among new entrants in the staking services market.
More competition between pools: This involves intensifying the competition among staking pools, providing delegators with a wider range of choices. Currently, the key differentiators among staking pools in attracting users are the liquidity, trust, and dApp compatibility of their LSTs. Vitalik proposes reducing slashing penalties and introducing a unified LST standard to minimize these differences, thereby intensifying competition among staking service providers.
Potential Impact: Enhanced competition could diminish the disparities among staking service providers, possibly reducing the dominance of major players like Lido and decreasing centralization in the staking ecosystem. A more competitive landscape could also see the LSTfi ecosystem flourish, as dApps may extend support to LSTs from a larger array of staking pools. Service providers may start to compete on other dimensions, such as the staking returns of their LSTs, focusing on strategies to maximize MEV.
Analysis of Adoption Probability: Moderate Implementation Cost: technically, the costs are not substantial, as this does not necessitate changes to the Ethereum consensus layer. The key lies in developing a new LST standard and reaching consensus among staking service providers to lower slashing penalties. Significant migration costs could arise along the way, as existing LST holders would need to transition to the new unified standard. Lack of Incentives for Existing Staking Providers: this proposal requires current staking service providers to voluntarily implement changes, and they might be reluctant to do so due to the development costs, the risks of LST migration, and potential market share erosion.
Summary: This solution could effectively reduce centralization in the staking landscape but does not tackle the consensus overhead issue. Despite the moderate cost, the lack of strong incentives for existing service providers could hinder adoption. Similar to the previous solution, this scenario might open doors for new staking providers to enter the market, using the proposed changes as a unique competitive edge.
Enshrined delegation: This means that delegation functionalities would be executed directly on the Ethereum Mainnet, involving protocol-level specifications that require delegators to select a node operator at the time of staking.
Potential Impact: With the backing of the Ethereum protocol layer, the security and legitimacy of the delegation process would be enhanced. However, this integration could add to Ethereum's consensus overhead, as delegation at the protocol level introduces an extra verification workload.
Analysis of Adoption Feasibility: High Implementation Cost: this would require an upgrade to the Ethereum consensus layer to natively support new delegation functionalities. Possible Deviation from Ethereum's Principles: this mechanism would increase the consensus overhead and might inadvertently edge toward a Delegated Proof of Stake (DPoS) system, which could diverge from the initial design ethos and goals of Ethereum; Vitalik Buterin might be wary of such an outcome.
Summary: Enshrined delegation, although promising in terms of reducing centralization, would increase the consensus overhead. Given the high costs and potential deviation from Ethereum's foundational principles, the likelihood of this solution being adopted is extremely low.
Consensus participation
The core concept of Consensus Participation is to engage more validators, particularly those in the low-complexity tier, in Ethereum’s consensus process. This can be achieved either through native integration within the Ethereum network or via third-party projects.
Enshrined in Protocol
According to Vitalik’s concept, validators on the Ethereum network would be categorized into high-complexity and low-complexity tiers. High-complexity tier validators would have a higher staking threshold, potentially set at 2048 ETH, and their numbers would be capped at 10,000. They would be required to be continuously online, handling the primary verification and computational tasks essential for network stability and security. Low-complexity tier validators would operate lightweight clients and participate in consensus processes during specific times. Their tasks would be less demanding, focusing mainly on activities like voting.
Note: Vitalik Buterin’s reference to a 2048 ETH staking requirement in his seminal article carries substantial practical implications for the future evolution of Ethereum’s staking mechanism. This figure, as elaborated in “Paths toward single-slot finality” and his citation of EIP-7251, is not merely theoretical but has considerable operational significance. Setting a staking threshold of 2048 ETH is strategically designed to optimize the number of validators, achieving a balanced network state. This approach is pivotal in reducing the consensus overhead for Ethereum, thereby facilitating the transition toward Single Slot Finality (SSF). In his article “Protocol and Staking Pool Changes That Could Improve Decentralization and Reduce Consensus Overhead,” Vitalik proposes a pragmatic path forward: initially adopting EIP-7251 as an interim measure. This step would entail elevating the maximum validator balance to 2048 ETH while maintaining the existing minimum of 32 ETH. Eventually, the 2048 ETH would become the standard staking requirement, enabling validators to autonomously choose their tier. In light of these considerations, the 2048 ETH figure emerges as a critical reference point in our analysis, offering insightful guidance on the potential structuring of Ethereum’s validator tiers.

Potential Impact:
Enhanced Decentralization and Reduced Consensus Overhead: The proposed native integration offers a streamlined, cost-effective avenue for a large number of delegators and regular users to engage in Ethereum's consensus process, which significantly bolsters the network's decentralization. Capping high-complexity tier validators at 10,000 and setting a staking requirement of 2048 ETH simplifies the consensus mechanism: it reduces the overall complexity and the volume of aggregated signatures needed per slot, thereby easing the overhead on Ethereum's consensus system.
Increased Value and Penetration of Staking Service Providers and DVT: With the higher responsibilities and continual online presence required from high-complexity tier validators, the operational demands, particularly in terms of hardware, are elevated. This change underscores the importance of security technologies like DVT. The 2048 ETH staking threshold may also encourage users who were previously solo staking to take on a delegator role instead, which could amplify the market presence and adoption of staking service providers and technologies like DVT.
Market Limitations for Staking Providers: In Vitalik's model, low-complexity tier validators participate in consensus by running light clients independently, so the ETH they stake does not contribute to the TVL of staking services. Users who opt to become low-complexity tier validators can do so without the intermediation of staking service providers; by running their own ultra-light nodes, they eliminate the need to entrust their stakes to service providers and pay the associated fees. As a result, the TVL that staking service providers can capture is likely to reach a maximum threshold, theoretically capped at around 20.48 million ETH (see the arithmetic sketch below).
Analysis of Growth Prospects for Staking Service Providers:
Short- to Medium-Term Growth Potential with Limitations: Post-EIP-1559 and the Merge, Ethereum's total supply has stabilized around 120 million ETH, with about 28 million ETH currently staked. This equates to a staking rate of approximately 23.29%, indicating some remaining room for growth in the staking sector. However, the increasing wait times for validators to join or exit and the declining staking rewards suggest that the growth in ETH staking is approaching a saturation point. Without a substantial boost in MEV earnings driven by increased on-chain transactions, the quantity of staked ETH may stabilize, offering limited incentives for further growth.
Long-Term Stagnation for Staking Providers and DVT Projects: Staking service providers like Lido and DVT projects such as SSV primarily generate revenue by taking a cut of the staking yields they manage. With a potential upper limit of 20.48 million ETH on delegators' funds under the proposed structure, this cap would be lower than the roughly 28 million ETH currently staked. The future growth and revenue potential of these service providers would therefore be closely tied to increases in MEV income; if MEV earnings do not rise significantly (and the staking ratio does not substantially increase), the absolute revenue of the staking ecosystem may not only cease to grow but could shrink.
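The arithmetic behind the ceiling and the staking rate discussed above can be reproduced in a few lines; the supply and staked figures are the approximate numbers cited in the text at the time of writing, not live data.

```python
# Arithmetic behind the figures discussed above (approximations from the text).
HIGH_TIER_CAP = 10_000        # proposed cap on high-complexity validators
HIGH_TIER_STAKE = 2_048       # ETH per high-complexity validator (EIP-7251 ceiling)
ETH_SUPPLY = 120_000_000      # approximate total ETH supply cited in the text
ETH_STAKED = 28_000_000       # approximate ETH staked at the time of writing

max_delegated_tvl = HIGH_TIER_CAP * HIGH_TIER_STAKE    # 20,480,000 ETH ceiling
staking_rate = ETH_STAKED / ETH_SUPPLY                 # ~23.3%
consolidation_factor = HIGH_TIER_STAKE // 32           # each 2048-ETH validator replaces 64 legacy ones

print(f"delegated-stake ceiling : {max_delegated_tvl:,} ETH")
print(f"current staking rate    : {staking_rate:.1%}")
print(f"staked ETH above ceiling: {ETH_STAKED - max_delegated_tvl:,} ETH")
print(f"consolidation factor    : {consolidation_factor}x fewer validators per unit of stake")
```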

Analysis of Adoption Probability:
Extremely High Implementation Cost: This process involves modifications to Ethereum's consensus layer.
That said, the introduction of a tiered validator structure aligns with Ethereum's strategic long-term objectives and could eventually be integrated into the network. As noted by Vitalik Buterin in "Endgame," block sizes will increase gradually (due to state bloat), which may eventually lead to a scenario where only a few dozen or a few hundred nodes can afford to run a fully participating node. Ethereum therefore needs another, lightweight way to allow more people to participate in consensus, keeping the chain acceptably trustless and censorship-resistant. For features like SSF, collaboration between different types of validators is essential, and the tiered approach, with validators bearing varying responsibilities, supports this goal.
The concept of a tiered validator system is a recurring theme in Ethereum's roadmaps and blogs. Ongoing research and development work is focused on creating the conditions for low-complexity tier validators, such as the development of lightweight client solutions.
In major upgrades like PBS (Proposer/Builder Separation) and Danksharding, a similar philosophy of tiered validators and division of labor is evident: more demanding tasks (such as storing blobs and constructing blocks) are assigned to specialized nodes to ensure efficiency, while a larger number of lightweight nodes participate in the consensus process to ensure decentralization.
In the strategic framework presented in Vitalik Buterin's "Endgame," a key concept emerges: the SNARK-ification of the Ethereum verification process. This refers to the implementation of lightweight clients, a cornerstone in facilitating the participation of low-complexity tier validators in the Ethereum consensus mechanism. Within the broader Ethereum roadmap, notable research initiatives such as Stateless Ethereum and The Verge are focused on this objective.

Summary: This approach can simultaneously address the issues of centralization in staking and consensus overhead. However, the implementation cost is extremely high, as it requires changes to the PoS rules at the Ethereum consensus layer. Despite this, it aligns with Ethereum’s long-term developmental interests, and the Ethereum roadmap has already shown some preparatory work in this direction. While it may be adopted in the longer term, the likelihood of short-term implementation is relatively low.
Implemented as Staking Pool Features
Vitalik also proposed an implementation that relies solely on staking pools, without direct modifications to Ethereum's protocol layer. In this method, a validator uses two staking keys, P and Q, held by the validator (staking pool) and the delegators, respectively, and the consensus process is facilitated through the joint signatures of both keys.
Potential Impact: This method may moderately mitigate centralization within staking services, although its overall effectiveness remains uncertain. The intricacies involved, particularly for users, could lead to lower engagement due to its complexity. Given that the solution primarily entails internal modifications within staking service providers, its influence on the broader staking landscape may be limited.
Analysis of Adoption Probability:
Moderate Implementation Cost: This solution does not require significant changes to the Ethereum consensus layer. However, it does necessitate fairly complex upgrades by existing staking service providers, including the management and joint signing of the P and Q keys and establishing user-friendly consensus participation mechanisms.
Challenges for Existing Service Providers: Existing staking services may encounter considerable costs and complexity in implementing these changes. The required alterations in key management and user experience design could be resource-intensive, without a clear path to increased returns. This method could also add complexity to Ethereum's consensus mechanism, including processes like matching messages signed by both P and Q keys, which might inadvertently contribute to increased overhead.
Summary: While this approach offers a novel way to address centralization in staking services, its effectiveness and scope of impact are not assured. The potential costs and complexities associated with this method may deter existing staking service providers from adopting it, although new staking service providers might use this feature as a unique selling point to distinguish themselves in the market.
Summary
Vitalik Buterin has not explicitly favored any particular solution in his discussions. However, an analysis of the potential impacts of each proposal, in conjunction with insights from his previous articles and the Ethereum development roadmap, allows us to speculate on possible future directions.
Analysis of "Expanding Delegate Selection Powers" Proposals:
Issue of Incomplete Resolution: The solutions under this category mainly target the centralization of staking, yet their capacity to effectively resolve this issue is uncertain. The existing two-tiered structure, involving delegators and staking pools, inherently resembles a Delegated Proof of Stake (DPoS) system. The proposed enhancements within this framework do not fundamentally transform that structure and might even amplify DPoS-like features. In particular, the idea of native integration of delegation could increase the overhead on Ethereum's consensus mechanism.
Conflicting Interests of Existing Providers: Existing staking service providers might find these proposals counterproductive to their interests. The suggested improvements in pool voting mechanisms and increased competition between pools require the cooperation of these providers, who may not be incentivized to support changes that could dilute their market dominance.
Opportunities for New Projects: New staking service providers could leverage these proposals to offer more decentralized staking alternatives, positioning themselves as innovative challengers to established providers.
Analysis of "Consensus Participation" Proposals:
Native Support as a Long-Term Solution: The proposal for native support within Ethereum is poised to address both the centralization of staking and the overload of Ethereum's consensus mechanism. Indications from Ethereum's development roadmap suggest that groundwork for a tiered validator structure is already in progress. Despite the complexities and challenges associated with its implementation, particularly in the short term, native integration within Ethereum presents a highly feasible long-term solution.
Compared to the proposals under "Expanding Delegate Selection Powers," the third-party (staking-pool) integration approach could more effectively mitigate the issue of centralization in staking. Like the previous proposals, however, it falls short of addressing Ethereum's consensus overhead, and existing staking service providers may lack sufficient motivation to adopt it. Nevertheless, it opens opportunities for new staking service providers to differentiate themselves in the market by leveraging this feature.

Conclusion
Vitalik Buterin's discourse and writings reflect a fundamental philosophy for Ethereum: a commitment to neutrality and minimalism. Ethereum, while capable of integrating numerous advanced features such as Account Abstraction, Liquid Staking Services, and Stealth Addresses, often refrains from directly incorporating these functionalities into its protocol layer. Instead, it adopts a strategy that encourages the development of these features through third-party projects. This approach has allowed these external entities to address Ethereum's challenges effectively, carving out unique market niches and contributing to the ecosystem's diversity and resilience. However, as Ethereum undergoes continuous evolution, the landscape for these third-party projects is also in flux. Adapting to the changing dynamics of Ethereum is not merely a test of these projects' adaptability; it is also an opportunity for strategic foresight and market positioning. Being attuned to Ethereum's trajectory and proactively anticipating future developments is key to long-term success in this space.
In our analysis, we have endeavored to interpret the potential future challenges and uncertainties facing projects within the current staking landscape, drawing upon Vitalik Buterin’s visions and insights. While Vitalik has proposed various potential “endgames” for Ethereum, the future of this blockchain platform is inherently unpredictable, subject to shifting market demands and ongoing technological innovation. In this ever-evolving environment, the projects that thrive will likely be those that not only adapt to immediate changes but also strategically position themselves for future scenarios, leveraging their foresight and adaptability to stay ahead in the long-term race.
Reference
Protocol and staking pool changes that could improve decentralization and reduce consensus overhead
Should Ethereum be okay with enshrining more things in the protocol?
Paths toward single-slot finality
Ethereum Roadmap: Single slot finality
A Proof of Stake overview
Can we find Goldilocks? Musings on "two-tiered" staking, a native Liquid Staking Token design
Endgame
The Beacon Chain Ethereum 2.0 explainer you need to read first
FAQ on EIP-7251: Increasing the MAX_EFFECTIVE_BALANCE (HackMD)

Phased Reflections on The BRC-20 Market During The Inscription Summer

By Alex Xu, Research Partner at Mint Ventures
Introduction
Inscriptions, especially epitomized by the BRC-20 series, are currently experiencing a notable surge in market interest. This trend marks a pivotal second wave of valuation growth, following the transformative introduction of the Ordinals protocol earlier this year. My recent endeavors in exploring the burgeoning inscribed asset landscape have involved an in-depth analysis of market trends, data synthesis, and active participation in both online and offline forums centered on BRC-20 assets. These experiences have shaped my preliminary insights into inscriptions, which I aim to coherently present in this article.
This piece seeks to demystify and address some issues, specifically:
What are Inscriptions, and what are the criteria determining the popularity of BRC-20 tokens?
What is the inherent value proposition of inscriptions, particularly those under the BRC-20 umbrella?
What is the viability of inscriptions as a novel business model?
What are the underlying mechanisms and catalysts propelling the rapid growth of inscriptions?
What is the potential evolution of inscriptions? What are the hypothetical scenarios leading to their collapse?
The insights offered herein represent a snapshot of my analysis, framed by the knowledge available at the time of writing. There may be factual inaccuracies or biases. This article is intended for discussion purposes only, and feedback is welcomed.
What are Inscriptions and The Criteria for High-Quality Inscriptions?
Inscriptions are digital assets created by embedding information in a designated format on the Bitcoin blockchain or other blockchains, and then leveraging specific indexing protocols to interpret this data as assets. The versatility of inscriptions is evident in their forms, ranging from fungible tokens like BRC-20 tokens to non-fungible tokens such as Bitcoin Frogs.
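For concreteness, the "designated format" used by BRC-20 fungible-token inscriptions is a small JSON payload inscribed on Bitcoin and later interpreted by off-chain indexers. The sketch below builds the well-known deploy/mint/transfer payload shapes; the ticker and amounts are only examples, and nothing here touches the Bitcoin network.

```python
import json

# Illustrative BRC-20 payloads: plain JSON text that gets inscribed on Bitcoin
# and is later picked up by off-chain indexers. Ticker and amounts are examples.
deploy = {"p": "brc-20", "op": "deploy", "tick": "ordi",
          "max": "21000000", "lim": "1000"}                     # create a ticker with a supply cap and per-mint limit
mint = {"p": "brc-20", "op": "mint", "tick": "ordi", "amt": "1000"}      # mint within the per-mint limit
transfer = {"p": "brc-20", "op": "transfer", "tick": "ordi", "amt": "100"}  # prepare a transfer inscription

for payload in (deploy, mint, transfer):
    # The inscribed content is just this text; balances exist only in the indexer's view of the chain.
    print(json.dumps(payload))
```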
The traditional role of the Bitcoin blockchain is a public ledger that contains every transaction on the network. The Taproot upgrade completed in November 2021 enhanced the data storage capabilities of the Bitcoin blockchain, allowing for lower-cost uploads of both text and multimedia content. This technological advancement has laid the essential foundation for the development and growth of inscriptions.
Diverging from Ethereum’s reliance on smart contracts for asset issuance and management, BTC inscriptions pivot on the synergy of on-chain recorded data and indexing standards for their issuance and operation. 
Despite these mechanism variances, both Bitcoin inscriptions and Ethscriptions share a common thread: they are both assets based on the input data of public blockchain ledgers. 

Fungible Token Inscriptions on Bitcoin, Source: BTCTOOL

Non-Fungible Token Inscriptions on Bitcoin, Source: BTCTOOL

Based on distinct indexing standards, fungible token inscriptions can be categorized into BRC-20 tokens and ORC-20 tokens.
This article delves into the inscriptions on Bitcoin, particularly focusing on the BRC-20 tokens due to their leading position in the market cap.
Despite the impressive performance of BRC-20 tokens like $ORDI and $SATS, which have seen considerable gains this year, and others such as $RATS experiencing notable increases, the broader landscape for BRC-20 tokens has been less optimistic. A majority of BRC-20 tokens have demonstrated a concerning trend: a rapid decline to zero value, often losing all liquidity within just a few months of their issuance.
Through my active participation in several discussions about inscriptions and BRC-20 tokens, certain attributes have been identified that mark BRC-20 assets as ‘promising’. These include:
A catchy name: A name that resonates with Bitcoin and inscription culture (examples being Ordi and Sats), or one that is animal-themed; for instance, many projects were named after animals during the 2021 memecoin mania.
Originality: It should not be a simplistic imitation of existing memes but should represent an original intellectual property.
Community Engagement: The success of a BRC-20 token often hinges on the presence of influential leaders and a vibrant, high-profile community that actively promotes and expands the project.
Token Distribution: In the early stages, it is crucial that a lower proportion of tokens be held by whales, particularly those who are purely speculators and do not contribute to the promotion of the project.
However, applying these criteria in a practical context presents challenges. 
The appeal of a token’s name is inherently subjective, and even within the niche realm of inscriptions, trends can shift rapidly. What the market favors in terms of ‘naming logic’ today might lose its appeal in just a matter of weeks.
The Value Proposition and Business Innovation of BRC-20 Tokens
There are diverse opinions on the value proposition of BRC-20 tokens. I have summarized the principal viewpoints as follows:
Fair Launch Mechanism: BRC-20 tokens are reputed for their equitable distribution approach, a stark contrast to many mainstream Web3 projects, in which venture capitalists often get the opportunity for early investment, acquiring tokens at lower prices and later selling them at a premium to the public.
Simple Protocol Format and Functionality: The BRC-20 protocol is straightforward and focused, thereby avoiding the risks associated with smart contracts, such as vulnerabilities to rug pulls, blacklisting incidents, or breaches in contract security.
Expanded Utility for Bitcoin: The emergence of BRC-20 assets has expanded the recording capabilities of the Bitcoin network, introducing a new class of assets. This diversification could potentially lead to significant network fees, which may play a crucial role in compensating for the diminishing block rewards on the Bitcoin blockchain, ensuring its long-term sustainability and robustness.
While the previously mentioned aspects of BRC-20 tokens hold significance, in my assessment, they do not fully explain the surge in popularity and the substantial wealth effect associated with these inscriptions this year. 
So, the question arises: are inscriptions represented by BRC-20 tokens a promising business innovation? 
The answer to this hinges on the criteria used to define “promising” in this context.
From the viewpoint of trading platforms, speculators, and professionals in the Bitcoin mining industry, the advent and rising popularity of BRC-20 and similar assets are indeed advantageous innovations. These groups have seen a tangible increase in their revenue streams owing to the emergence of these assets.
However, when we broaden the perspective to evaluate whether BRC-20 tokens and other inscriptions contribute to overarching commercial value — such as lowering the costs of producing goods and services, boosting business efficiency, or optimizing the allocation of resources — my current observation leads me to a more skeptical conclusion. 
In essence, BRC-20 tokens are a new category of Memecoins.
I am not averse to Memecoins. Acknowledging their role as mediums for speculation, it’s clear that they essentially create a negative sum game. In such a market, exchanges and project operators often extract value from ongoing speculative activities. Despite this, Memecoins do fulfill a specific demand by providing the excitement and thrill similar to gambling. This aspect meets the needs of a segment of users who, similar to casino-goers, engage in such activities fully aware of the mathematical unprofitability but are drawn by the allure of speculation and chance.
The enduring appeal of Memecoins can be attributed to a fundamental aspect of human nature — the irresistible pull towards gambling and greed. These impulses are as strong and primal as basic physiological needs like hunger and sexual desire. 
Within the broader spectrum of Memecoins, BRC-20 tokens expand the range of available products rather than representing a radical innovation. 
For example, the much-touted ‘Fair Launch Mechanism’ of BRC-20 tokens, while commendable, isn’t exclusive to this category. A well-designed fair distribution mechanism can also be implemented using smart contracts, suggesting that this feature is not uniquely inherent to inscriptions. 
The debate around the “VCs invest early, retail investors buy in later” model highlights the inherent risk and reward dynamics in early-stage investments. Venture Capitalists (VCs) typically invest in projects at a nascent stage when little more than a business plan exists. This stage is riddled with uncertainties, compelling VCs to commit capital at very low valuations to balance their risks. By the time these projects reach the secondary market, where their tokens are traded publicly, many initial uncertainties have been resolved. For instance, the project might have developed a functional product, accumulated measurable data, entered a more mature market, and attracted interest from exchanges for token listing. This transition from a high-risk venture, primarily existing in whitepapers and pitch decks, to a more concrete entity with tangible assets and prospects, justifies the increase in token prices at this stage.
Drawing a parallel, Memecoins without VC backing, like $SHIB and $PEPE, exhibit a similar trajectory. Initially, they start as mere concepts with unpredictable futures concerning public attention, speculative investment, and potential endorsements from key opinion leaders (KOLs). This high degree of uncertainty renders their tokens extremely cheap. However, as their community grows, resources accumulate, and optimism about their future increases, these uncertainties give way to certainties, leading to a dramatic rise in their value in the secondary market.
Given this context, and given that BRC-20 tokens are a not-so-new variant of Memecoins, what exactly explains their particular surge in popularity, especially in the latter half of this year?
The Mechanism Logic and Driving Forces Behind the Rapid Rise of The BRC-20 Tokens
In understanding the wealth frenzy phenomenon of BRC-20 tokens, it’s essential to dissect it from two angles: the underlying mechanism logic and the driving forces propelling their growth.
The Mechanism Logic
BRC-20 tokens primarily serve as a medium for speculation in the crypto market. Their principal appeal to participants is rooted in the possibility of quickly generating significant wealth. This allure is magnified by a survivorship bias in how outcomes are reported: individuals are more inclined to publicize their successes while remaining silent about their failures. In the speculative world, this leads to a biased public perception. Stories of remarkable profits and wealth accumulation are more prominently shared and discussed, creating an impression of widespread financial success. However, this narrative often overlooks the reality that a majority of market participants may actually experience losses. The disproportionate emphasis on success stories, as opposed to the more common experiences of loss, paints an overly optimistic view of the potential returns from investing in BRC-20 tokens, obscuring the inherent risks involved in such speculative ventures.
In comparison to traditional ERC-20 Memecoins, BRC-20 tokens have demonstrated a more potent capability in generating a “wealth effect.” 
This heightened efficiency stems primarily from the trading mechanisms employed in the early stages of BRC-20 tokens, which continue to some extent today. Most BRC-20 tokens are traded via over-the-counter (OTC) order matching, unlike ERC-20 memecoins, which are primarily traded on Automated Market Maker (AMM) Decentralized Exchanges (DEX) or Centralized Exchanges (CEX). The liquidity in OTC markets is significantly shallower than that in AMM DEX or CEX environments. This means that the capital needed to influence BRC-20 token prices is considerably less compared to ERC-20 memecoins. Especially in a bullish market sentiment, the same amount of buying pressure can lead to more pronounced price surges for BRC-20 tokens.
Indeed, due to the low liquidity inherent in OTC trading, the same level of purchasing power can lead to more pronounced price increases of BRC-20 tokens compared to ERC-20 Memecoins, thereby creating a more intense and rapid “wealth effect.”
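To illustrate the liquidity point with rough numbers, the sketch below compares the price impact of the same buy order against a deep versus a thin constant-product pool; the thin pool stands in for a shallow OTC-style market purely for illustration, and all figures are arbitrary.

```python
def price_impact_constant_product(x_reserve: float, y_reserve: float, buy_amount_y: float) -> float:
    """Percentage price increase of token X after spending `buy_amount_y` of quote token Y
    in a constant-product (x * y = k) pool. Purely illustrative; trading fees are ignored."""
    price_before = y_reserve / x_reserve
    k = x_reserve * y_reserve
    y_after = y_reserve + buy_amount_y      # quote tokens paid into the pool
    x_after = k / y_after                   # tokens left in the pool after the swap
    price_after = y_after / x_after
    return (price_after / price_before - 1) * 100


# The same $100k buy against deep vs thin liquidity (reserves in token units / USD)
deep = price_impact_constant_product(x_reserve=10_000_000, y_reserve=10_000_000, buy_amount_y=100_000)
thin = price_impact_constant_product(x_reserve=500_000, y_reserve=500_000, buy_amount_y=100_000)
print(f"deep pool: +{deep:.1f}%")   # roughly +2%
print(f"thin pool: +{thin:.1f}%")   # roughly +44%
```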
Besides the difference in the cost of market manipulation, BRC-20 tokens, which are not traded on CEX, lack short-selling mechanisms such as perpetual contracts. This further removes a key counterbalance to upward price movements, potentially fueling short-term price surges.
A noteworthy aspect of the BRC-20 market is the substantial participation of what can be termed "dama-type" investors ("dama" being a colloquial Chinese term for middle-aged retail investors). These individuals differ from the typical DeFi enthusiasts or mainstream cryptocurrency investors. Many are drawn into the market by the persuasive efforts of community leaders and grassroots marketing campaigns. These investors tend to follow guidance more readily and are more susceptible to influence, with some even lacking the knowledge of how to sell their holdings. This behavior often leads to a (3,3) pattern, where investors hold onto their tokens without selling, thereby reducing the short-term selling pressure and contributing to price stability or increases.
This year, the enhanced user experience provided by OKX Wallet, a wallet heavily promoted by OKX, has played a crucial role in facilitating the entry of these novice investors into the BRC-20 market. The comprehensive infrastructure of these wallets has streamlined the process, making it easier and more accessible for “dama-type speculators” to engage in the market.
The Driving Forces
The burgeoning interest in BRC-20 inscriptions has emerged as a beacon of speculative activity in an otherwise bearish market, offering a lucrative opportunity for various stakeholders who have strong incentives to sustain and amplify this trend.
Miners: The increase in transactions involving inscriptions has been a boon for miners and mining pools, resulting in significant fee income. Additionally, this trend has spurred the sales of mining machines. As direct beneficiaries of this "Inscription Summer," miners have a vested interest in maintaining and furthering this momentum.
Exchanges: The rise of inscriptions as a new speculative avenue has brought in additional revenue through trading fees. It has also attracted new users to these platforms and effectively activated strategic products like wallets. Exchanges like OKX have reaped considerable benefits from this trend, becoming a notable winner. Following this success, Binance is also entering the fray.
Network Marketers: The advent of inscriptions provides network marketers with fresh narratives and themes to explore, allowing them to monetize their resources anew.
The collective efforts of these three groups — miners, exchanges, and network marketers — create a powerful synergy that could significantly accelerate the momentum of inscriptions into the next phase of their popularity and development.
The Next Stage of The BRC-20 Market
As we contemplate the future of BRC-20 tokens, it’s essential to envision a roadmap that can further expand and intensify the already burgeoning market. Several key developments could play a pivotal role in this next phase:
Expansion by Secondary CEXs: The continued listing of inscription tokens beyond the likes of $ORDI and $SATS could be a significant catalyst. Such expansion would not only elevate the status of other Memecoins but also potentially draw traditional crypto investors into the speculative orbit of BRC-20 tokens.
Involvement of Leading CEXs like Binance: Their listing of a broader range of inscription tokens could channel substantial mainstream crypto funds into the BRC-20 ecosystem.
Innovations Beyond Pure Memecoins: The introduction of BRC-20 Memecoins that incorporate Ponzi-like mechanisms could create even more pronounced wealth effects.
However, the crucial question remains: Will the inscription market successfully transition to an even hotter phase, or will it experience a gradual cooling, or possibly an abrupt crash?
To understand this, it is vital to explore the possible scenarios and conditions that could lead to a potential collapse of the BRC-20 market. 
Theoretical Pathways to the Collapse of the Inscription Market
The initial wealth boom of BRC-20 inscriptions has been driven by a confluence of factors, notably the low liquidity inherent in OTC trading, a high proportion of “dama-type” investors, and the engagement of various stakeholders with vested interests.
This low liquidity, while a catalyst for rapid price surges, also poses a risk of equally sharp declines during market downturns. This vulnerability arises from the potential scarcity of buyers when the tide turns. A critical point often overlooked by investors is that a large portion of the perceived wealth in their portfolios, sometimes as high as 95%, may be nothing more than “illusory market value.” This valuation is contingent on the ability to withdraw positions swiftly and the assumption that there will always be sufficient market liquidity to facilitate such withdrawals. 
The potential collapse of the BRC-20 market could be attributed to several key factors:
Prevalence of Short-Lived Rug-Pull Projects: In a market with finite users and capital, the primary competitors for BRC-20 projects are the market-making forces behind other similar projects. As capital and user attention shift to new BRC-20 projects, the resources available to support existing tokens diminish. This intense competition often prompts market makers, or 'whales', to exit their positions increasingly early to secure profits. Consequently, the window for users to achieve speculative gains narrows progressively. When the market narrative shifts towards loss rather than gain, and negative sentiment and ridicule become the norm, users are less likely to participate, resulting in a liquidity crunch in the BRC-20 market.
Slowed User Acquisition: The first point could make CEXs more hesitant to list new BRC-20 tokens. This reluctance, in turn, may slow the inflow of new users and capital into the market, failing to match the rapid movements of market makers.
Transition of the Liquidity Battleground to CEXs: Once BRC-20 tokens are listed on CEXs, the primary arena for liquidity shifts to these centralized platforms. This transition introduces greater resistance to price manipulation due to deeper market liquidity. Additionally, the availability of short-selling tools like derivatives on CEXs can significantly diminish the wealth effect post-listing if there is no change in the underlying mechanism.

The cumulative effect of these dynamics can create a feedback loop, reinforcing each factor and potentially leading to a collapse pattern in the BRC-20 market. 
Conclusion
The unyielding forces of greed and the inherent allure of gambling continue to be powerful drivers in the cryptocurrency industry. The recent phenomenon of the Inscription Summer represents a fascinating experiment within this context, and as this landscape continues to unfold, I remain committed to closely monitoring and analyzing these developments.
For further insights into Memecoins, you can read Mastering the Dip: Dodging Meme Coin Mania.
Will Rapidly-Growing Morpho Become A Formidable Challenger to Aave?

By Alex Xu, Research Partner at Mint Ventures

Preface

Having witnessed the volatility of crypto markets, from the bullish surge of 2020 to the bearish grind of 2023, we have come to realize that within the Web3 business world, the bedrock of established business models sits firmly within the domain of DeFi. Central to this ecosystem are decentralized exchanges (DEXs), lending protocols, and stablecoins, with derivatives also gaining considerable traction. Notably, these sectors have exhibited a remarkable degree of resilience even amidst a bear market.

At Mint Ventures, our analysis has delved deeply into the intricacies of DEXs and stablecoins, covering a wide array of ve(3,3) projects such as Curve, Trader Joe, Syncswap, iZUMi, and Velodrome, as well as stablecoin ventures including MakerDao, Frax, Terra, Liquity, Angle, Celo, and others. In this latest edition of Clips, we pivot our focus to lending protocols, casting a spotlight on the new player Morpho, which has seen rapid growth in business metrics over the past year. This clip walks through Morpho's operations and strategic expansion, alongside its freshly introduced lending infrastructure, Morpho Blue, examining the current DeFi lending market, Morpho's innovative approach to decentralized credit, and the potential market shake-up instigated by Morpho Blue. Specifically:

How does the present landscape of the DeFi lending sphere appear?
What does Morpho's business encompass, which market inefficiencies is it addressing, and what progress has it charted so far?
What are the prospects for the newly launched Morpho Blue? Could it challenge the leading positions of Aave and Compound, and what broader impacts could we anticipate?

The insights offered herein represent a snapshot of my analysis, framed by the knowledge available at the time of writing. There may be factual inaccuracies or biases. This article is intended for discussion purposes only, and feedback is welcomed.

The Landscape of Decentralized Lending Protocols

Organic Demand Rises as Mainstream Focus, Eclipsing Ponzi-Like Models

Decentralized lending has consistently ranked at the forefront in terms of capital capacity, with its Total Value Locked (TVL) now surpassing that of decentralized exchanges (DEXs), making it the premier avenue for capital allocation within the DeFi sector.

Source: https://defillama.com/categories

Decentralized lending is also one of the rare categories in the Web3 domain to have achieved Product-Market Fit (PMF). Although many projects aggressively fueled lending activity with token incentives during the 'DeFi summer' of 2020-2021, such practices have waned in the wake of the bearish market turn.

Aave, a vanguard in the lending protocol sector, has charted a course of robust fiscal performance. The data below underscores a pivotal trend: Aave's protocol revenue has consistently eclipsed its token incentive expenditures since December 2022. A September snapshot shows protocol earnings of $1.6 million, towering over the relatively modest $230,000 in token incentives. Moreover, Aave's token incentives are mainly aimed at bolstering the staking of $AAVE as a safety module, which secures the protocol against bad debt and compensates the treasury when it falls short, rather than at incentivizing deposit and borrowing behaviors. 
Consequently, Aave's deposit and borrowing activities are enjoying a period of 'organic' growth, not sustained by liquidity mining drives that often echo the unsustainable mechanics of Ponzi frameworks.

Aave's Monthly Protocol Revenues vs. Monthly Claimed Token Incentives. Source: https://tokenterminal.com/

Parallel strides are evident at Venus, a heavyweight lending protocol within the BNBChain ecosystem, which has also seen its protocol revenue surpass its incentive expenditures since March 2023. This shift marks a transition towards a healthier operational framework, weaning off the need for aggressive deposit and borrowing incentives.

Venus' Monthly Protocol Revenues vs. Monthly Claimed Token Incentives. Source: https://tokenterminal.com/

However, many lending protocols continue to rely heavily on token subsidies to fuel supply and demand within their ecosystems, with the value of incentives provided for lending activities far exceeding the revenue they can generate. For instance, Compound V3 continues to provide $COMP token subsidies for deposit and borrowing activities.

Nearly half of the USDC deposit APR on Compound V3 (Ethereum) is sustained by token subsidies. Source: https://app.compound.finance/markets/weth-mainnet

84% of the USDC deposit APR on Compound V3 (Base) is sustained by token subsidies. Source: https://app.compound.finance/markets/weth-basemainnet

If Compound relies heavily on token subsidies to maintain its market share, then another protocol, Radiant, could be described as a purely Ponzi-like structure. Delving into the lending market page on Radiant's platform, two aberrant phenomena are immediately conspicuous:

Source: https://app.radiant.capital/

Firstly, there is a significant disparity in borrow APY compared to market standards. Whereas the typical borrow APY for stablecoins is generally within the 3-5% range, Radiant's borrowing rates are markedly elevated, averaging 14-15%. This extends to other assets in Radiant's ecosystem, where borrow APYs are as much as 8-10 times higher than those of mainstream protocols.

Secondly, Radiant's platform prominently promotes 'looping loans'. This mechanism allows users to deposit a single type of asset as collateral and then engage in a continuous cycle of depositing and borrowing, with the primary objective of artificially inflating the user's total deposit-borrow volume. This, in turn, maximizes the mining incentives accrued in Radiant's native token, $RDNT. In essence, the approach can be interpreted as an indirect method employed by the Radiant team to distribute RDNT tokens to users in exchange for the fees collected on borrowings.

At the heart of Radiant's economic model lies a critical issue: the primary driver of its fee generation is not authentic lending activity but rather the pursuit of RDNT tokens. This creates a precarious economic structure that resembles a Ponzi scheme, where the system is essentially self-feeding. In this environment, there are scarcely any genuine financial consumers engaging in lending. The practice of loop loans, where a user acts as both depositor and borrower of the same asset, further complicates the scenario: it creates an economic cycle in which the source of RDNT dividends is the user themselves, lacking any external or organic financial input.
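A rough numeric sketch of how looping inflates booked volume: with a loan-to-value ratio L, repeatedly re-depositing borrowed funds pushes total booked deposits toward 1/(1-L) times the user's real capital, even though no external capital enters the system. The 80% LTV below is an assumed figure for illustration, not Radiant's actual parameter.

```python
def looped_volumes(initial_deposit: float, ltv: float, loops: int) -> tuple[float, float]:
    """Total booked deposits and borrows after `loops` rounds of deposit -> borrow -> re-deposit.
    Simplified model: interest, fees and price moves are ignored."""
    total_deposits, total_borrows = 0.0, 0.0
    amount = initial_deposit
    for _ in range(loops):
        total_deposits += amount          # deposit the current amount
        borrowed = amount * ltv           # borrow against it at the assumed LTV
        total_borrows += borrowed
        amount = borrowed                 # re-deposit the borrowed funds next round
    return total_deposits, total_borrows


deposits, borrows = looped_volumes(initial_deposit=100_000, ltv=0.80, loops=10)
print(f"deposits booked: ${deposits:,.0f}")  # ~ $446,313 booked from $100,000 of real capital
print(f"borrows booked:  ${borrows:,.0f}")   # ~ $357,050
# Theoretical ceiling as loops -> infinity: 100_000 / (1 - 0.80) = $500,000 of booked deposits
```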
In this structure, the only entity assured of risk-free profits is the platform operator, which garners gains from transaction fees, appropriating 15% of the interest income. While the Radiant team has temporarily mitigated the immediate risks of a financial implosion, typically associated with a drop in the price of $RDNT, through the implementation of the dLP staking mechanism, the long-term viability of this model remains in question. Unless there is a strategic pivot from this Ponzi-like scheme to a more sustainable business model, the potential for a systemic collapse, a 'death spiral', remains a significant concern.

In the broader decentralized lending market, however, exemplified by leading projects like Aave, there is a discernible shift away from heavy reliance on subsidies to sustain operational revenues and towards more sustainable, healthier business practices.

The following graph shows active loan volume in the Web3 lending market from May 2019 to October 2023. Starting from a modest base in the hundreds of thousands of dollars, the market peaked at $22.5 billion in November 2021, fell to $3.8 billion in November 2022, and currently stands at around $5 billion. This trend indicates a gradual recovery from its lowest point, showcasing remarkable resilience and business adaptability within the sector, even amid challenging market conditions.

Source: https://tokenterminal.com/terminal/markets/lending

Defined Competitive Edges and Elevated Market Concentration

Compared to the DEX market, which exhibits a more competitive landscape, the lending sector, a cornerstone of DeFi, benefits from stronger competitive moats, a fact that becomes evident when considering several key factors:

1. Stability in Market Share: The chart below reviews active borrowing volumes among lending protocols as a percentage of total market share from May 2019 to October 2023. Since mid-2021, following an intensified push by Aave, its market share has consistently hovered within the 50-60% bracket, showcasing remarkable stability and resilience. Compound, despite a compression in its market share, continues to maintain a robust and stable position.

Source: https://tokenterminal.com/terminal/markets/lending

In stark contrast, the DEX sector has experienced more volatile shifts in market share. Uniswap, following its introduction, swiftly clinched nearly 90% of the trading volume market share. This dominance was then challenged by the rapid ascent of platforms like Sushiswap, Curve, and Pancakeswap, eroding Uniswap's hold to approximately 37% at one point, although it has since recovered to about 55%. Furthermore, the DEX sector has a significantly larger number of active projects than the lending space.

Source: https://tokenterminal.com/terminal/markets/lending

2. Enhanced Profitability in Lending Projects: As mentioned in the previous section, projects like Aave have managed to generate positive cash flow without resorting to subsidizing borrowing activities, reporting consistent monthly interest spread revenues of approximately $1.5-2 million. This contrasts sharply with the situation of most DEX projects.
For instance, Uniswap, despite its market prominence, has not activated protocol-level fees (only frontend fees are active), and many DEX projects operate at an effective loss, where the cost of token emissions for liquidity incentives outweighs the revenue generated from protocol fees.

The competitive edges of the top lending protocols can be broadly attributed to their brand strength in terms of security, which can be further broken down into two points:

Long History of Secure Operations: Since the DeFi Summer of 2020, there has been a proliferation of fork projects inspired by Aave or Compound across various blockchain networks. However, many of these new entrants encountered security attacks or substantial bad-debt losses soon after launch. In contrast, Aave and Compound have maintained a clean track record, free from significant attacks or insurmountable bad-debt incidents. This history of safe, reliable operation in a real-world network environment serves as a vital assurance of security for depositors. Newer lending protocols, even those offering potentially more innovative features and higher short-term APYs, struggle to win user trust, particularly from large-scale investors (whales), in the absence of a proven, long-term operational record.

Sufficient Security Budgets: The top lending protocols, thanks to their higher commercial revenues and more substantial treasury funds, are able to allocate significant resources to security audits and asset risk control. This is crucial both for the development of new features and for the introduction of new assets.

In summary, the lending market has demonstrated organic demand and a sustainable business model, characterized by a relatively concentrated market share.

Morpho's Business and Operational Status

The Interest Rate Optimizer

Morpho is a peer-to-peer lending protocol, also known as a lending pool optimizer, built on top of Aave and Compound. It is designed to address the inefficient capital utilization of pool-based lending protocols like Aave, where there is often a mismatch between the total funds deposited and the total funds borrowed.

Morpho's value proposition is straightforward and impactful: to provide better interest rates for both lenders and borrowers, meaning higher returns on deposits and lower interest rates on loans.

The capital inefficiency of the pool-to-peer lending model, as employed by platforms like Aave and Compound, lies in its inherent mechanism: the total volume of deposited funds (the pool) consistently exceeds the total volume of loaned funds. Typically, there might be a total of 1 billion $USDT in deposits in a money market, but only 600 million $USDT lent out.

For depositors, this means their returns are diluted. Although the idle 400 million USDT is not directly loaned out, it still forms part of the pool that shares the interest generated by the 600 million USDT actively loaned. Consequently, the interest accrued is distributed across the entire 1 billion USDT deposited, leading to lower earnings per depositor. On the flip side, despite utilizing only a fraction of the total pool, borrowers bear interest costs priced as if they were drawing on the entire pool's funds, resulting in a higher interest burden per borrower. This is the mismatch between deposited and borrowed funds.
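A back-of-the-envelope model of this dilution (ignoring reserve factors and the protocol's actual interest-rate curve): interest paid on the 600 million USDT borrowed is spread across the full 1 billion USDT deposited, so the pool supply APY is roughly the borrow APY scaled by utilization. The 6.17% borrow rate is taken from the Aave V2 DAI example in the next section purely to show the order of magnitude.

```python
def pool_supply_apy(borrow_apy: float, total_deposits: float, total_borrows: float) -> float:
    """Simplified pooled-lending model: interest paid by borrowers is spread across all deposits.
    Ignores reserve factors and the protocol's actual rate curve."""
    utilization = total_borrows / total_deposits
    return borrow_apy * utilization


# Numbers from the example above: 1 billion USDT deposited, 600 million borrowed
borrow_apy = 0.0617   # 6.17%, borrowed from the DAI example below for illustration
supply_apy = pool_supply_apy(borrow_apy, total_deposits=1_000_000_000, total_borrows=600_000_000)
print(f"utilization:      {600_000_000 / 1_000_000_000:.0%}")   # 60%
print(f"pool supply APY:  {supply_apy:.2%}")                    # ~3.70%, close to the 3.67% quoted for Aave
print(f"spread vs borrow: {borrow_apy - supply_apy:.2%}")       # the gap Morpho tries to close
```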
Let's take the rate optimizer module on top of Aave V2, which currently accounts for the largest share of Morpho's deposit business, as an example of how Morpho's rate optimization addresses this capital inefficiency.

Deposit: Bob deposits 10,000 DAI into Morpho. Morpho then places these funds into the Aave V2 market. At this point, the deposit interest rate offered by Aave for DAI is 3.67%.

Borrowing and Collateral: On the other side, Alice wants to borrow 10,000 DAI. To do so, she deposits 20 ETH as collateral into Morpho. Morpho places Alice's collateral in the Aave V2 market, mirroring the process it uses for deposits.

Loan Matching: Morpho redeems the 10,000 DAI previously deposited by Bob in Aave and lends it directly to Alice. This direct matching means Bob's deposit is fully utilized, with no idle funds, while Alice pays interest only on the DAI she actually borrows rather than against the pool as a whole. As a result of this direct peer-to-peer (P2P) matching, both parties enjoy optimized rates: Bob earns a higher deposit APY of 4.46%, compared to the Aave pool APY of 3.67%, and Alice benefits from a lower borrowing APY of 4.46%, as opposed to Aave's pool APY of 6.17%.

Note: In this scenario, the specific peer-to-peer APY of 4.46%, and whether it sits closer to the lower limit (deposit APY) or the upper limit (borrow APY), is determined by Morpho's internal parameters, which are controlled by community governance (a simplified numeric sketch of this rate determination is shown below).

Solving Liquidity Mismatches: Suppose Bob, having deposited 10,000 DAI, decides to withdraw his funds while Alice, who borrowed them, has not yet repaid her loan. In this case, if no other depositors are available to match, Morpho uses the 20 ETH Alice deposited as collateral to borrow the equivalent amount (principal plus accrued interest, slightly over 10,000 DAI) from Aave, allowing Bob to withdraw his deposit.

Matching Order: Morpho's matching of funds follows a "larger funds first" principle, meaning the platform prioritizes matching larger deposits and borrows before smaller ones. The rationale is to keep the proportion of gas costs to the total transaction value low; if the gas cost required to execute a match is disproportionately high relative to the amount being matched, the system will not proceed with the match.

Source: https://aavev2.morpho.org/?network=mainnet

The essence of Morpho's business model is its innovative use of the existing capital pools of Aave and Compound as foundational elements. By acting as a rate optimization service, Morpho effectively matches depositors and borrowers. The ingenuity of this model lies in leveraging the composability of the DeFi ecosystem: Morpho has attracted user funds in a way that can be likened to "making something out of nothing." This approach is particularly appealing to users for several reasons:

At a basic level, users on Morpho are guaranteed to earn returns or incur borrowing costs at least equivalent to those on Aave and Compound; whenever Morpho successfully matches depositors and borrowers directly, depositors earn more and borrowers pay less.

Morpho's product architecture, built atop Aave and Compound, also replicates the risk parameters of these platforms. By allocating funds within Aave and Compound, Morpho inherits not just the operational mechanics but also the brand reputation of these two protocols.
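As a minimal sketch of the rate matching described above, assume, per the note, that the matched P2P APY is simply placed between the pool's supply and borrow APYs by a governance-set weight. The `p2p_cursor` name and its value are hypothetical, chosen so that the output reproduces the 4.46% figure in the example; Morpho's actual parameterization may differ.

```python
def p2p_apy(pool_supply_apy: float, pool_borrow_apy: float, p2p_cursor: float) -> float:
    """Place the matched (P2P) rate between the pool supply and borrow APYs.
    `p2p_cursor` in [0, 1] is a hypothetical governance-set weight: 0 = supply side, 1 = borrow side."""
    return pool_supply_apy + p2p_cursor * (pool_borrow_apy - pool_supply_apy)


supply, borrow = 0.0367, 0.0617              # Aave V2 DAI rates from the example above
matched = p2p_apy(supply, borrow, p2p_cursor=0.316)
print(f"Bob (lender):     {supply:.2%} -> {matched:.2%}")   # 3.67% -> ~4.46%
print(f"Alice (borrower): {borrow:.2%} -> {matched:.2%}")   # 6.17% -> ~4.46%
```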
By allocating funds within Aave and Compound, Morpho inherits not just the operational mechanics but also the brand reputation of these two protocols.  The effectiveness of Morpho’s design and its clear value proposition have been reflected in its remarkable growth. Within just over a year of its launch, Morpho has managed to accumulate nearly $1,000,000,000 in deposits, ranking just behind giants like Aave and Compound in terms of deposit volumes.  Business Metrics and Tokenomics Business Metrics The following chart illustrates Morpho’s total supply volume (blue line), total borrow volume (light brown line), and matched amounts (dark brown).  Source: https://analytics.morpho.org/ These metrics collectively paint a picture of the continuous growth in all business scales of Morpho. The deposit matching rate stands at 33.4% and the borrow fund matching rate reaches 63.9, which are quite impressive figures. Tokenomics Source: Morpho Morpho has a total token supply capped at 1 billion tokens, with 51% of the total token supply allocated to the community, 19% of the token sold to investors, 24% of the token supply is held by the founders, the development company Morpho Labs, and the operating entity, Morpho Association, the remaining percentage is earmarked for advisors and contributors. The Morpho token, though already issued and being utilized for governance voting and project incentives, is in a non-transferable state. Consequently, it does not have a secondary market price. This means that token holders can engage in governance decisions but are unable to sell their holdings.   Unlike some projects that have a predefined structure for token distribution and incentives (like Curve), Morpho’s approach to token incentives is more dynamic and flexible. Incentives are determined on a quarterly or monthly basis, allowing the governance team to adjust the intensity and strategies of these incentives in response to market conditions.  This pragmatic and flexible method of token incentive distribution might become a more mainstream model in the Web3 business world.  Morpho’s approach to incentives is comprehensive, targeting both deposit and borrowing behaviors. Over the past year, Morpho has distributed 30.8 million tokens for incentives, which is about 3.08% of the total token supply. This may seem like a modest proportion. Moreover, as indicated in the graph below, Morpho’s official token expenditure shows a trend of decreasing spending on incentives. Interestingly, this reduction has not negatively impacted the growth rate of Morpho’s operations.   This transition is a positive signal for the platform, indicating that Morpho has a sufficiently full Product-Market Fit (PMF), with user demand becoming increasingly organic. With 51% of the community token share still having approximately 48% remaining, Morpho retains a substantial budget and allows flexibility and room for strategic allocation in future growth phases or new market ventures. Despite this, Morpho has not yet started charging for its services.   Team and Financing The core team of Morpho originates from France, primarily located in Paris. The key team members have been publicly identified, and the three founders have their roots in the telecommunications and computer industries, boasting rich backgrounds in blockchain entrepreneurship and development. Morpho has completed two rounds of funding to date: a $1.3 million seed round in October 2021 and an $18 million Series A round in July 2022, led by investors A16z, Nascent, and Variant. 
Source: Morpho If the aforementioned financing amounts correspond to the officially disclosed 19% investor share, it can be inferred that the total valuation of the project stands at around $100 million.  Morpho Blue and Its Potential Influence What is Morpho Blue? Put simply, Morpho Blue acts as a permissionless lending protocol layer. This sets it apart from platforms like Aave and Compound, as it opens up a broader spectrum of lending possibilities. Anyone can use Morpho Blue to create their own lending market based on it. The dimensions available for builders to choose from include: Asset Collateralization OptionsChoice of Assets to Lend OutOracle Selection for Price FeedsDetermining Loan-to-Value(LTV) and Liquidation Loan-to-Value(LLTV) ratiosImplementing an Interest Rate Model(IRM)  What value will Morpho Blue bring? As outlined in its official documentation, the characteristics can be summarized as follows: TrustlessMorpho Blue is designed to be immutable. It means once its code is deployed, it cannot be modified, reflecting a commitment to minimal governance. With just 650 lines of Solidity code, Morpho Blue stands out for its simplicity and security.EfficientUsers have the flexibility to choose higher Loan-to-Value (LTV) ratios and more favorable interest rates.Morpho Blue sidesteps the need to pay fees for third-party audit and risk management services. Employing a singleton smart contract with a simple code structure, Morpho Blue significantly cuts down on gas costs, achieving a reduction of about 70%.  *Note: A singleton smart contract refers to a protocol using a single contract for execution, rather than a combination of multiple contracts. Uniswap V4 also adopted a singleton contract approach. FlexibleIn Morpho Blue, both market building and risk management aspects, including oracles and lending parameters, are permissionless. This deviates from the uniform model adopted by platforms like Aave and Compound, where the entire platform adheres to a standard set of protocols and rules governed by a DAO.Morpho Blue is tailored to be developer-friendly, integrating a range of contemporary smart contract patterns. It offers account management features that facilitate gasless interactions and account abstraction. Additionally, the platform provides free flash loans, enabling users to access assets across all its markets in a single call, with the condition that the loan is repaid within the same transaction. Morpho Blue employs a product philosophy akin to that of Uniswap V4, positioning itself as a foundational layer for a wide array of financial services. This approach involves opening up the modules above this foundational layer, thereby enabling various parties to come in and offer their distinct services.  Morpho Blue’s approach differs from Aave in a key aspect: while Aave’s lending and borrowing processes are permissionless, the decisions regarding which assets can be borrowed or lent, the nature of risk control rules (whether conservative or aggressive), the selection of oracles, and the setting of interest and liquidation parameters, are all governed and managed by the Aave DAO and the various service providers like Gauntlet and Chaos, who monitor and manage over 600 risk parameters on a daily basis. Morpho Blue functions more like an open lending operating system. It allows users to construct their own optimal lending combinations on its platform, much like they would on Aave. 
Additionally, professional risk management firms such as Gauntlet and Chaos have the opportunity to seek partnerships in the market, providing their risk management expertise to others and earning corresponding fees. From my perspective, the fundamental value proposition of Morpho Blue lies not just in its trustlessness, efficiency, or flexibility, but primarily in its establishment of a free market for lending. This platform enables collaboration among all participants in the lending market, thus offering a more diverse and enriched array of market options for clients at every stage of their lending journey. Will Morpho Blue Pose a Threat to Aave? Possibly. Morpho stands distinct among the myriad competitors that have emerged to challenge Aave, having accrued multiple advantages over the past year: Morpho, with its managed capital hitting the $1 billion mark, is fast approaching the league of Aave, which oversees a substantial $7 billion in capital management. While these funds are presently employed within the Morpho Interest Rate Optimizer, there are many ways to channel them into new features and functionalities.Morpho, recognized as the quickest expanding lending protocol in the past year and with its token still awaiting official release, presents a realm of possibilities. The introduction of major new features by Morpho could readily captivate user interest and drive participation.Morpho boasts a substantial and adaptable token budget, well-equipped to entice early-stage users through attractive subsidy offerings. Morpho’s consistent operational track record, coupled with its significant fund volume, has enabled it to establish a notable degree of security branding.  This doesn’t automatically put Aave on a back foot in upcoming competitive landscapes. Many users might not have the bandwidth or interest to sift through a plethora of lending choices. The lending solutions curated under the centralized governance of the Aave DAO might still hold the upper hand, continuing to be the go-to choice for the majority of users. Second, Morpho Rate Optimizer largely inherits the security credentials of Aave and Compound, which has gradually put more money at ease. However, Morpho Blue, as a distinct and novel product featuring its own codebase, may initially encounter cautiousness from whales – to take a beat before diving in with full trust. This cautious approach is understandable, given the still-lingering sting from security mishaps in earlier permissionless lending markets like Euler, which remain top of mind in the crypto community. Furthermore, Aave is fully capable of developing a set of functions similar to the Morpho Rate Optimizer on its existing framework to meet user demands for increased capital matching efficiency, potentially pushing Morpho out of the P2P lending market. However, that possibility seems unlikely at the moment, given that Aave also issued Grants to NillaConnect, a Morpho-like P2P lending product, this past July, instead of creating their own. Lastly, the lending business model adopted by Morpho Blue doesn’t markedly diverge from the existing paradigms set by Aave. Aave is also capable to observe, monitor, and potentially replicate any effective lending mechanisms that Morpho Blue demonstrates. But in any case, the launch of Morpho Blue is set to introduce an open lending testbed, paving the way for inclusive participation and combination opportunity across the full spectrum of the lending process.  
Could these newly interconnected lending collectives emerging from Morpho Blue spearhead a solution that stands a chance against Aave? We will see what happens.

Will Rapidly-Growing Morpho Become A Formidable Challenger to Aave?

By Alex Xu, Research Partner at Mint Ventures
Preface
Having witnessed the volatility of crypto markets, from the bullish surge of 2020 to the bearish grind of 2023, we’ve come to realize that within the Web3 business world, the bedrock of established business models is found firmly within the domain of DeFi. Central to this ecosystem are decentralized exchanges (DEXs), lending protocols, and stablecoins, with derivatives also gaining considerable traction in the space. Notably, these sectors have exhibited a remarkable degree of resilience even amidst a bear market.
At Mint Ventures, our analysis has deeply delved into the intricacies of DEXs and stablecoins, covering a wide array of ve(3,3) projects such as Curve, Trader Joe, Syncswap, iZUMi, and Velodrome, as well as stablecoin ventures including MakerDao, Frax, Terra, Liquity, Angle, Celo, and others. In this latest edition of Clips, we pivot our focus to the lending protocols, casting a spotlight on the new player Morpho, which has seen rapid growth in business metrics over the past year.
This clip walks through Morpho's operations and strategic expansion, alongside its freshly introduced lending infrastructure, Morpho Blue. Our inquiry covers the current state of the DeFi lending market, Morpho's innovative approach to decentralized credit, and the potential market shake-up instigated by Morpho Blue, addressing the following questions:
How does the present landscape of the DeFi lending sphere appear?
What operational facets does Morpho encompass, which market inefficiencies is it addressing, and what progress has it charted in its business voyage?
What are the prospects for the newly launched Morpho Blue? Could it challenge the leading positions of Aave and Compound, and what broader impacts could we anticipate?
The insights offered herein represent a snapshot of my analysis, framed by the knowledge available at the time of writing. There may be factual inaccuracies or biases. This article is intended for discussion purposes only, and feedback is welcomed.
The Landscape of Decentralized Lending Protocols
Organic Demand Rises as Mainstream Focus, Eclipsing Ponzi-Like Models
Decentralized lending has consistently ranked at the forefront regarding capital capacity, with its Total Value Locked (TVL) now surpassing that of decentralized exchanges (Dex), becoming the premier avenue for capital allocation within the DeFi sector.
Source: https://defillama.com/categories

Decentralized lending is also one of the rare categories in the Web3 domain to have achieved "Product-Market Fit" (PMF). During the 'DeFi summer' of 2020-2021, many projects aggressively fueled lending activity with token incentives, but such practices have waned in the wake of the bearish market turn.
In the decentralized finance realm, Aave, a vanguard in the lending protocol sector, has charted a course of robust fiscal performance. Data delineated below underscores a pivotal trend: Aave's protocol revenue has consistently eclipsed its token incentive expenditures since December 2022. A September snapshot reveals protocol earnings reaching $1.6 million, towering over the relatively modest sum of $230,000 in token incentives. Moreover, Aave's token incentives are mainly aimed at bolstering the staking of $AAVE as a safety module that secures the protocol against bad debt and backstops the treasury when it falls short, rather than at incentivizing deposit and borrowing behaviors. Consequently, Aave's deposit and borrowing activities are enjoying a period of 'organic' growth, not sustained by liquidity mining drives that can often echo the unsustainable mechanics of Ponzi frameworks.
Aave’s Monthly Protocol Revenues vs. Monthly Claimed Token Incentives
Source: https://tokenterminal.com/

Parallel strides are evident within Venus, a heavyweight lending protocol within the BNBchain ecosystem, which has also seen its protocol revenue surpass its incentive expenditures since March 2023. This shift marks a transition towards a healthier operational framework, weaning the protocol off the need for aggressive deposit and borrowing incentives.
Venus’ Monthly Protocol Revenues vs. Monthly Claimed Token Incentives
Source: https://tokenterminal.com/

However, many lending protocols continue to rely heavily on token subsidies to fuel supply and demand within their ecosystems, where the value of the incentives provided for lending activities far exceeds the revenue they can generate.
For instance, Compound V3 continues to provide $COMP token subsidies for deposit and borrowing activities.
Nearly half of the USDC deposit APR on Compound V3(Ethereum) is sustained by token subsidies.
Source: https://app.compound.finance/markets/weth-mainnet

84% of the USDC deposit APR on Compound V3(Base) is sustained by token subsidies.
Source: https://app.compound.finance/markets/weth-basemainnet

If Compound relies heavily on token subsidies to maintain its market share, then another protocol, Radiant, could be described as having an outright Ponzi-like structure.
Delving into the lending market page on Radiant’s platform, two aberrant phenomena are immediately conspicuous:
Source: https://app.radiant.capital/

Firstly, there is a significant disparity in borrow APY when compared to traditional market standards. Whereas typical borrow APY for stablecoins is generally within the 3-5% range, Radiant’s borrowing rates are markedly elevated, averaging between 14-15%. This trend extends to other assets within Radiant’s ecosystem, where the borrow APYs are observed to be as much as 8-10 times higher than those in mainstream protocols.
Secondly, Radiant's platform strategy prominently features the promotion of 'looping loans'. This mechanism allows users to deposit a single type of asset as collateral, subsequently engaging in a continuous cycle of depositing and borrowing. The primary objective of this strategy is to artificially inflate the user's total deposit-borrow volume. This, in turn, maximizes the mining incentives accrued in Radiant's native token, $RDNT. In essence, this approach could be interpreted as an indirect method employed by the Radiant team to distribute $RDNT tokens to its users, in exchange for the fees collected on borrowings.
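To make the volume inflation concrete, here is a minimal sketch of a looping position, assuming a hypothetical 80% loan-to-value cap and ignoring borrowing costs, fees, and price risk; actual Radiant parameters differ.

```python
# Minimal sketch: how "looping" a single deposit inflates reported volume.
# Assumes a hypothetical 80% LTV cap; actual Radiant parameters differ,
# and borrowing fees, interest, and price risk are ignored.

def loop_volumes(initial_deposit: float, ltv: float, rounds: int):
    total_deposited = 0.0
    total_borrowed = 0.0
    amount = initial_deposit
    for _ in range(rounds):
        total_deposited += amount          # deposit the current amount as collateral
        borrowed = amount * ltv            # borrow against it up to the LTV cap
        total_borrowed += borrowed
        amount = borrowed                  # re-deposit the borrowed amount next round
    return total_deposited, total_borrowed

deposited, borrowed = loop_volumes(initial_deposit=10_000, ltv=0.80, rounds=10)
print(f"Capital in: $10,000 -> reported deposits ${deposited:,.0f}, "
      f"reported borrows ${borrowed:,.0f}")
# With enough rounds the totals approach 10,000 / (1 - 0.8) = $50,000 of deposits:
# one dollar of real capital generates several dollars of incentivized volume.
```

One dollar of real capital can thus report several dollars of incentivized deposit and borrow volume, which is precisely what maximizes the $RDNT emissions a looping user collects.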
At the heart of Radiant’s economic model lies a critical issue: the primary driver of its fee generation is not the authentic lending activity but rather the pursuit of RDNT tokens. This creates a precarious economic structure that resembles a Ponzi scheme, where the system is essentially self-feeding. In this environment, there aren’t genuine financial consumers engaging in lending practices. The practice of loop loans, where a user acts as both depositor and borrower of the same asset, further complicates the scenario. This model is inherently flawed as it creates an economic cycle where the source of RDNT dividends is the user themselves, lacking any external or organic financial input. In this structure, the only entity assured of risk-free profits is the platform operator, which garners gains from transaction fees, appropriating 15% of the interest income. While the Radiant project team has temporarily mitigated the immediate risks of a financial implosion – typically associated with a drop in the price of $RDNT – through the implementation of the dLP staking mechanism, the long-term viability of this model remains in question. Unless there is a strategic pivot from this Ponzi scheme to a more sustainable business model, the potential for a systemic collapse – a ‘death spiral’ – remains a significant concern.
However, in the broader decentralized lending market, exemplified by leading projects like Aave, there’s a discernible shift away from heavy reliance on subsidies to sustain operational revenues. Instead, there’s a move towards more sustainable and healthier business practices. 
The following graph shows the active loan volume in the Web3 lending market from May 2019 to October 2023. Starting from a modest base in the hundreds of thousands, the market peaked at $22.5 billion in November 2021, experienced a downturn to $3.8 billion in November 2022, and is currently valued at around $5 billion. This trend indicates a gradual recovery from its lowest point, showcasing remarkable resilience and business adaptability within the sector, even amid challenging market conditions.
Source: https://tokenterminal.com/terminal/markets/lending

Defined Competitive Edges and Elevated Market Concentration
Compared to the DEX market, which exhibits a more competitive landscape, the lending sector, a cornerstone of DeFi infrastructure, benefits from stronger competitive moats, a fact that becomes evident when considering several key factors:
1. Stability in Market Share: The chart below reviews active borrowing volumes across lending protocols, calculated as a percentage of total market share, from May 2019 to October 2023. Since mid-2021, following an intensified push by Aave, its market share has consistently hovered within the 50-60% bracket, showcasing remarkable stability and resilience. Compound, despite experiencing a compression in its market share, continues to maintain a robust and stable position.
Source: https://tokenterminal.com/terminal/markets/lending
In stark contrast, the Dex sector has experienced more volatile shifts in market share. Uniswap, following its introduction, swiftly clinched nearly 90% of the trading volume market share. Nonetheless, this dominance was challenged by the rapid ascent of platforms like Sushiswap, Curve, and Pancakeswap. This competition eroded Uniswap’s hold, reducing its share to approximately 37% at one point, although it has since recovered to about 55%. Furthermore, the Dex sector is characterized by a significantly larger number of active projects compared to the lending space.
Source: https://tokenterminal.com/terminal/markets/lending

2. Enhanced Profitability in Lending Projects: As mentioned in the previous section, projects like Aave have managed to generate positive cash flow without resorting to subsidizing borrowing activities, reporting consistent monthly interest-spread revenues in the range of approximately $1.5-2 million. This contrasts sharply with the scenario in most Dex projects. For instance, Uniswap, despite its market prominence, hasn't activated protocol-level fees (only frontend fees are active), and many Dex projects are operating at an effective loss, where the cost of token emissions for liquidity incentives outweighs the revenue generated from protocol fees.
The competitive edges around top lending protocols can be broadly attributed to their brand strength in terms of security, which can be further broken down into two points:
Long History of Secure Operations: Since the DeFi Summer of 2020, there's been a proliferation of fork projects inspired by Aave or Compound across various blockchain networks. However, many of these new entrants have encountered security attacks or substantial bad debt losses soon after their launch. In contrast, Aave and Compound have maintained a clean track record, free from significant attacks or insurmountable bad debt incidents. This history of safe, reliable operation in a real-world network environment serves as a vital assurance of security for depositors. Newer lending protocols, even those offering potentially more innovative features and higher short-term APYs, struggle to win user trust, particularly from large-scale investors (whales), in the absence of a proven, long-term operational record.
Sufficient Security Budgets: The top lending protocols, thanks to their higher commercial revenues and more substantial treasury funds, are able to allocate significant resources for security audits and asset risk control. This is crucial both for the development of new features and the introduction of new assets.
In summary, the lending market has demonstrated organic demand and a sustainable business model, characterized by a relatively concentrated market share. 
Morpho’s Business and Operational Status
The Interest Rate Optimizer
Morpho is a peer-to-peer lending protocol, also known as a lending pool optimizer, built on top of Aave and Compound. It is designed to address the inefficient capital utilization of pool-to-peer lending protocols like Aave, where there is often a mismatch between the total funds deposited and the total funds borrowed.
The value proposition of Morpho is straightforward and impactful: to provide enhanced interest rates for both lenders and borrowers, meaning higher returns on deposits and lower interest rates on loans.
The inefficiency in capital utilization within pool-to-peer lending models, as employed by platforms like Aave and Compound, lies in the inherent mechanism of these models: the total volume of deposited funds (the pool) consistently exceeds the total volume of borrowed funds (the peers). A typical scenario: there is a total of 1 billion $USDT in deposits within the money market, but only 600 million $USDT is lent out.
For depositors, this scenario means that their returns are diluted. Although the idle 400 million USDT is not directly loaned out, it still forms part of the pool that earns interest from the 600 million USDT actively loaned. Consequently, the interest accrued is distributed across the entire 1 billion USDT deposited, leading to lower earnings per depositor. On the flip side, despite only a fraction of the total pool being utilized, borrowers pay interest as if they were borrowing against the entire pool's funds. The result is a higher interest burden on borrowers and diluted returns for depositors, both stemming from the mismatch between deposited and borrowed funds.
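A back-of-the-envelope sketch of this dilution, using the illustrative figures above and an assumed 5% borrow APY (the rate is hypothetical, and real pools also deduct a reserve factor):

```python
# Simplified sketch of the pool-to-peer dilution described above.
# Figures are illustrative: 1B USDT deposited, 600M borrowed, a 5% borrow APY,
# and the pool's reserve factor is ignored.

deposits = 1_000_000_000
borrows = 600_000_000
borrow_apy = 0.05

utilization = borrows / deposits                    # 60% of deposits are at work
interest_paid = borrows * borrow_apy                # interest comes only from borrowers
supply_apy = interest_paid / deposits               # ...but is spread across all deposits

print(f"utilization     : {utilization:.0%}")       # 60%
print(f"borrowers pay   : {borrow_apy:.2%}")        # 5.00%
print(f"depositors earn : {supply_apy:.2%}")        # 3.00%
# The gap between what borrowers pay and what depositors earn is the
# inefficiency a peer-to-peer matching layer such as Morpho tries to capture.
```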
Let’s take the rate optimizer module on top of Aave V2, which currently has the largest volume of Morpho’s deposit business, as an example of how Morpho’s rate optimization service addresses capital inefficiency.
Deposit: Bob opts to deposit 10,000 DAI into Morpho. Morpho then places these funds into the Aave V2 market. At this point, the deposit interest rate provided by the Aave platform for DAI is 3.67%.
Borrowing and Collateral: On the other side, Alice intends to borrow 10,000 DAI. To facilitate this, she deposits 20 ETH as collateral into Morpho. Morpho secures Alice's collateral in the Aave V2 market, mirroring the process it uses for handling deposits.
Loan Matching: Morpho redeems the 10,000 DAI previously deposited by Bob in Aave and lends it directly to Alice. This direct matching means Bob's deposit is fully utilized without any idle funds. For Alice, the benefit is that she is liable for interest only on the actual amount of DAI borrowed, not on the total pool volume. As a result of this direct peer-to-peer (P2P) lending mechanism, both parties enjoy optimized rates: Bob earns a higher deposit APY of 4.46%, compared to the Aave pool APY of 3.67%, while Alice benefits from a lower borrowing APY of 4.46%, as opposed to Aave's pool borrow APY of 6.17%.
Note: In this scenario, the specific peer-to-peer APY of 4.46%, and whether it sits closer to the lower bound (the pool's deposit APY) or the upper bound (the pool's borrow APY), is determined by Morpho's internal parameters, which are set by community governance (a simplified calculation of this matched rate is sketched below).
Solving Liquidity Mismatches: Consider a scenario where Bob, having deposited 10,000 DAI, decides to withdraw his funds. However, at this moment, Alice, who borrowed these funds, hasn't repaid her loan yet. In this case, if no other depositors are available to match, Morpho uses the 20 ETH Alice deposited as collateral to borrow the equivalent amount (principal plus accrued interest, just over 10,000 DAI) from Aave. This allows Bob to successfully withdraw his deposit.
Matching Order: Morpho's matching of funds follows a "larger funds matched first" principle, meaning the platform prioritizes matching larger deposits and borrows before smaller ones. The underlying rationale is to keep the proportion of gas costs to the total matched value low: if the gas cost required to execute a match is disproportionately high relative to the amount being matched, the system will not proceed with the match.

Source: https://aavev2.morpho.org/?network=mainnet
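For illustration, the matched rate can be thought of as a weighted average of the pool's supply and borrow APYs, with the weight set by governance. This is a simplification: Morpho's production rate logic also accounts for reserve factors and unmatched positions, so the sketch below is indicative only.

```python
# Simplified sketch: a matched peer-to-peer rate sitting between the pool's
# supply and borrow APYs, assumed here to be a plain weighted average with a
# governance-set weight. Morpho's production logic also handles reserve
# factors and unmatched ("delta") positions, so treat this as illustrative.

def p2p_apy(pool_supply_apy: float, pool_borrow_apy: float, weight: float) -> float:
    """weight = 0 pins the P2P rate to the supply APY, weight = 1 to the borrow APY."""
    return (1 - weight) * pool_supply_apy + weight * pool_borrow_apy

supply, borrow = 0.0367, 0.0617          # Aave V2 DAI rates from the example above

# The 4.46% figure in the example implies a weight of roughly 0.32:
implied_weight = (0.0446 - supply) / (borrow - supply)
print(f"implied weight ~= {implied_weight:.2f}")                             # ~0.32
print(f"matched APY      = {p2p_apy(supply, borrow, implied_weight):.2%}")   # 4.46%
# Bob earns 4.46% instead of 3.67%; Alice pays 4.46% instead of 6.17%.
```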
The essence of Morpho’s business model is its innovative approach to using the existing capital pools of Aave and Compound as foundational elements. By acting as a rate optimization service, Morpho effectively matches depositors and borrowers.
The ingenuity of this model lies in leveraging the composability of the DeFi ecosystem, showcasing how Morpho has effectively attracted user funds in a way that can be likened to “making something out of nothing.” This approach is particularly appealing to users for several reasons:
At a basic level, users on Morpho are guaranteed to earn returns or incur borrowing costs equivalent to those on Aave and Compound. However, when Morpho successfully matches depositors and borrowers directly, it leads to higher returns for depositors and lower costs for borrowers.
Morpho's product architecture, built atop Aave and Compound, also replicates the risk parameters of these platforms. By allocating funds within Aave and Compound, Morpho inherits not just the operational mechanics but also the brand reputation of these two protocols.
The effectiveness of Morpho's design and its clear value proposition have been reflected in its remarkable growth. Within just over a year of its launch, Morpho has accumulated nearly $1 billion in deposits, ranking just behind giants like Aave and Compound in terms of deposit volume. 
Business Metrics and Tokenomics
Business Metrics
The following chart illustrates Morpho’s total supply volume (blue line), total borrow volume (light brown line), and matched amounts (dark brown). 
Source: https://analytics.morpho.org/

These metrics collectively paint a picture of continuous growth across all of Morpho's business lines. The deposit matching rate stands at 33.4% and the borrow matching rate reaches 63.9%, which are quite impressive figures.
Tokenomics
Source: Morpho

Morpho has a total token supply capped at 1 billion tokens: 51% is allocated to the community, 19% was sold to investors, 24% is held by the founders, the development company Morpho Labs, and the operating entity, the Morpho Association, and the remaining roughly 6% is earmarked for advisors and contributors.
The Morpho token, though already issued and being utilized for governance voting and project incentives, is in a non-transferable state. Consequently, it does not have a secondary market price. This means that token holders can engage in governance decisions but are unable to sell their holdings.  
Unlike some projects that have a predefined structure for token distribution and incentives (like Curve), Morpho’s approach to token incentives is more dynamic and flexible. Incentives are determined on a quarterly or monthly basis, allowing the governance team to adjust the intensity and strategies of these incentives in response to market conditions. 
This pragmatic and flexible method of token incentive distribution might become a more mainstream model in the Web3 business world. 
Morpho’s approach to incentives is comprehensive, targeting both deposit and borrowing behaviors. Over the past year, Morpho has distributed 30.8 million tokens for incentives, which is about 3.08% of the total token supply. This may seem like a modest proportion. Moreover, as indicated in the graph below, Morpho’s official token expenditure shows a trend of decreasing spending on incentives. Interestingly, this reduction has not negatively impacted the growth rate of Morpho’s operations.  

This transition is a positive signal for the platform, indicating that Morpho has found a solid Product-Market Fit (PMF), with user demand becoming increasingly organic. Of the 51% community allocation, roughly 48% of the total supply still remains unspent, giving Morpho a substantial budget and the flexibility to allocate it strategically in future growth phases or new market ventures.
Despite this, Morpho has not yet started charging for its services.  
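A quick back-of-the-envelope check of the incentive-budget figures cited above:

```python
# Back-of-the-envelope check of the incentive budget figures cited above.
total_supply    = 1_000_000_000   # MORPHO total supply
community_share = 0.51            # 51% allocated to the community
spent_tokens    = 30_800_000      # ~3.08% of supply distributed as incentives so far

remaining_share = community_share - spent_tokens / total_supply
print(f"community budget remaining: {remaining_share:.1%} of total supply")   # ~47.9%
```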
Team and Financing
The core team of Morpho originates from France, primarily located in Paris. The key team members have been publicly identified, and the three founders have their roots in the telecommunications and computer industries, boasting rich backgrounds in blockchain entrepreneurship and development.
Morpho has completed two rounds of funding to date: a $1.3 million seed round in October 2021 and an $18 million Series A round in July 2022, led by investors A16z, Nascent, and Variant.
Source: Morpho

If the aforementioned financing amounts correspond to the officially disclosed 19% investor share, it can be inferred that the total valuation of the project stands at around $100 million. 
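Spelling out that inference (a rough approximation, since the seed and Series A rounds were almost certainly priced at different valuations):

```python
# Rough valuation inference from the disclosed figures; the two rounds were
# priced separately, so this is only an order-of-magnitude estimate.
total_raised   = 1_300_000 + 18_000_000      # seed + Series A
investor_share = 0.19                        # disclosed investor allocation
implied_valuation = total_raised / investor_share
print(f"implied valuation ~ ${implied_valuation / 1e6:.0f}M")   # ~$102M, i.e. around $100M
```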
Morpho Blue and Its Potential Influence
What is Morpho Blue?
Put simply, Morpho Blue acts as a permissionless lending protocol layer. This sets it apart from platforms like Aave and Compound, as it opens up a broader spectrum of lending possibilities: anyone can use Morpho Blue to create their own lending market. The dimensions available for builders to choose from include (a hypothetical sketch of such a market follows the list below):
Asset Collateralization Options
Choice of Assets to Lend Out
Oracle Selection for Price Feeds
Determining Loan-to-Value (LTV) and Liquidation Loan-to-Value (LLTV) Ratios
Implementing an Interest Rate Model (IRM)
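To visualize what a permissionless market definition along these dimensions might look like, here is a hypothetical sketch; the field names and example values are illustrative and are not Morpho Blue's actual on-chain interface.

```python
# Hypothetical sketch of a permissionless market definition along the
# dimensions listed above. Field names and values are illustrative only,
# not Morpho Blue's actual interface.
from dataclasses import dataclass

@dataclass(frozen=True)
class MarketParams:
    collateral_asset: str     # which asset borrowers post as collateral
    loan_asset: str           # which asset is lent out
    oracle: str               # price feed chosen by the market creator
    lltv: float               # liquidation loan-to-value threshold
    irm: str                  # interest rate model

# Anyone can spin up a market tuned to their own risk preferences, e.g.:
conservative = MarketParams("wstETH", "USDC", "chainlink:stETH/USD", lltv=0.86, irm="adaptive-curve")
aggressive   = MarketParams("someLongTailToken", "USDC", "custom-twap", lltv=0.94, irm="adaptive-curve")

for m in (conservative, aggressive):
    print(f"{m.collateral_asset:>17} / {m.loan_asset}  LLTV={m.lltv:.0%}  oracle={m.oracle}")
```

The point of the sketch is that market creation, not just lending and borrowing, becomes a user-level action, with risk choices made by whoever deploys the market rather than by a single DAO.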

What value will Morpho Blue bring?
As outlined in its official documentation, the characteristics can be summarized as follows:
Trustless
Morpho Blue is designed to be immutable: once its code is deployed, it cannot be modified, reflecting a commitment to minimal governance. With just 650 lines of Solidity code, Morpho Blue stands out for its simplicity and security.
Efficient
Users have the flexibility to choose higher Loan-to-Value (LTV) ratios and more favorable interest rates. Morpho Blue sidesteps the need to pay fees for third-party audit and risk management services. Employing a singleton smart contract with a simple code structure, Morpho Blue significantly cuts down on gas costs, achieving a reduction of about 70%. 
*Note: A singleton smart contract refers to a protocol using a single contract for execution, rather than a combination of multiple contracts. Uniswap V4 also adopted a singleton contract approach.
Flexible
In Morpho Blue, both market building and risk management aspects, including oracles and lending parameters, are permissionless. This deviates from the uniform model adopted by platforms like Aave and Compound, where the entire platform adheres to a standard set of protocols and rules governed by a DAO. Morpho Blue is also tailored to be developer-friendly, integrating a range of contemporary smart contract patterns. It offers account management features that facilitate gasless interactions and account abstraction. Additionally, the platform provides free flash loans, enabling users to access assets across all its markets in a single call, with the condition that the loan is repaid within the same transaction.
Morpho Blue employs a product philosophy akin to that of Uniswap V4, positioning itself as a foundational layer for a wide array of financial services. This approach involves opening up the modules above this foundational layer, thereby enabling various parties to come in and offer their distinct services. 
Morpho Blue’s approach differs from Aave in a key aspect: while Aave’s lending and borrowing processes are permissionless, the decisions regarding which assets can be borrowed or lent, the nature of risk control rules (whether conservative or aggressive), the selection of oracles, and the setting of interest and liquidation parameters, are all governed and managed by the Aave DAO and the various service providers like Gauntlet and Chaos, who monitor and manage over 600 risk parameters on a daily basis.
Morpho Blue functions more like an open lending operating system. It allows users to construct their own optimal lending combinations on its platform, much like they would on Aave. Additionally, professional risk management firms such as Gauntlet and Chaos have the opportunity to seek partnerships in the market, providing their risk management expertise to others and earning corresponding fees.
From my perspective, the fundamental value proposition of Morpho Blue lies not just in its trustlessness, efficiency, or flexibility, but primarily in its establishment of a free market for lending. This platform enables collaboration among all participants in the lending market, thus offering a more diverse and enriched array of market options for clients at every stage of their lending journey.
Will Morpho Blue Pose a Threat to Aave?
Possibly.
Morpho stands distinct among the myriad competitors that have emerged to challenge Aave, having accrued multiple advantages over the past year:
Morpho, with its managed capital hitting the $1 billion mark, is fast approaching the league of Aave, which oversees a substantial $7 billion. While these funds are presently employed within the Morpho Interest Rate Optimizer, there are many ways to channel them into new features and functionalities.
Morpho, recognized as the fastest-growing lending protocol of the past year and with its token yet to become transferable, presents a realm of possibilities: the introduction of major new features could readily captivate user interest and drive participation.
Morpho boasts a substantial and adaptable token budget, well-equipped to entice early-stage users through attractive subsidy offerings.
Morpho's consistent operational track record, coupled with its significant fund volume, has enabled it to establish a notable degree of security branding. 
This doesn't automatically put Aave on the back foot in the coming competition. First, many users might not have the bandwidth or interest to sift through a plethora of lending choices, and the lending solutions curated under the centralized governance of the Aave DAO might still hold the upper hand, continuing to be the go-to choice for the majority of users.
Second, the Morpho Rate Optimizer largely inherits the security credentials of Aave and Compound, which has gradually put larger funds at ease. Morpho Blue, however, is a distinct and novel product with its own codebase, and whales may initially approach it with caution, taking a beat before diving in with full trust. This wariness is understandable, given the still-lingering sting of security mishaps in earlier permissionless lending markets like Euler, which remain top of mind in the crypto community.
Furthermore, Aave is fully capable of building a set of functions similar to the Morpho Rate Optimizer on top of its existing framework to meet user demand for better capital-matching efficiency, potentially squeezing Morpho out of the P2P lending niche. However, that possibility seems unlikely at the moment, given that this past July Aave issued a grant to NillaConnect, a Morpho-like P2P lending product, rather than building its own.
Lastly, the lending business model adopted by Morpho Blue doesn't markedly diverge from the existing paradigm set by Aave, and Aave is also capable of observing, and potentially replicating, any effective lending mechanisms that Morpho Blue demonstrates.
But in any case, the launch of Morpho Blue introduces an open lending testbed, paving the way for inclusive participation and composable opportunities across the full spectrum of the lending process. Could the newly interconnected lending collectives emerging from Morpho Blue spearhead a solution that stands a chance against Aave?
We will see what happens.
An Overview of Decentralized Reserve Stablecoins: Historical Evolution and Model Analysis

By Lawrence Lee, Researcher at Mint Ventures

At the end of July, the leading figure in decentralized stablecoins, Liquity, announced that its V2 would introduce "Delta Neutral Stablecoins". Similarly, the newly-funded Ethena Finance aims to hedge its reserve assets through risk mitigation strategies, thereby achieving high capital efficiency in a truly decentralized framework. In this article, we will delve deeply into these stablecoin protocols that attempt to solve the stablecoin trilemma.

The Stablecoin Trilemma

In the realm of crypto stablecoins, there exists a trilemma: the incompatibility among price stability, decentralization, and capital efficiency.

Established centralized stablecoins, such as USDT and USDC, have carved a niche by delivering unparalleled price stability within the blockchain framework, boasting up to 100% capital efficiency. However, they come with the inherent risk of centralization, as demonstrated when BUSD had to halt new business due to regulatory constraints and when the March SVB incident rippled through USDC's operations.

The algorithmic stablecoin craze that began in the latter half of 2020 tried to achieve under-collateralization on a decentralized basis. High-profile projects of this wave, such as Empty Set Dollar and Basis Cash, met with a swift downfall. Subsequently, Luna used the credit of an entire blockchain as implicit collateral: it didn't require over-collateralization for users minting UST and managed to merge decentralization, capital efficiency, and price stability for a considerable time (from 2020 to May 2022). Regrettably, even this formidable player wasn't immune to market forces and succumbed to a credit collapse, spiraling into decline. Emerging initiatives like Beanstalk ventured into the realm of under-collateralized tokens, but they failed to capture significant market traction; the difficulty in maintaining price stability has been the Achilles' heel of these tokens.

Another path originated with MakerDAO, which hoped to achieve price stability through the over-collateralization of decentralized assets at the sacrifice of capital efficiency. Currently, LUSD, created by Liquity, is the largest-scale stablecoin fully backed by decentralized assets. Yet, in its quest for LUSD's price stability, Liquity grapples with palpable capital inefficiency: the system's collateral ratio consistently hovers above 250%, meaning every circulating LUSD requires over $2.5 worth of $ETH as collateral. Synthetix's sUSD amplifies this dilemma due to the higher volatility of its collateral, SNX, leading to a usual minimum collateralization ratio exceeding 500%. Low capital efficiency inherently caps the potential growth scale and, invariably, diminishes its attraction for users. Liquity's planned V2 primarily aims to address the low capital efficiency of V1, and Synthetix also plans to diversify its collateral portfolio in its V3 to reduce the minimum collateralization requirement.

Rewinding the clock to 2020 and earlier, DAI also struggled with low capital efficiency. Moreover, due to the smaller market capitalization of the entire crypto market at the time and the substantial volatility of DAI's collateral, ETH, DAI's price was quite unstable. To address this issue, MakerDAO introduced the PSM (Peg Stability Module) in 2020, paving the way for centralized stablecoins like USDC to mint DAI. 
This move represented a compromise in DAI's balance among decentralization, capital efficiency, and price stability, partially sacrificing decentralization. This strategy endowed DAI with a more stable price and higher capital efficiency, enabling it to grow rapidly alongside the overall development of DeFi. Launched at the end of 2020, FRAX similarly uses centralized stablecoins as its primary collateral. Currently, DAI and FRAX are the top two decentralized stablecoins in circulation, a fact that undoubtedly validates their strategies. They provide users with stablecoins that more closely meet their needs, but they also indirectly highlight the constraints that maintaining decentralization places on stablecoin scalability.

However, a distinct breed of stablecoins is emerging, driven by the ambition to meld high capital efficiency with robust price stability while maintaining the principles of decentralization. The key attributes they endeavor to provide users are:

The capacity to mint stablecoins from decentralized assets, such as ETH, ensuring freedom from censorship risks.
A one-to-one generation mechanism with the U.S. dollar, sidestepping the cumbersome over-collateralization hurdles and thereby bolstering scalability.
A relentless commitment to maintaining price stability.

Conceptually, this offers an almost utopian vision of what a decentralized stablecoin should embody. We adopt Liquity V2's terminology for such protocols—Decentralized Reserve Protocols—as the designation for this type of stablecoin. It's important to differentiate these from traditional over-collateralized stablecoins: in the realm of Decentralized Reserve Protocols, once users deposit their assets to mint stablecoins, the protocol assumes ownership of these assets. Consequently, from a user's perspective, it mirrors an $ETH-to-stablecoin swap transaction. This mechanism bears a striking resemblance to the workings of centralized stablecoins, such as USDT: an asset worth $1 can seamlessly be traded for a stablecoin valued equivalently, and the reverse holds true. The distinguishing facet of Decentralized Reserve Protocols is their readiness to embrace crypto assets.

Some might argue that since the collateral no longer belongs to the user, such stablecoins miss out on offering a leverage component, which, to them, is an inherent advantage of certain stablecoin models. However, I believe that as stablecoins function in our daily transactions, they don't necessarily offer or require a leverage facet. Notably, centralized stablecoins like USDT and USDC don't embed this leverage feature. At the crux of any currency—be it fiat or crypto—are three foundational pillars: a medium of settlement, a unit of account, and a store of value. Leverage is merely a specific feature of CDP (Collateralized Debt Position) type stablecoins, not a general use case for stablecoins.

Why, then, have previous stablecoin protocols faltered in delivering this idealized stablecoin model? The primary culprit is the inherent volatility associated with decentralized assets. Faced with these erratic price fluctuations, how can protocols ensure the redemption of the stablecoins they issue while maintaining a 100% collateralization ratio? When viewed from the perspective of a stablecoin protocol's balance sheet, the collateral deposited by users is the asset, while the stablecoins issued by the protocol are liabilities. The critical question is: how can it be ensured that assets will always cover the liabilities? 
Imagine a situation where a user deposits 1 ETH (valued at 2,000 USD) into the protocol. In return, they receive 2,000 stablecoins. Now, the unpredictable nature of the crypto market comes into play, and the value of ETH plummets to 1,000 USD. Here lies the dilemma: how can the protocol guarantee the redemption of the 2,000 stablecoins for assets equivalent to 2,000 USD?

Historically, decentralized reserve protocols have crafted two primary strategies to address this challenge: utilizing governance tokens as reserves, and risk hedging with reserve assets. Based on who performs the hedging, the latter can be further subdivided into protocol-driven hedging and user-driven hedging. Let's delve into each of these approaches.

Decentralized Reserve Protocols: Governance Tokens as Reserves

The first category of protocols operates on the premise of using the protocol's governance tokens as supplementary collateral. When the price of the collateralized assets experiences a sharp decline, the protocol issues additional governance tokens to redeem the stablecoins held by users. This can be referred to as a Decentralized Reserve Protocol with governance tokens serving as reserves. To break this down with our previous example: when $ETH drops from $2,000 to $1,000, these protocols would intervene, supplementing the deficit by minting governance tokens. In essence, if a user seeks to redeem their 2,000 stablecoins, they would receive assets comprising 1,000 USD in ETH and an equivalent value of 1,000 USD in the protocol's governance tokens. Among the projects employing this methodology are Celo and Fei Protocol.

Celo

Since its launch in 2020, Celo has been a noteworthy presence in the stablecoin space. Originally operating as an independent Layer1 blockchain, it undertook a pivotal change in strategy: in a proposal initiated by its core team and passed this July, Celo began its transition to the Ethereum ecosystem, leveraging the capabilities of the OP stack. Here's a deep dive into how Celo's stablecoin mechanics function:

Celo's stablecoin is backed by a reserve pool consisting of a diversified set of underlying assets. The reserve ratio (the value of reserve assets divided by the circulating market cap of stablecoins) consistently stays well above 1, providing fundamental support for the intrinsic value of its stablecoin.
The process of minting Celo's stablecoins doesn't involve over-collateralization. Instead, they are minted by sending $CELO to the official stability module, Mento. Here's how it works: users send 1 USD worth of $CELO to Mento and receive 1 USD worth of stablecoins like $cUSD. The process works seamlessly in reverse as well, allowing users to exchange $cUSD for an equivalent amount in $CELO. Under this mechanism, if the market price of cUSD falls below $1, opportunistic traders will buy $cUSD at this discounted rate and exchange it for $CELO valued at one dollar. Similarly, when the price of $cUSD exceeds 1 USD, traders will opt to mint $cUSD by sending $CELO and then sell it, capturing the profit. This continuous arbitrage mechanism acts as a self-correcting tool, ensuring that the market price of $cUSD rarely strays far from its intended peg.
Celo has meticulously put into place a triad of mechanisms to ensure the continued solvency of its reserve pool: 1. If the reserve ratio slips below a predetermined threshold, newly minted $CELO from block rewards is funneled into the reserve pool to ensure the reserve remains adequately funded.
2. Though presently inactivated, a transaction fee could be levied to supplement the pool. 3. The Mento module charges a stability fee, acting as yet another tool to help the reserve pool maintain its financial health.
To enhance the security of the reserve funds, Celo's asset portfolio is diversified. As of now, the pool incorporates $CELO, $BTC, $ETH, $DAI, and an environmentally-conscious choice, the carbon credit token $cMCO2. Such diversification stands in stark contrast to projects like Terra, where the native token $LUNA singularly shoulders the responsibility as the primary collateral for its stablecoin.

Source: Mint Ventures

It's evident that Celo shares similarities with Luna, operating as a stablecoin-centric Layer1 blockchain. Its minting and redemption mechanisms are also closely aligned with Luna's $UST. The main distinction lies in its approach during potential under-collateralization scenarios: Celo primarily utilizes block-produced $CELO as protocol collateral to ensure the redemption of its stablecoin $cUSD.

Source: https://reserve.mento.org/

With a total reserve holding of $116 million against a $46 million market cap in circulating stablecoins, Celo boasts a substantial over-collateralization ratio of 254%. Despite the system's over-collateralized state, users can effortlessly swap $CELO valued at $1 into one $cUSD stablecoin, demonstrating the excellent capital efficiency of Celo. However, it's worth noting that half of its collateral originates from the centralized stablecoin $USDC and the semi-decentralized $DAI, which means Celo cannot be considered a fully decentralized stablecoin. At present, Celo's stablecoin ranks 16th among decentralized stablecoins in terms of market cap; if we set aside UST and flexUSD (owing to their depegging), it ascends to the 14th position.

Source: https://defillama.com/stablecoins?backing=CRYPTOSTABLES&backing=ALGOSTABLES

Fei

In early 2021, Fei Protocol, which attracted significant market attention due to its trendy algorithmic stablecoin concept at the time, raised $19 million from VCs like A16Z and Coinbase. At its initial launch at the end of March 2021, it drew in 639,000 $ETH for the minting of its stablecoin, $FEI, pushing the circulating volume to a massive 1.3 billion $FEI. In a short span, $FEI's market cap sprinted to second place among decentralized stablecoins, behind $DAI, which led with a $3 billion market cap. However, the demand for FEI was overly satisfied during the genesis phase, driven primarily by the allure of Fei Protocol's governance token, $TRIBE. The resultant $FEI oversupply, combined with its nascent stage and lack of robust use cases, resulted in its price remaining below $1 for an extended period. This was soon followed by the market volatility of May, where panic due to plummeting prices led users to redeem their $FEI, leaving the protocol struggling ever since its launch.

To get back on track, Fei Protocol unveiled its V2 version at the end of 2021. The refreshed blueprint brought nuanced changes, such as a refined price stability mechanism. This revamped model allowed users to mint $FEI by depositing collateral like $ETH, $DAI, and $LUSD, maintaining a 100% collateralization ratio. Post-minting, these assets merged into the Protocol Controlled Value (PCV). 
When the collateralization ratio (PCV / total circulating $FEI) is above 100%, indicating healthy appreciation of protocol assets and stress-free $FEI redemption, the protocol mints some $FEI to purchase $TRIBE, thereby reducing the collateralization ratio. Conversely, when the ratio falls below 100%, signaling potential default on $FEI redemption, the protocol mints $TRIBE to buy back $FEI, thereby increasing the collateralization ratio. Under this mechanism, the governance token $TRIBE acts as a reserve against potential risks while also reaping additional rewards during system growth (a mechanism somewhat similar to the Float Protocol, which launched alongside Fei V1). Regrettably, the launch of Fei V2 coincided with the peak of the bull market. Subsequently, the price of ETH began to fall, and Fei Protocol unfortunately suffered a hack in April 2022, losing 80 million $FEI. Ultimately, the decision was made to cease protocol development in August 2022.

Decentralized reserve protocols that utilize governance tokens as reserves essentially dilute the stake of all governance token holders to ensure stablecoin redemption. During a bull market, as the market cap increases, the value of the governance token tends to rise as well, easily creating a flywheel effect. However, in a bear market, as the value of the collateral assets decreases, the market cap of the governance token also falls. Excessive minting can then exacerbate the price drop, leading to a "death spiral" of the governance token. If the governance token's market cap falls below that of the total circulating stablecoins, holders' doubts about the protocol's capacity to redeem the stablecoin will increase, accelerating the exodus and causing a systemic collapse. Surviving the bear market, in particular, becomes pivotal for these stablecoins. Celo's successful survival amid the bear market is closely tied to its over-collateralization strategy: during the previous market high, Celo allocated a relatively larger portion of its reserves to assets like $USDC, $DAI, $BTC, and $ETH, enabling the protocol to remain secure even as the price of $CELO fell from $10 to $0.5.

Decentralized Reserve Protocols: Embracing Risk Hedging for Stability

This type of protocol is also called a "risk-neutral stablecoin protocol", as it involves hedging the risk of the protocol's reserve assets (usually in the form of cryptocurrency). When the price of the collateral assets plummets drastically, the hedge generates profits, ensuring that the stablecoin protocol's assets can always repay its liabilities. If you were to deposit 1 $ETH valued at $2000 into one of these protocols, it would, in anticipation of potential market volatility, hedge this asset, possibly by initiating a short position on a crypto exchange. When $ETH drops from $2000 to $1000, the profit ($1000) generated by the hedge compensates for the dip in value, ensuring that the protocol can still redeem 2000 stablecoins for the user. We refer to these as decentralized reserve protocols with reserve-asset risk hedging, or risk-neutral stablecoin protocols. Depending on who actually performs the hedging, they are further divided into decentralized reserve protocols where the protocol itself hedges the risk, and those where the users hedge the risk.

Decentralized Reserve Protocols with Protocol-Driven Risk Hedging

Stablecoin protocols adopting this approach include Pika Protocol V1, UXD Protocol, and Ethena, which recently announced its financing. 
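Before turning to the individual protocols, here is a minimal sketch of the hedge described above, using the payoff of an inverse (coin-margined) perpetual of the kind discussed in the Pika V1 section below; funding payments and trading fees are ignored.

```python
# Minimal sketch of the reserve hedge described above, using the payoff of an
# inverse (coin-margined) perpetual. Funding payments and trading fees ignored.

def short_inverse_pnl_in_coin(contracts: float, entry_price: float, exit_price: float) -> float:
    """PnL of a short inverse perpetual, denominated in the coin itself.
    Each contract is worth $1, so 'contracts' equals the USD size of the short."""
    return contracts * (1.0 / exit_price - 1.0 / entry_price)

# A user swaps 1 ETH (worth $2,000) for 2,000 stablecoins; the protocol shorts
# $2,000 of ETH inverse perpetuals against that 1 ETH of collateral.
collateral_eth, liabilities_usd = 1.0, 2_000
entry_price, crash_price = 2_000, 1_000

pnl_eth = short_inverse_pnl_in_coin(contracts=2_000, entry_price=entry_price, exit_price=crash_price)
reserves_usd = (collateral_eth + pnl_eth) * crash_price

print(f"hedge PnL: {pnl_eth:+.2f} ETH, reserves after the crash: ${reserves_usd:,.0f}")
print(f"liabilities ({liabilities_usd} stablecoins) still covered: {reserves_usd >= liabilities_usd}")
# hedge PnL: +1.00 ETH, reserves after the crash: $2,000 -> redemptions remain covered.
```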
Pika V1

Pika Protocol is a derivatives protocol deployed on the Optimism network, but in its initial V1 version, Pika had planned to launch a stablecoin, with hedging achieved through Bitmex's Inverse Perpetual. An original creation of Bitmex, the inverse perpetual stands apart from the typical linear perpetual saturating the market. Unlike the popular "linear perpetual", which employs USD as the basis for tracking crypto values, the inverse perpetual has a distinct architecture: it harnesses the token itself as margin to trace its USD-equivalent price. Here's an example to illustrate the earnings from an inverse perpetual:

A trader goes long 50,000 contracts of XBTUSD at a price of 10,000. A few days later the price of the contract increases to 11,000. The trader's profit will be: 50,000 × 1 × (1/10,000 – 1/11,000) = 0.4545 XBT. If the price had in fact dropped to 9,000, the trader's loss would have been: 50,000 × 1 × (1/10,000 – 1/9,000) = -0.5556 XBT. The loss is greater because of the inverse and non-linear nature of the contract. Conversely, if the trader was short, then the trader's profit would be greater if the price moved down than the loss if it moved up.

Source: https://www.bitmex.com/app/inversePerpetualsGuide

Upon closer analysis, it's evident that the inverse perpetual is a perfect match for decentralized reserve protocols aiming to hedge the risk of their reserve assets. Taking our previous example, let's assume that when ETH is priced at $2,000, Pika Protocol, on receiving 1 ETH from a user, deploys it as collateral and shorts 2,000 contracts of the $ETH inverse perpetual on Bitmex. Should the market turn and the ETH price slide to $1,000, then Pika Protocol's profit is: 2,000 × 1 × (1/1,000 – 1/2,000) = 1 ETH = $1,000. This means that when the $ETH price falls from $2,000 to $1,000, Pika Protocol's reserves change from 1 $ETH to 2 $ETH, still effectively covering the redemption of the 2,000 stablecoins held by users (transaction fees and funding fees are not considered here). Pika Protocol V1's product design is entirely in line with the $NUSD product design mentioned by Bitmex founder Arthur Hayes in his blog post: a coin-margined short can perpetually and perfectly hedge a long position in the same coin.

Regrettably, characterized by their counterintuitive and non-linear return properties (the relationship between token price and contract price is not linear), inverse perpetuals are not easily understood by average investors accustomed to USD-margined contracts. Fast forward to the present, and it's clear that inverse perpetuals, or coin-margined perpetual contracts, have struggled to find their footing in a market dominated by their more accessible counterparts, the linear or USD-margined perpetual contracts. A snapshot of major exchanges reveals a stark contrast: inverse perpetuals account for a mere 20-25% of trading volume when stacked against linear perpetuals. Affected by regulations, BitMex has gradually receded from being a top-tier perpetual exchange to one with less than 0.5% of the current perpetual market share. Recognizing the limitations of linear perpetual contracts for its hedging strategy and the dwindling market appetite for inverse perpetuals, Pika embarked on a strategic pivot: in its V2 version, Pika abandoned the stablecoin business and officially shifted to a derivatives protocol.

UXD

UXD Protocol operates on the Solana network and was launched in January 2022. In 2021, UXD completed a $3 million funding round led by Multicoin and raised $57 million during its IDO. 
In January 2023, UXD decided to expand its operations to the Ethereum ecosystem, launching on Arbitrum in April, with plans to debut on Optimism in the near future. Upon its initial launch, UXD Protocol allowed users to deposit $SOL, $BTC, and $ETH to mint its stablecoin, UXD, on a 1:1 value basis pegged to the USD. To maintain the integrity and redemption of UXD, the deposited collaterals were hedged using short positions through Solana’s renowned lending and perpetual platform, Mango Markets. The funding fees collected from the short positions would serve as protocol revenue, while funds raised by the protocol would cover any payable funding fee costs. For an extended period post-launch, the UXD Protocol functioned smoothly. The protocol even had to impose a cap on UXD issuance due to Mango Markets’ overall open interest being under the $100 million mark. A surge in UXD’s short positions, especially if they escalated into the tens of millions, could pose tangible redemption threats. Additionally, an excess of short positions could tilt the funding rate towards becoming negative, subsequently elevating hedging costs. Unfortunately, in October 2022, Mango Markets suffered from a governance attack, leading to a nearly $20 million loss for UXD Protocol. At the time, UXD’s insurance fund had a balance of over $55 million, allowing for normal redemptions for UXD. Although Mango Markets later returned the funds to UXD Protocol, the damage was done, and Mango Markets struggled to recover. The situation was exacerbated by the FTX crisis, leading to a rapid capital outflow from Solana. As a result, UXD couldn’t find suitable exchanges to hedge their long positions. The protocol limited its collateral support to just $USDC, a stablecoin that inherently did not necessitate hedging. They channeled deposited $USDC collateral into various on-chain $USDC vaults and Real-World Assets (RWA). Recognizing the need to diversify and mitigate risks, they made the strategic decision to foray into the Ethereum ecosystem. Following a successful launch on Arbitrum in April, they have plans to deploy on Optimism. Amidst this expansion, UXD remains on the hunt for optimal on-chain hedging platforms to fortify its operations. Currently, UXD has a circulating supply of $14.3 million, with a protocol insurance fund balance of $53.2 million. Source: https://dashboard.uxd.fi/ Ethena Finance, a stablecoin protocol that recently announced  $6 million in funding led by Dragonfly, with investments from centralized exchanges such as Bybit, OKX, Deribit, Gemini, and Huobi, also employs hedging strategies to hedge its reserve assets. Given the backing from numerous second-tier derivative exchanges, this will likely benefit their collateral hedging efforts. Furthermore, Ethena has set its sights on collaborating with Synthetix, a renowned decentralized derivatives protocol. This partnership aims to initiate short positions utilizing Synthetix as a liquidity provider. Moreover, it will diversify the utility of its stablecoin, USDe, by enabling it to serve as collateral within select pools. The merits of such decentralized reserve protocols with risk-hedging strategies are evident. By hedging the crypto assets that serve as collateral, the protocol can achieve a risk-neutral position, ensuring stablecoin redemptions. This approach melds the promise of 100% capital efficiency with the trustless nature of decentralization—a match largely dependent on the quality of hedging venues. 
Furthermore, if the protocol can hedge its position with high capital efficiency, the reserved collateral can be channeled into multiple avenues to yield returns. Additionally, funding fees can serve as protocol revenue. This flexibility allows for multiple possibilities: the generated returns could be distributed to stablecoin holders, creating an interest-bearing stablecoin and enhancing its utility. Alternatively, the profits could also be distributed to governance token holders. The governance tokens within a stablecoin protocol inherently act as the ultimate lender for their associated stablecoin. In situations where reserve asset risks are hedged, these stablecoin protocols can leverage their governance tokens as a redemption mechanism during extreme circumstances. For stablecoin holders, this dynamic offers an extra protective layer when compared to stablecoins solely backed by governance tokens. Conceptually, the act of hedging reserve assets is streamlined. The underlying principle is that it should remain resilient across market cycles. Consequently, this mitigates the need to verify the robustness of the governance token during downtrends.  But as with all innovations, there are hurdles on the horizon. Here’s a breakdown: The Centralization Paradox: The current financial landscape sees centralized exchanges reigning supreme in perpetual contract liquidity. Furthermore, most decentralized derivatives exchanges are not designed for stablecoin protocols to do hedging, leading to an inevitable centralization risk. This risk poses a twofold problem: Centralized exchanges, by their very nature, come with inherent risks. With limited hedging venues, any single venue becomes crucial to a protocol’s health. A hiccup in one venue can have a substantial impact on the protocol. The UXD Protocol’s halt in operations due to the attack on Mango Markets is an extreme example of such centralization risks.Limitations in Choosing Hedging Tools. The current mainstream linear perpetual contracts cannot perfectly hedge their long positions. Taking $ETH as an example, a stablecoin protocol would ideally seek a hedge using ETH-denominated short positions with ETH as collateral. However, prominent linear perpetual contracts lean on $USDT as collateral, mapping their profit curves to USD, which cannot perfectly hedge against the $ETH position. Even if the stablecoin protocol tries to borrow against $ETH for $USDT, it increases operational costs and complexities in position risk management, while also reducing capital efficiency. Pika Protocol’s experience underscores how inverse perpetual contracts might be the holy grail for these protocols. Sadly, their market footprint remains undersized.Scale Growth Has Inherent Limitations. With the growth of a protocol’s stablecoin market cap, comes an amplified demand for perpetual contract short positions for hedging purposes. Beyond the complexities of obtaining a sufficient number of short positions, the more short positions the protocol holds, the higher the liquidity requirement from the counterparty when closing positions. This can lead to potentially negative funding rates, implying potentially higher hedging costs and operational complexities. For a stablecoin with a scale of tens of millions of dollars, this might not pose significant challenges. However, if one aims to scale further, reaching hundreds of millions or even billions, this issue can significantly cap its growth potential.Operational Risks. 
Regardless of the hedging mechanism employed, there’s inevitably a high frequency of opening positions, adjusting portfolios, and managing collateral. These processes require manual intervention, leading to significant operational and even moral risks. Decentralized Reserve Stablecoin Protocols with User-Driven Risk Hedging Protocols that have adopted this approach include Angle Protocol V1 and Liquity V2. Angle V1 Angle Protocol was launched on the Ethereum network in November 2021. Before its launch, they closed a $5 million financing round led by a16z. While the details of the Angle Protocol V1 are extensively discussed in a report by Mint Ventures, here we provide a brief overview: Like other decentralized reserve stablecoin protocols, under ideal conditions, Angle allows users to mint its stablecoin, agUSD, by depositing 1$ worth of collateral in the form of $ETH. What differentiates Angle is its target audience. Beyond the archetypal stablecoin users, Angle Protocol has a dedicated offering for perpetual contract traders. Within the ecosystem of Angle, these traders are distinctly termed as the Hedging Agency (HA). Note: Angle’s first stablecoin was pegged to the Euro, termed agEUR, the logic remains the same. For the sake of consistency in this context, we’ll use the USD-pegged stablecoin as an example.  Using the same example, suppose ETH is currently valued at $2000. A user deposits 1 $ETH to Angle, consequently minting 2000 USD-pegged stablecoins. Concurrently, Angle sanctions a leveraged position equivalent to 1 $ETH for the benefit of traders. Under the assumption that the Hedging Agency (HA) utilizes 0.2 $ETH (worth $400) as collateral, and opens a position with 5x leverage, the protocol’s total collateral amounts to 1.2 $ETH. In monetary terms, this is a sum of $2400 stacked against a  liability of $2000 in stablecoins. When the price of $ETH rises to $2200, the protocol only needs to retain enough $ETH to back the $2000 stablecoins, which is approximately 0.909 $ETH. The remaining 0.291 $ETH (valued at $640) can be withdrawn by the HA. Conversely, if $ETH drops to $1800, the protocol still has to maintain enough $ETH to back the $2000 stablecoins, which would be around 1.111 $ETH. The HA’s collateral position would then reduce to about 0.089 $ETH (worth $160). For traders maneuvering within the Angle Protocol, their engagement is fundamentally analogous to taking a long position on $ETH-margined perpetual. When the price of $ETH rises, they not only benefit from the appreciation but also from the surplus $ETH in the protocol (in the above example, a 10% surge in the price of $ETH resulted in a 60% gain for the trader). Yet, the inverse is just as pronounced. A deceleration in ETH’s price by 10% triggers a sharp 60% erosion in the trader’s position. From the point of the Angle Protocol, these traders serve as an insulating barrier against the volatility or depreciation of the collateral. This very function earns them the title of “Hedging Agency”. The leverage level for traders is determined by the ratio of the hedging position available in the protocol (0.2 ETH in this example) to the protocol’s stablecoin position (1 ETH in this example). 
For perpetual contract traders, Angle offers the following advantages:  No Funding Fees: Traditional centralized platforms require long-position holders to pay funding fees to their counterparts holding short positions.Slippage-Free Oracle Price: Angle attempted to create a win-win scenario for both stablecoin holders and perpetual contract traders where stablecoin holders benefit from high capital efficiency and decentralization, while contract traders enjoy an enhanced trading experience.  Actually, there might be instances where there are no traders to open long positions. In such cases, Angle introduces the Standard Liquidity Provider (SLP) to offer additional collateral (stablecoins) to ensure the protocol’s security, while simultaneously earning yields, transaction fees, and governance token($ANGLE) rewards. However, Angle’s real-world performance hasn’t been optimal. Even though traders receive a large amount of $ANGLE as a reward, the protocol’s collateral isn’t fully hedged most of the time. The core issue is that Angle hasn’t offered a product compelling enough for traders. As the price of the $ANGLE token declined, the TVL (Total Value Locked) of Angle Protocol also plummeted from its initial $250 million to around $50 million. Hedging Ratio for the USDC Pool, the Primary Collateral for Angle StablecoinsSource: https://analytics.angle.money/core/EUR/USDC Source: https://defillama.com/protocol/angle In March 2023, the interest-bearing reserve assets of Angle unfortunately fell victim to a hacker attack involving Euler. Although the hacker eventually returned the stolen assets, the incident severely impacted Angle’s momentum. By May 2023, Angle announced the discontinuation of Angle Protocol V1, and introduced plans for V2. Angle Protocol V2 shifted to a traditional over-collateralized model and was launched in early August.  Liquity V2 Since its launch in March 2021, LUSD issued by Liquity, has become the third-largest decentralized stablecoin in the market (following DAI and FRAX) and is the largest fully-decentralized stablecoin. For a deeper dive into Liquity, you can read our comprehensive reports released in July 2021 and Apr 2023 shed light on the mechanics of Liquity V1, along with its subsequent product updates and case expansions. The Liquity team stands proud of LUSD’s accomplishments in achieving both decentralization and price stability. Yet, when analyzing the metrics of capital efficiency, Liquity’s performance has been somewhat mediocre. Historical data indicates that, since its inception,Liquity’s collateralization ratio has consistently hovered around 250%, implying that for every circulating LUSD, there’s an ETH collateral worth $2.5 backing it. Source: https://dune.com/liquity/liquity On July 28th, Liquity officially unveiled the features of its V2, with the heart of this upgrade being the inclusion of LSD as an eligible collateral type. However, the standout feature remains its ambitious claim of achieving high capital efficiency via delta-neutral hedging across the entire protocol. Currently, Liquity has not publicly released detailed product documentation. The available information on V2 primarily originates from founder Robert Lauko’s talk at ETHCC, previous introductory articles released by Liquity, and discussions on Discord. Our subsequent summary is primarily based on these sources. In terms of product logic, mirroring the structure of Angle V1, Liquity’s V2 has a foundational premise to onboard traders to execute leveraged operations on its platform. 
These traders’ margins then function as supplementary collateral for the protocol, thereby strategically hedging the protocol’s overarching risk spectrum. For the traders, Liquity offers an attractive trading product. Diving into the specifics, Liquity has rolled out two groundbreaking novelties. The first, aptly termed as “Principal Protected Leverage Position”. This offering allows traders to access a unique leveraged trading paradigm where their principal stands safeguarded. Upon remitting a designated premium, users can unlock this utility, which provides an insurance net against significant ETH price downturns, ensuring they still pocket a predetermined amount of USD. Illustrating this with a scenario laid out in Liquity’s blog: When ETH stands at a valuation of $1000, an investor parting with 12ETH (split into 10 ETH for principal and a 2 ETH premium) can leverage a 2x position on the initial 10 ETH, coupled with downside fortification. Translated, if the ETH price sees a twofold surge, the 2x leveraged position becomes active, resulting in the user receiving a total of 40 ETH. However, if the ETH price goes down, the purchased put option becomes active, allowing the user to withdraw their initial $10000 (10 * 1000). Source: https://www.liquity.org/blog/introducing-liquity-v2 Liquity’s product innovation clearly seeks to iterate and expand on the foundational concepts introduced by Angle Protocol, with the spotlight being on the “principal protection”. While Liquity remains tight-lipped about the detailed mechanisms, piecing together insights from their design drafts and lively chatter on Discord,  this “principal protection” seems very much akin to a call option. Liquity believes that this combo product, due to its capacity to protect the principal, will be highly attractive to traders. This call option is crafted to empower traders with the dual advantage of leveraged returns when prices rise and ensuring their principal when prices fall. Through the trader’s lens, this presents a potentially more enticing proposition compared to Angle’s unembellished leveraged trading pitch – though the allure’s potency would be contingent on how Liquity calibrates its premium pricing. From the protocol’s perspective, the premium paid by users can serve as a safety buffer. When the ETH price falls, Liquity can use this premium as supplementary collateral to repay stablecoin holders; when the price rises, the appreciated part of Liquity’s collateral can be distributed to contract traders as profit. An evident challenge lurks within Liquity’s innovative design. Zooming in, when traders look to liquidate their positions prematurely and extract their ETH, Liquity faces a dilemma. While granting traders the autonomy to bow out as they wish, any such withdrawal concurrently slashes the protocol’s hedged proportion, introducing potential fragility as chunks of “collateral” are pulled out. In fact, a similar problem arose in the practical operation of Angle Protocol, where its system’s hedging rate remained low for extended periods, indicating that traders weren’t sufficiently hedging the overall protocol position. To tackle this challenge, Liquity introduced the second innovation: its secondary marketplace to subsidize positions. This means that within Liquity V2, leveraged trading positions (tokenized as NFTs) can not only be opened and closed like regular leveraged positions but can also be sold on a secondary market. 
In reality, Liquity’s primary concern is traders liquidating their positions, as this could reduce the protocol’s hedging ratio. When a trader wants to close a position, if another trader is willing to purchase it on the secondary market at a price higher than its inherent value, the original trader benefits from receiving more cash, which results in a win-win. For Liquity, even though this “inherent value” is subsidized by the protocol, a relatively small subsidy can maintain the overall system’s hedging ratio. As a result, at a relatively low cost, the protocol’s security is enhanced. Source: https://www.liquity.org/blog/introducing-liquity-v2 Imagine a scenario where Alice opens a position with 10 ETH, at an ETH market price of $1000, and she pays a premium of 2ETH. This position represents a leveraged long of 10 ETH with principal protection. However, if the price of ETH then falls to $800, the $12,000 worth of ETH that Alice initially invested can now only be swapped for 10 ETH (valued at $8000). At this juncture, Alice can either directly liquidate her position to get the 10 ETH ($8000) or sell this position on the secondary market for a price between $8000 and $12000. From Bob’s perspective, a potential buyer, purchasing Alice’s position would be akin to buying ETH at $800 and obtaining a call option with a strike price of $1000. This option certainly has value, ensuring Alice can get a price higher than $8000 for her position. For Liquity, if Bob acquires Alice’s position, the protocol’s collateralization remains unchanged since the premium is still retained within the protocol’s pool. If no one, like Bob, steps up to buy Alice’s position promptly, Liquity will gradually increase the value of Alice’s position over time (though the exact method isn’t specified, mechanisms like lowering the strike price or increasing the quantity of the call option could enhance the position’s value). The subsidized amount would be sourced from the protocol’s premium pool. However, this could slightly decrease Liquity’s overall over-collateralization. Liquity believes that not all positions would require protocol subsidization, and even if they do, it might not entail subsidizing a significant portion of the position. Thus, subsidizing the secondary market can effectively maintain the protocol’s hedging ratio. Lastly, despite these innovations, it might still be challenging to entirely address the liquidity shortfall in extreme scenarios. Liquity would also adopt a mechanism similar to Angle’s standard liquidity provider system as a final supplement. This could possibly involve the protocol allowing users to deposit some V1 LUSD into a stability pool, which would act as a backup to support V2 LUSD redemptions when the crypto waters get choppy. Liquity V2 is planned to launch in the second quarter of 2024. Overall, Liquity V2 shares many similarities with Angle V1 but has also introduced targeted improvements in response to the challenges faced by Angle. They’ve innovated with the “principal protection” feature, offering a more appealing product for traders. They’ve also introduced a “protocol-backed subsidized secondary market” to protect the protocol’s overall hedging ratio. However, at its core, Liquity V2 remains akin to the Angle Protocol. It represents an attempt to branch out and craft an innovative derivatives product that, in turn, supports their stablecoin operation. 
While Liquity’s proficiency in the stablecoin domain is well-established, it remains to be seen whether they can successfully design a top-tier derivative, achieve Product-Market Fit (PMF), and promote it effectively. Conclusion The prospect of a decentralized reserve protocol that achieves decentralization, high capital efficiency, and price stability is certainly exhilarating. However, a sophisticated and well-thought-out mechanism design is just the first step for a stablecoin protocol. What’s even more crucial is the expansion of stablecoin use cases. At present, the stride of decentralized stablecoins in proliferating use cases has been somewhat languid, with most pigeonholed into a singular role: indispensable tools for liquidity mining. But the lure of mining incentives isn’t eternal. In a provocative turn of events, PayPal’s foray with PYUSD has sounded the clarion call for crypto stablecoin protocols. It indicates that well-known entities within the Web2 domain are venturing into the stablecoin arena, suggesting that the window of opportunity for current stablecoins might be narrowing. Indeed, when we discuss the centralized risks of custodial stablecoins, our concerns largely revolve around the potential unreliability of custodians and issuing entities. For instance, Silicon Valley Bank, ranks 16th in the U.S., whereas both Tether and Circle, albeit influential, remain quintessentially ‘crypto-native’. If a “too big to fail ” entity from the traditional financial world, such as JP Morgan, were to issue a stablecoin, the inherent national credit backing it could eclipse competitors like Tether and Circle instantly. Furthermore, it introduces a quandary: if centralized entities offer robust stability, does the call for decentralization become muted? With that said, we hope to see decentralized stablecoins diversify their application spectrum, cementing themselves as the Schelling point in the ever-evolving stablecoin narrative. Realizing this, however, is a difficult task.

An Overview of Decentralized Reserve Stablecoins: Historical Evolution and Model Analysis

By Lawrence Lee, Researcher at Mint Ventures
At the end of July, Liquity, a leading player in decentralized stablecoins, announced that its V2 would introduce “Delta Neutral Stablecoins”. Similarly, the newly funded Ethena Finance aims to hedge the risk of its reserve assets, thereby achieving high capital efficiency within a truly decentralized framework. In this article, we will delve into the stablecoin protocols that attempt to solve the stablecoin trilemma.
The Stablecoin Trilemma

In the realm of crypto stablecoins, there exists a trilemma: the incompatibility among price stability, decentralization, and capital efficiency.
Established centralized stablecoins, such as USDT and USDC, have carved a niche by delivering unparalleled price stability within the blockchain framework, boasting up to 100% capital efficiency. However, they come with the inherent risk of centralization. This was demonstrated when BUSD had to halt new issuance due to regulatory constraints, and when the March SVB incident rippled through to USDC, briefly knocking it off its peg.
The algorithmic stablecoin craze that began in the latter half of 2020 tried to achieve price stability on a decentralized basis without full collateral. High-profile projects of this wave, such as Empty Set Dollar and Basis Cash, met with a swift downfall. Subsequently, Luna used the credit of an entire blockchain as implicit collateral: it did not require over-collateralization for users minting UST and managed to combine decentralization, capital efficiency, and price stability for a considerable time (from 2020 to May 2022). Regrettably, even this formidable player wasn't immune to market forces and succumbed to a credit collapse, spiraling into decline. Later initiatives like Beanstalk also ventured into under-collateralized stablecoins but failed to capture significant market traction. The difficulty of maintaining price stability has been the Achilles' heel of this category.
Another path originated with MakerDAO, which hoped to achieve price stability through the over-collateralization of decentralized assets at the sacrifice of capital efficiency. Currently, LUSD, created by Liquity, is the largest-scale stablecoin fully backed by decentralized assets. Yet, in its quest for LUSD’s price stability, Liquity grapples with palpable capital inefficiency. The system’s collateral ratio consistently hovers above 250%, meaning every circulating LUSD requires $ETH worth over $2.5 as the collateral. Synthetix’s sUSD amplifies this dilemma due to the higher volatility of its collateral, SNX, leading to a usual minimum collateralization ratio exceeding 500%. Low capital efficiency inherently caps the potential growth scale and, invariably, diminishes its attraction for users. Liquity’s planned V2 primarily aims to address the low capital efficiency of V1, and Synthetix also plans to diversify its collateral portfolio in its V3 to reduce the minimum collateralization requirement.
Rewinding the clock to 2020 and earlier, DAI also struggled with low capital efficiency. Moreover, due to the smaller market capitalization of the entire crypto market at the time and the substantial volatility of DAI’s collateral, ETH, DAI’s price was quite unstable. To address this issue, MakerDAO introduced the PSM (Peg Stability Module) in 2020, paving the way for centralized stablecoins like USDC to mint DAI. This move represented a compromise in DAI’s balance among decentralization, capital efficiency, and price stability, partially sacrificing decentralization. This strategy endowed DAI with a more stable price and higher capital efficiency, enabling it to grow rapidly alongside the overall development of DeFi. Launched at the end of 2020, FRAX similarly uses centralized stablecoins as its primary collateral. Currently, DAI and FRAX are the top two decentralized stablecoins in circulation, a fact that undoubtedly validates their strategies. They provide users with stablecoins that more closely meet their needs but also indirectly highlight the constraints that maintaining decentralization places on stablecoin scalability.
However, a distinct breed of stablecoins is emerging, driven by the ambition to meld high capital efficiency with robust price stability while maintaining the principles of decentralization. The key attributes they endeavor to provide users are:
- The capacity to mint stablecoins from decentralized assets such as ETH, ensuring freedom from censorship risk.
- A one-to-one minting mechanism against the U.S. dollar, sidestepping cumbersome over-collateralization hurdles and thereby bolstering scalability.
- A relentless commitment to maintaining price stability.
Conceptually, this offers an almost utopian vision of what a decentralized stablecoin should embody. We adopt Liquity V2's terminology for such protocols—Decentralized Reserve Protocols—as the designation for this type of stablecoin. It's important to differentiate these from traditional over-collateralized stablecoins. In a Decentralized Reserve Protocol, once users deposit their assets to mint stablecoins, the protocol assumes ownership of those assets. Consequently, from a user's perspective, minting mirrors an $ETH-to-stablecoin swap. This mechanism bears a striking resemblance to the workings of centralized stablecoins such as USDT: an asset worth $1 can seamlessly be traded for a stablecoin of equivalent value, and the reverse holds true. The distinguishing facet of Decentralized Reserve Protocols is their readiness to accept crypto assets as reserves.
Some might argue that since the collateral no longer belongs to the user, such stablecoins miss out on offering a leverage component, which, to them, is an inherent advantage of certain stablecoin models. However, I believe that stablecoins, as they function in daily transactions, neither need to offer nor require a leverage facet. Notably, centralized stablecoins like USDT and USDC don't embed this leverage feature either. At the crux of any currency—be it fiat or crypto—are three foundational pillars: a medium of exchange, a unit of account, and a store of value. Leverage is merely a specific feature of CDP (Collateralized Debt Position) stablecoins, not a general use case for stablecoins.
Why have previous stablecoin protocols faltered in delivering this idealized stablecoin model? The primary culprit is the inherent volatility of decentralized assets. Faced with these erratic price fluctuations, how can a protocol ensure the redemption of the stablecoins it issues while maintaining only a 100% collateralization ratio?
When viewed from the perspective of a stablecoin protocol’s balance sheet, the collateral deposited by users is the asset, while the stablecoins issued by the protocol are liabilities. The critical question is: how can it be ensured that assets will always cover the liabilities?
Imagine a situation where a user deposits 1 ETH (valued at 2,000 USD) into the protocol. In return, they receive 2,000 stablecoins. Now, the unpredictable nature of the crypto market comes into play, and the value of ETH plummets to 1,000 USD. Here lies the dilemma: How can the protocol guarantee the redemption of the 2,000 stablecoins for assets equivalent to 2,000 USD?
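To make this balance-sheet framing concrete, here is a minimal solvency check, written as an illustration only (the function and variable names are ours, not any protocol's actual accounting):

```python
def is_solvent(reserve_eth: float, eth_price_usd: float, stablecoin_supply: float) -> bool:
    """Return True if the protocol's assets still cover its stablecoin liabilities."""
    assets_usd = reserve_eth * eth_price_usd       # asset side: the ETH reserve marked to market
    liabilities_usd = stablecoin_supply * 1.0      # liability side: each stablecoin redeemable for $1
    return assets_usd >= liabilities_usd

# A user deposits 1 ETH at $2,000 and mints 2,000 stablecoins.
print(is_solvent(reserve_eth=1.0, eth_price_usd=2000, stablecoin_supply=2000))  # True
# ETH then falls to $1,000: an unhedged reserve no longer covers redemptions.
print(is_solvent(reserve_eth=1.0, eth_price_usd=1000, stablecoin_supply=2000))  # False
```

The two strategies described next are, in essence, two different ways of keeping this check true after the second scenario.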
Historically, decentralized reserve protocols have crafted two primary strategies to address this challenge: utilizing governance tokens as reserves and risk hedging with reserve assets. Based on the method of risk hedging, there are further subdivisions: protocol-driven hedging and user-driven hedging. Let’s delve into each of these approaches.

Decentralized Reserve Protocols: Governance Tokens as Reserves
The first category of protocols operates on the premise of using the protocol's governance tokens as supplementary collateral. When the price of the collateral assets declines sharply, the protocol issues additional governance tokens to redeem the stablecoins held by users. This can be referred to as a Decentralized Reserve Protocol with governance tokens serving as reserves. To break this down with our previous example: when $ETH drops from $2,000 to $1,000, these protocols intervene, supplementing the deficit by minting governance tokens. In essence, if a user seeks to redeem their 2,000 stablecoins, they would receive assets comprising $1,000 worth of ETH and another $1,000 worth of the protocol's governance tokens.
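A minimal sketch of this redemption path follows, under the simplifying assumption that any shortfall is paid out in freshly minted governance tokens at their current market price; the names and parameters are illustrative, not Celo's or Fei's actual implementation:

```python
def redeem_with_governance_backstop(stablecoins: float, reserve_eth: float,
                                    eth_price: float, gov_price: float):
    """Redeem stablecoins at $1 each; any shortfall beyond the ETH reserve is covered
    by minting governance tokens at their current market price."""
    owed_usd = stablecoins * 1.0
    eth_paid = min(reserve_eth, owed_usd / eth_price)        # pay out from the ETH reserve first
    shortfall_usd = max(0.0, owed_usd - eth_paid * eth_price)
    gov_minted = shortfall_usd / gov_price                   # dilution of governance token holders
    return eth_paid, gov_minted

# 2,000 stablecoins backed by 1 ETH; ETH has fallen to $1,000 and the governance token trades at $2.
eth_out, gov_out = redeem_with_governance_backstop(2000, 1.0, 1000, 2.0)
print(eth_out, gov_out)  # 1.0 ETH (worth $1,000) plus 500 newly minted governance tokens (worth $1,000)
```

The obvious fragility, discussed below, is that the backstop only works as long as the governance token itself retains value.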
Among the projects employing this methodology are Celo and Fei Protocol.
Celo
Since its launch in 2020, Celo has been a noteworthy presence in the stablecoin space. Originally operating as an independent Layer1 blockchain, it undertook a pivotal change in strategy. In a proposal initiated by its core team and passed this July, Celo began its transition to the Ethereum ecosystem, leveraging the capabilities of the OP stack. Here’s a deep dive into how Celo’s stablecoin mechanics function:
Celo’s stablecoin is backed by a reserve pool consisting of a diversified set of underlying assets. The reserve ratio (the value of reserve assets divided by the circulating market cap of stablecoins) consistently stays well above 1, providing fundamental support for the intrinsic value of its stablecoin.The process of minting Celo’s stablecoins doesn’t involve over-collateralization. Instead, they are minted by sending $Celo to the official stability module, Mento. Here’s how it works: Users send 1 USD worth of CELO to the Mento and receive 1 USD worth of stablecoins like $cUSD. The process works seamlessly in reverse as well, allowing users to exchange $cUSD for an equivalent amount in $CELO. Under this mechanism, if the market price of cUSD falls below $1, opportunistic traders will buy $cUSD at this discounted rate to exchange for $CELO each valued at one dollar. Similarly, when the price of $cUSD exceeds 1 USD, traders will opt to mint $cUSD by sending $CELO and then sell it, capturing the profit. This continuous arbitrage mechanism acts as a self-correcting tool, ensuring that the market price of $cUSD rarely strays far from its intended peg.Celo has meticulously put into place a triad of mechanisms to ensure the continued solvency of its reserve pool: 1. If the reserve ratio slips below a predetermined threshold, newly minted $CELO from block rewards is funneled into the reserve pool to ensure the reserve remains adequately funded. 2. Though presently inactivated, a transaction fee could be levied to supplement the pool. 3. The Mento module charges a stability fee, acting as yet another tool to help the reserve pool maintain its financial health.To enhance the security of the reserve funds, Celo’s asset portfolio is diversified. As of now, the pool incorporates $CELO, $BTC, $ETH , $DAI, and an environmentally-conscious choice, the carbon credit token $cMCO2. Such diversification stands in stark contrast to projects like Terra, where the native token $LUNA singularly shoulders the responsibility as the primary collateral for its stablecoin.
Source: Mint Ventures
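To make the Mento-style arbitrage loop concrete, here is a minimal sketch with illustrative prices and names; it is not Mento's actual interface, and it ignores fees and slippage:

```python
def arbitrage_cusd(cusd_market_price: float, trade_size_usd: float) -> float:
    """Sketch of the peg arbitrage: mint/redeem at exactly $1 of CELO per cUSD,
    trade on the open market at the prevailing price, and pocket the difference."""
    if cusd_market_price < 1.0:
        # Buy discounted cUSD on the market, redeem each one for $1 worth of CELO.
        cusd_bought = trade_size_usd / cusd_market_price
        return cusd_bought * 1.0 - trade_size_usd
    elif cusd_market_price > 1.0:
        # Mint cUSD with $1 of CELO each, then sell it on the market above the peg.
        cusd_minted = trade_size_usd / 1.0
        return cusd_minted * cusd_market_price - trade_size_usd
    return 0.0

print(round(arbitrage_cusd(0.98, 10_000), 2))  # ~204.08 profit; buying pressure pushes cUSD back toward $1
print(round(arbitrage_cusd(1.02, 10_000), 2))  # 200.00 profit; selling pressure pushes cUSD back toward $1
```

Either branch is profitable only while the price deviates from $1, which is what keeps the peg tight in normal conditions.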
It’s evident that Celo shares similarities with Luna, operating as a stablecoin-centric Layer1 blockchain. Its minting and redemption mechanisms are also closely aligned with Luna’s $UST. The main distinction lies in its approach during potential under-collateralization scenarios: Celo primarily utilizes block-produced $CELO as protocol collateral to ensure the redemption of its stablecoin $cUSD.
Source: https://reserve.mento.org/
With a total reserve holding of $116 million against a $46 million market cap in circulating stablecoins, Celo boasts a substantial over-collateralization ratio of 254%. Despite the system’s over-collateralized state, users can effortlessly swap $CELO valued at $1 into one $cUSD stablecoin, demonstrating the excellent capital efficiency of Celo. However, it’s worth noting that half of its collateral originates from the centralized stablecoin $USDC and the semi-decentralized $DAI, which means Celo cannot be considered a fully decentralized stablecoin.
At present, Celo’s stablecoin ranks 16th among decentralized stablecoins in terms of market cap. However, if we set aside UST and flexUSD (owing to their depegging), $CELO ascends to the 14th position.
Sourcehttps://defillama.com/stablecoins?backing=CRYPTOSTABLES&backing=ALGOSTABLES
Fei
In early 2021, Fei Protocol, which attracted significant market attention due to its then-trendy algorithmic stablecoin concept, raised $19 million from VCs such as a16z and Coinbase. At its genesis launch at the end of March 2021, it drew in 639,000 $ETH to mint its stablecoin, $FEI, pushing the circulating supply to a massive 1.3 billion $FEI. In a short span, $FEI became the second-largest decentralized stablecoin, behind only $DAI, which then had a market cap of around $3 billion.
However, demand during the genesis phase was driven primarily by the allure of Fei Protocol's governance token, $TRIBE, and far exceeded any genuine need for the stablecoin itself. The resulting $FEI oversupply, combined with the token's nascent stage and lack of robust use cases, kept its price below $1 for an extended period. The market turmoil of May 2021 then triggered panic redemptions of $FEI, leaving the protocol struggling essentially ever since its launch.
To get back on track, Fei Protocol unveiled its V2 version at the end of 2021. The refreshed blueprint brought notable changes, including a refined price stability mechanism. The revamped model allowed users to mint $FEI by depositing collateral such as $ETH, $DAI, and $LUSD at a 100% collateralization ratio. Post-minting, these assets were merged into the Protocol Controlled Value (PCV). When the collateralization ratio (PCV / total circulating $FEI) is above 100%, indicating healthy appreciation of protocol assets and stress-free $FEI redemption, the protocol mints some $FEI to purchase $TRIBE, thereby reducing the collateralization ratio. Conversely, when the ratio falls below 100%, signaling a potential default on $FEI redemption, the protocol mints $TRIBE to buy back $FEI, thereby increasing the collateralization ratio.
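The following is a stylized sketch of this ratio-based rebalancing rule; the step size, names, and market-order execution are assumptions for illustration, not Fei's actual contract logic:

```python
def rebalance(pcv_usd: float, fei_supply: float, step_usd: float) -> str:
    """Apply a Fei V2-style rule based on the collateralization ratio (PCV / circulating FEI)."""
    ratio = pcv_usd / fei_supply
    if ratio > 1.0:
        # Over-collateralized: mint FEI and spend it buying TRIBE, which lowers the ratio.
        return f"ratio={ratio:.2f}: mint {step_usd:,.0f} FEI to buy TRIBE"
    elif ratio < 1.0:
        # Under-collateralized: mint TRIBE, sell it for FEI, and retire that FEI to raise the ratio.
        return f"ratio={ratio:.2f}: mint TRIBE to buy back {step_usd:,.0f} FEI"
    return f"ratio={ratio:.2f}: no action"

print(rebalance(pcv_usd=1_100_000, fei_supply=1_000_000, step_usd=10_000))
print(rebalance(pcv_usd=900_000, fei_supply=1_000_000, step_usd=10_000))
```

The second branch is where the governance token explicitly absorbs the shortfall, which is the crux of the next paragraph.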
Under this mechanism, the governance token $TRIBE acts as a reserve for potential risks, also reaping additional rewards during system growth (a mechanism somewhat similar to the Float Protocol, which launched alongside Fei V1). Regrettably, the launch of Fei V2 coincided with the peak of the bull market. Subsequently, the price of ETH began to fall, and FEI Protocol unfortunately suffered a hack in April 2022, losing 80 million $FEI. Ultimately, the decision was made to cease protocol development in August 2022.
Decentralized reserve protocols that utilize governance tokens as reserves essentially dilute the stake of all governance token holders to ensure stablecoin redemption. During a bull market, as the stablecoin's market cap increases, the value of the governance token tends to rise as well, easily creating a flywheel effect. However, in a bear market, as the value of the collateral assets decreases, the market cap of the governance token also falls. Excessive minting can then exacerbate the price drop, leading to a “death spiral” for the governance token. If the governance token's market cap falls below that of the total circulating stablecoins, holders' doubts about the protocol's capacity to redeem the stablecoin will grow, accelerating the exodus and causing a systemic collapse. Surviving the bear market therefore becomes pivotal for these stablecoins. Celo's survival through the bear market is closely tied to its over-collateralization strategy: during the previous market high, Celo allocated a relatively large portion of its reserves to assets like $USDC, $DAI, $BTC, and $ETH, enabling the protocol to remain secure even as the price of $CELO fell from $10 to $0.5.
Decentralized Reserve Protocols: Embracing Risk Hedging for Stability
This type of protocol, sometimes called a “risk-neutral” stablecoin protocol, hedges the risk of its reserve assets (usually cryptocurrencies). When the price of the collateral assets plummets, the hedge generates profits, ensuring that the stablecoin protocol's assets can always cover its liabilities. We refer to this type of protocol as a decentralized reserve protocol with reserve-asset risk hedging, or a risk-neutral stablecoin protocol. If you were to deposit 1 $ETH valued at $2000 into one of these protocols, it would, in anticipation of potential market volatility, hedge this asset, for example by opening a short position on a crypto exchange. If $ETH then drops from $2000 to $1000, the $1000 profit generated by the hedge compensates for the fall in collateral value, ensuring that the protocol can still redeem the 2000 stablecoins held by the user.
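A minimal sketch of why a fully hedged reserve stays solvent regardless of price, assuming a frictionless short of equal notional (no fees, funding, or margin mechanics); the names are illustrative:

```python
def hedged_reserve_value(eth_amount: float, entry_price: float, current_price: float) -> float:
    """USD value of an ETH reserve plus a short position of equal notional opened at entry_price."""
    spot_value = eth_amount * current_price
    short_pnl = eth_amount * (entry_price - current_price)   # the short gains exactly what the spot loses
    return spot_value + short_pnl

# 1 ETH deposited at $2,000 backs 2,000 stablecoins; the hedged value never moves.
for price in (2000, 1500, 1000, 500):
    print(price, hedged_reserve_value(1.0, 2000, price))      # always 2000.0
```

The practical difficulty, as the rest of this section shows, lies entirely in where and how the short leg is obtained.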
Specifically, depending on the actual hedger, these are further divided into decentralized reserve protocols where the protocol itself hedges the risk, and decentralized reserve protocols where the users hedge the risk.
Decentralized Reserve Protocols with Protocol-Driven Risk Hedging
Stablecoin protocols adopting this approach include Pika Protocol V1, UXD Protocol, and Ethena, which recently announced its financing.
Pika V1
Pika Protocol is a derivatives protocol deployed on the Optimism network, but in its initial V1 version, Pika had planned to launch a stablecoin, with hedging achieved through BitMEX's inverse perpetuals. An original creation of BitMEX, inverse perpetuals stand apart from the linear perpetuals that saturate the market. Unlike the popular “linear perpetual”, which uses USD (or a USD stablecoin) as margin to track crypto prices, an inverse perpetual has a distinct architecture: it uses the token itself as margin to track that token's USD price. Here's an example from BitMEX illustrating the payoff of an inverse perpetual:
A trader goes long 50,000 contracts of XBTUSD at a price of 10,000. A few days later the price of the contract increases to 11,000. The trader's profit will be: 50,000 * 1 * (1/10,000 – 1/11,000) = 0.4545 XBT. If the price had in fact dropped to 9,000, the trader's loss would have been: 50,000 * 1 * (1/10,000 – 1/9,000) = -0.5556 XBT. The loss is greater because of the inverse and non-linear nature of the contract. Conversely, if the trader was short then the trader's profit would be greater if the price moved down than the loss if it moved up.
Sourcehttps://www.bitmex.com/app/inversePerpetualsGuide
Upon closer analysis, it's evident that the inverse perpetual is a perfect match for decentralized reserve protocols that aim to hedge the risk of their reserve assets. Taking our previous example, let's assume that when ETH is priced at $2000, Pika Protocol, on receiving 1 ETH from a user, deploys it as margin to short 2000 contracts of the ETH inverse perpetual on BitMEX. Should the ETH price slide to $1000, Pika Protocol's profit is: 2000 * 1 * (1/1000 – 1/2000) = 1 ETH = $1000
This means that when the $ETH price falls from $2000 to $1000, Pika Protocol's reserves grow from 1 $ETH to 2 $ETH, still worth $2000 and thus covering the redemption of the 2000 stablecoins held by users (transaction fees and funding fees are not considered here). Pika Protocol V1's product design is entirely in line with the NUSD design described by BitMEX founder Arthur Hayes in his blog post: a coin-margined spot position paired with an inverse-perpetual short can perpetually and perfectly hedge the long exposure.
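The arithmetic above can be generalized with the inverse-perpetual PnL formula quoted from BitMEX earlier (contracts × multiplier × (1/entry − 1/exit)); the sketch below is illustrative only and ignores fees, funding, and maintenance margin:

```python
def inverse_perp_short_pnl_eth(contracts_usd: float, entry_price: float, exit_price: float) -> float:
    """PnL in ETH for a short inverse perpetual: contracts * 1 * (1/exit - 1/entry)."""
    return contracts_usd * (1 / exit_price - 1 / entry_price)

def hedged_reserve_eth(deposit_eth: float, entry_price: float, exit_price: float) -> float:
    """ETH reserve after shorting an inverse perp whose notional equals the deposit's USD value at entry."""
    notional_usd = deposit_eth * entry_price
    return deposit_eth + inverse_perp_short_pnl_eth(notional_usd, entry_price, exit_price)

for exit_price in (2000, 1000, 4000):
    eth = hedged_reserve_eth(1.0, 2000, exit_price)
    print(exit_price, round(eth, 4), round(eth * exit_price, 2))   # the USD value stays at 2000.0
```

At $1000 the reserve grows to 2 ETH, at $4000 it shrinks to 0.5 ETH, but in every case its USD value remains exactly the $2000 owed to stablecoin holders.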
Regrettably, characterized by their counterintuitive, non-linear return profile (the relationship between the token price and the contract's payoff is not linear), inverse perpetuals are not easily understood by average investors accustomed to USD-margined products. Fast forward to the present, and it's clear that inverse perpetuals, also known as coin-margined perpetual contracts, have struggled to find their footing in a market dominated by their more accessible counterparts – linear, or USD-margined, perpetual contracts. A snapshot of major exchanges reveals a stark contrast: inverse perpetuals account for a mere 20-25% of trading volume compared with linear perpetuals. Affected by regulation, BitMEX has gradually receded from being a top-tier perpetual exchange to one with less than 0.5% of the current perpetual market share. Recognizing the limitations of linear perpetual contracts for its hedging strategy and the dwindling market appetite for inverse perpetuals, Pika embarked on a strategic pivot: in its V2 version, Pika abandoned the stablecoin business and officially shifted to a derivatives protocol.
UXD
UXD Protocol operates on the Solana network and was launched in January 2022. In 2021, UXD closed a $3 million funding round led by Multicoin and raised $57 million in its IDO. In January 2023, UXD decided to expand to the Ethereum ecosystem, launching on Arbitrum in April, with plans to debut on Optimism in the near future.
Upon its initial launch, UXD Protocol allowed users to deposit $SOL, $BTC, and $ETH to mint its stablecoin, UXD, on a 1:1 value basis pegged to the USD. To maintain the integrity and redemption of UXD, the deposited collaterals were hedged using short positions through Solana’s renowned lending and perpetual platform, Mango Markets. The funding fees collected from the short positions would serve as protocol revenue, while funds raised by the protocol would cover any payable funding fee costs. For an extended period post-launch, the UXD Protocol functioned smoothly. The protocol even had to impose a cap on UXD issuance due to Mango Markets’ overall open interest being under the $100 million mark. A surge in UXD’s short positions, especially if they escalated into the tens of millions, could pose tangible redemption threats. Additionally, an excess of short positions could tilt the funding rate towards becoming negative, subsequently elevating hedging costs.
Unfortunately, in October 2022, Mango Markets suffered a governance attack, leading to a nearly $20 million loss for UXD Protocol. At the time, UXD's insurance fund had a balance of over $55 million, so redemptions of UXD continued normally. Although Mango Markets later returned the funds to UXD Protocol, the damage was done, and Mango Markets struggled to recover. The situation was exacerbated by the FTX crisis, which triggered a rapid capital outflow from Solana, and UXD could no longer find suitable venues to hedge its collateral positions. As a result, the protocol limited its collateral support to just $USDC, a stablecoin that inherently does not require hedging, and channeled the deposited $USDC into various on-chain vaults and Real-World Assets (RWA). Recognizing the need to diversify and mitigate risks, the team made the strategic decision to expand into the Ethereum ecosystem: following a successful launch on Arbitrum in April, it plans to deploy on Optimism. Amidst this expansion, UXD remains on the hunt for suitable on-chain hedging venues to fortify its operations.
Currently, UXD has a circulating supply of $14.3 million, with a protocol insurance fund balance of $53.2 million.
Source: https://dashboard.uxd.fi/
Ethena Finance, a stablecoin protocol that recently announced $6 million in funding led by Dragonfly, with investments from centralized exchanges such as Bybit, OKX, Deribit, Gemini, and Huobi, likewise hedges the risk of its reserve assets. The backing of numerous second-tier derivatives exchanges will likely benefit its collateral-hedging efforts. Furthermore, Ethena has set its sights on collaborating with Synthetix, a renowned decentralized derivatives protocol, aiming to open short positions with Synthetix as a liquidity provider and to diversify the utility of its stablecoin, USDe, by enabling it to serve as collateral within select pools.
The merits of such decentralized reserve protocols with risk-hedging strategies are evident. By hedging the crypto assets that serve as collateral, the protocol can achieve a risk-neutral position, ensuring stablecoin redemptions. This approach melds the promise of 100% capital efficiency with the trustless nature of decentralization—a match largely dependent on the quality of hedging venues. Furthermore, if the protocol can hedge its position with high capital efficiency, the reserved collateral can be channeled into multiple avenues to yield returns. Additionally, funding fees can serve as protocol revenue. This flexibility allows for multiple possibilities: the generated returns could be distributed to stablecoin holders, creating an interest-bearing stablecoin and enhancing its utility. Alternatively, the profits could also be distributed to governance token holders.
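As a rough illustration of how funding fees on the short leg could translate into yield for an interest-bearing stablecoin, here is a simplified sketch; the funding rate, payment interval, and revenue split are assumptions for illustration, not any protocol's published parameters:

```python
def annualized_funding_yield(funding_rate_per_8h: float, holder_share: float) -> float:
    """Approximate APR passed to stablecoin holders from perpetual funding payments,
    assuming the short notional equals the stablecoin supply and funding accrues every 8 hours."""
    payments_per_year = 3 * 365
    protocol_apr = funding_rate_per_8h * payments_per_year
    return protocol_apr * holder_share

# Example: +0.01% funding per 8h paid to shorts, with 80% of the revenue routed to holders.
print(f"{annualized_funding_yield(0.0001, 0.8):.2%}")   # ~8.76% APR under these assumptions
```

The same arithmetic cuts both ways: if funding turns persistently negative, the protocol pays rather than earns, which is one of the scaling hurdles listed below.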
The governance token of a stablecoin protocol inherently acts as the lender of last resort for its stablecoin. In protocols where reserve-asset risk is hedged, the governance token can still serve as a backstop for redemptions in extreme circumstances, giving stablecoin holders an extra protective layer compared with stablecoins backed solely by governance tokens. Conceptually, hedging reserve assets is also a simpler mechanism that should remain resilient across market cycles, removing the need to rely on the governance token holding its value during downturns.
But as with all innovations, there are hurdles on the horizon. Here’s a breakdown:
- The Centralization Paradox. The current financial landscape sees centralized exchanges reigning supreme in perpetual contract liquidity, and most decentralized derivatives exchanges are not designed for stablecoin protocols to hedge on, leading to an inevitable centralization risk. This risk is twofold: centralized exchanges carry inherent counterparty risk, and with limited hedging venues, any single venue becomes crucial to a protocol's health—a hiccup in one venue can have a substantial impact on the protocol. UXD Protocol's halt in operations after the attack on Mango Markets is an extreme example of such centralization risk.
- Limitations in Choosing Hedging Tools. The current mainstream linear perpetual contracts cannot perfectly hedge these protocols' long positions. Taking $ETH as an example, a stablecoin protocol would ideally hedge with ETH-denominated short positions that use ETH itself as collateral. However, prominent linear perpetual contracts lean on $USDT as collateral and map their profit curves to USD, which cannot perfectly hedge an $ETH position. Even if the stablecoin protocol borrows $USDT against its $ETH, this increases operational costs and the complexity of position risk management, while also reducing capital efficiency. Pika Protocol's experience underscores how inverse perpetual contracts might be the ideal instrument for these protocols; sadly, their market footprint remains undersized (see the sketch after this list).
- Scale Growth Has Inherent Limitations. As a protocol's stablecoin market cap grows, so does its demand for perpetual short positions for hedging. Beyond the difficulty of obtaining a sufficient number of short positions, the more short positions the protocol holds, the greater the liquidity required from counterparties when closing them, and the more likely funding rates turn negative, implying higher hedging costs and operational complexity. For a stablecoin in the tens of millions of dollars this might not pose significant challenges; at hundreds of millions or billions, it can significantly cap growth.
- Operational Risks. Regardless of the hedging mechanism employed, there is inevitably a high frequency of opening positions, adjusting portfolios, and managing collateral. These processes require manual intervention, introducing significant operational and even moral risks.
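To illustrate the hedging-tool limitation above, the sketch below compares the margin balance of an ETH-margined inverse short against a USDT-margined linear short as the price moves. It is a simplification under frictionless assumptions (no fees, funding, or maintenance margin), and the $500 initial USDT margin is an arbitrary illustrative figure:

```python
def inverse_margin_usd(deposit_eth: float, entry: float, price: float) -> float:
    """ETH-margined inverse short with notional = deposit * entry.
    The margin balance in ETH is deposit + notional*(1/price - 1/entry); its USD value stays constant."""
    notional = deposit_eth * entry
    margin_eth = deposit_eth + notional * (1 / price - 1 / entry)
    return margin_eth * price

def linear_margin_usdt(initial_margin_usdt: float, size_eth: float, entry: float, price: float) -> float:
    """USDT-margined linear short of size_eth: the USDT margin drains one-for-one as the price rises."""
    return initial_margin_usdt + size_eth * (entry - price)

for price in (1500, 2000, 2500, 3000):
    print(price,
          round(inverse_margin_usd(1.0, 2000, price), 2),        # always 2000.0, no top-up needed
          round(linear_margin_usdt(500, 1.0, 2000, price), 2))   # hits zero at $2500 without a top-up
```

The inverse short is self-contained: the deposited ETH is the margin, and its USD value never falls below the stablecoin liability. The linear short, by contrast, needs a separate USDT margin that must be sourced and actively topped up whenever the price rises, which is exactly the operational burden described above.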
Decentralized Reserve Stablecoin Protocols with User-Driven Risk Hedging
Protocols that have adopted this approach include Angle Protocol V1 and Liquity V2.
Angle V1
Angle Protocol was launched on the Ethereum network in November 2021. Before launch, the team closed a $5 million financing round led by a16z.
While the details of the Angle Protocol V1 are extensively discussed in a report by Mint Ventures, here we provide a brief overview:
Like other decentralized reserve stablecoin protocols, under ideal conditions Angle allows users to mint its stablecoin, agUSD, by depositing $1 worth of collateral in the form of $ETH per stablecoin. What differentiates Angle is its target audience: beyond the archetypal stablecoin users, Angle Protocol has a dedicated offering for perpetual contract traders, who within the Angle ecosystem are termed Hedging Agencies (HA).
Note: Angle’s first stablecoin was pegged to the Euro, termed agEUR, the logic remains the same. For the sake of consistency in this context, we’ll use the USD-pegged stablecoin as an example. 
Using the same example, suppose ETH is currently valued at $2000. A user deposits 1 $ETH into Angle and mints 2000 USD-pegged stablecoins. Concurrently, Angle opens up a hedgeable position of 1 $ETH for traders. Assuming the Hedging Agency (HA) posts 0.2 $ETH (worth $400) as margin and opens a position with 5x leverage, the protocol's total collateral amounts to 1.2 $ETH. In monetary terms, that is $2400 of assets against a liability of $2000 in stablecoins.
When the price of $ETH rises to $2200, the protocol only needs to retain enough $ETH to back the $2000 of stablecoins, which is approximately 0.909 $ETH. The remaining 0.291 $ETH (valued at $640) can be withdrawn by the HA.
Conversely, if $ETH drops to $1800, the protocol still has to maintain enough $ETH to back the $2000 of stablecoins, which would be around 1.111 $ETH. The HA's collateral position would then shrink to about 0.089 $ETH (worth $160).
For traders within the Angle Protocol, the engagement is fundamentally analogous to holding a long position in an $ETH-margined perpetual. When the price of $ETH rises, they benefit both from the appreciation and from the surplus $ETH in the protocol (in the example above, a 10% rise in the price of $ETH produced a 60% gain for the trader). Yet the inverse is just as pronounced: a 10% drop in ETH's price wipes out 60% of the trader's position. From the protocol's perspective, these traders serve as an insulating barrier against the volatility and depreciation of the collateral, which is what earns them the title of “Hedging Agency”. The leverage available to traders is determined by the ratio between the protocol's collateral position to be hedged (1 ETH in this example) and the margin posted by the HA (0.2 ETH in this example), i.e. 5x here.
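The worked numbers above can be reproduced with a small sketch; the split rule below is a simplification of Angle V1's design (the protocol keeps exactly enough ETH to back the stablecoins, the HA keeps the rest) and ignores fees and the SLP buffer:

```python
def ha_position_eth(total_collateral_eth: float, stablecoin_liability_usd: float, eth_price: float) -> float:
    """ETH left to the Hedging Agency after the protocol reserves enough ETH to back its stablecoins."""
    protocol_needs_eth = stablecoin_liability_usd / eth_price
    return max(0.0, total_collateral_eth - protocol_needs_eth)

# 1 ETH from the stablecoin minter plus 0.2 ETH of HA margin, against a $2,000 liability.
for price in (2200, 2000, 1800):
    ha_eth = ha_position_eth(1.2, 2000, price)
    print(price, round(ha_eth, 3), round(ha_eth * price))
# $2,200 -> ~0.291 ETH (~$640): +60% on the HA's $400 margin
# $2,000 ->  0.200 ETH ($400): break-even
# $1,800 -> ~0.089 ETH (~$160): -60% on the HA's $400 margin
```

The convexity of this payoff is why the HA's role is economically equivalent to a leveraged long, and why the protocol's solvency depends on enough traders being willing to hold it.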
For perpetual contract traders, Angle offers the following advantages: 
- No funding fees: traditional centralized platforms require long-position holders to pay funding fees to their short-side counterparts; Angle charges none.
- Slippage-free execution at oracle prices.
In this way, Angle attempted to create a win-win scenario for stablecoin holders and perpetual contract traders alike: stablecoin holders benefit from high capital efficiency and decentralization, while traders enjoy an enhanced trading experience.
In practice, there may be periods when no traders are willing to open long positions. For such cases, Angle introduced the Standard Liquidity Provider (SLP), which supplies additional collateral (stablecoins) to secure the protocol while earning yield, transaction fees, and governance token ($ANGLE) rewards.
However, Angle’s real-world performance hasn’t been optimal. Even though traders receive a large amount of $ANGLE as a reward, the protocol’s collateral isn’t fully hedged most of the time. The core issue is that Angle hasn’t offered a product compelling enough for traders. As the price of the $ANGLE token declined, the TVL (Total Value Locked) of Angle Protocol also plummeted from its initial $250 million to around $50 million.
Hedging Ratio for the USDC Pool, the Primary Collateral for Angle Stablecoins
Source: https://analytics.angle.money/core/EUR/USDC
Source: https://defillama.com/protocol/angle
In March 2023, the interest-bearing reserve assets of Angle unfortunately fell victim to a hacker attack involving Euler. Although the hacker eventually returned the stolen assets, the incident severely impacted Angle’s momentum. By May 2023, Angle announced the discontinuation of Angle Protocol V1, and introduced plans for V2. Angle Protocol V2 shifted to a traditional over-collateralized model and was launched in early August. 
Liquity V2
Since its launch in March 2021, LUSD, issued by Liquity, has become the third-largest decentralized stablecoin in the market (behind DAI and FRAX) and the largest fully decentralized stablecoin. For a deeper dive into Liquity, our comprehensive reports released in July 2021 and April 2023 shed light on the mechanics of Liquity V1 as well as its subsequent product updates and use-case expansion.
The Liquity team is proud of LUSD's accomplishments in achieving both decentralization and price stability. Yet on the metric of capital efficiency, Liquity's performance has been somewhat mediocre: historical data indicates that, since its inception, Liquity's system collateralization ratio has consistently hovered around 250%, implying that every circulating LUSD is backed by $2.5 worth of ETH collateral.
Source: https://dune.com/liquity/liquity
On July 28th, Liquity officially unveiled the features of its V2, with the heart of this upgrade being the inclusion of LSD as an eligible collateral type. However, the standout feature remains its ambitious claim of achieving high capital efficiency via delta-neutral hedging across the entire protocol.
Currently, Liquity has not publicly released detailed product documentation. The available information on V2 primarily originates from founder Robert Lauko’s talk at ETHCC, previous introductory articles released by Liquity, and discussions on Discord. Our subsequent summary is primarily based on these sources.
In terms of product logic, mirroring the structure of Angle V1, Liquity’s V2 has a foundational premise to onboard traders to execute leveraged operations on its platform. These traders’ margins then function as supplementary collateral for the protocol, thereby strategically hedging the protocol’s overarching risk spectrum. For the traders, Liquity offers an attractive trading product.
Diving into the specifics, Liquity has rolled out two innovations. The first is aptly termed the “Principal Protected Leverage Position”. It gives traders access to a leveraged trading product in which their principal is safeguarded: upon paying a designated premium, users unlock insurance against significant ETH price downturns, ensuring they can still recover a predetermined amount of USD. Illustrating this with a scenario laid out in Liquity’s blog: with ETH at $1,000, an investor putting in 12 ETH (10 ETH of principal plus a 2 ETH premium) can open a 2x position on the initial 10 ETH, coupled with downside protection. In other words, if the ETH price doubles, the 2x leveraged position takes effect and the user receives a total of 40 ETH; if the ETH price falls, the purchased put option kicks in, allowing the user to withdraw their initial $10,000 (10 * 1000).
Source: https://www.liquity.org/blog/introducing-liquity-v2
Liquity’s product innovation clearly seeks to iterate and expand on the foundational concepts introduced by Angle Protocol, with the spotlight being on the “principal protection”. While Liquity remains tight-lipped about the detailed mechanisms, piecing together insights from their design drafts and lively chatter on Discord,  this “principal protection” seems very much akin to a call option.
Liquity believes that this combined product, thanks to its principal protection, will be highly attractive to traders. The call option gives traders the dual advantage of leveraged returns when prices rise and a preserved principal when prices fall. From the trader’s perspective, this is potentially more enticing than Angle’s plain leveraged trading offer, though how enticing it is will depend on how Liquity prices the premium. From the protocol’s perspective, the premium paid by users serves as a safety buffer: when the ETH price falls, Liquity can use the premium as supplementary collateral to repay stablecoin holders; when the price rises, the appreciation of Liquity’s collateral can be distributed to contract traders as profit.
An evident challenge lurks within this design: when traders close their positions early and withdraw their ETH, Liquity faces a dilemma. It wants to grant traders the freedom to exit at will, yet every such withdrawal reduces the protocol’s hedged proportion, introducing fragility as chunks of “collateral” are pulled out. In fact, a similar problem arose in the practical operation of Angle Protocol, whose hedging rate remained low for extended periods, indicating that traders weren’t sufficiently hedging the protocol’s overall position.
To tackle this challenge, Liquity introduced the second innovation: its secondary marketplace to subsidize positions.
This means that within Liquity V2, leveraged trading positions (tokenized as NFTs) can not only be opened and closed like regular leveraged positions but can also be sold on a secondary market. In reality, Liquity’s primary concern is traders liquidating their positions, as this could reduce the protocol’s hedging ratio. When a trader wants to close a position, if another trader is willing to purchase it on the secondary market at a price higher than its inherent value, the original trader benefits from receiving more cash, which results in a win-win. For Liquity, even though this “inherent value” is subsidized by the protocol, a relatively small subsidy can maintain the overall system’s hedging ratio. As a result, at a relatively low cost, the protocol’s security is enhanced.
Source: https://www.liquity.org/blog/introducing-liquity-v2
Imagine a scenario where Alice opens a position with 10 ETH, at an ETH market price of $1000, and she pays a premium of 2ETH. This position represents a leveraged long of 10 ETH with principal protection. However, if the price of ETH then falls to $800, the $12,000 worth of ETH that Alice initially invested can now only be swapped for 10 ETH (valued at $8000). At this juncture, Alice can either directly liquidate her position to get the 10 ETH ($8000) or sell this position on the secondary market for a price between $8000 and $12000. From Bob’s perspective, a potential buyer, purchasing Alice’s position would be akin to buying ETH at $800 and obtaining a call option with a strike price of $1000. This option certainly has value, ensuring Alice can get a price higher than $8000 for her position. For Liquity, if Bob acquires Alice’s position, the protocol’s collateralization remains unchanged since the premium is still retained within the protocol’s pool. If no one, like Bob, steps up to buy Alice’s position promptly, Liquity will gradually increase the value of Alice’s position over time (though the exact method isn’t specified, mechanisms like lowering the strike price or increasing the quantity of the call option could enhance the position’s value). The subsidized amount would be sourced from the protocol’s premium pool. However, this could slightly decrease Liquity’s overall over-collateralization. Liquity believes that not all positions would require protocol subsidization, and even if they do, it might not entail subsidizing a significant portion of the position. Thus, subsidizing the secondary market can effectively maintain the protocol’s hedging ratio.
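A minimal sketch of the resale economics in Alice’s example. It takes the floor to be the position’s direct liquidation value and the ceiling to be Alice’s original outlay, as in the range quoted above; it does not attempt to price the embedded call option, and the function and figures are illustrative only.

```python
def resale_price_bounds(eth_in_position: float, spot_price: float,
                        original_outlay_usd: float) -> tuple:
    """Lower bound: what Alice gets by liquidating directly (the intrinsic value).
    Upper bound used in the example: her original outlay, since the embedded
    call option lets a buyer rationally pay something above intrinsic value."""
    intrinsic_value = eth_in_position * spot_price
    return intrinsic_value, original_outlay_usd


# Alice's example: 10 ETH position, ETH now at $800, original outlay of $12,000.
low, high = resale_price_bounds(10, 800, 12_000)
print(low, high)  # 8000 12000 -> the $8,000-$12,000 range quoted above
```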
Lastly, despite these innovations, it might still be challenging to entirely address the liquidity shortfall in extreme scenarios. Liquity would also adopt a mechanism similar to Angle’s standard liquidity provider system as a final supplement. This could possibly involve the protocol allowing users to deposit some V1 LUSD into a stability pool, which would act as a backup to support V2 LUSD redemptions when the crypto waters get choppy.
Liquity V2 is planned to launch in the second quarter of 2024.
Overall, Liquity V2 shares many similarities with Angle V1 but has also introduced targeted improvements in response to the challenges faced by Angle. They’ve innovated with the “principal protection” feature, offering a more appealing product for traders. They’ve also introduced a “protocol-backed subsidized secondary market” to protect the protocol’s overall hedging ratio.
However, at its core, Liquity V2 remains akin to the Angle Protocol. It represents an attempt to branch out and craft an innovative derivatives product that, in turn, supports their stablecoin operation. While Liquity’s proficiency in the stablecoin domain is well-established, it remains to be seen whether they can successfully design a top-tier derivative, achieve Product-Market Fit (PMF), and promote it effectively.
Conclusion
The prospect of a decentralized reserve protocol that achieves decentralization, high capital efficiency, and price stability is certainly exhilarating. However, a sophisticated and well-thought-out mechanism design is just the first step for a stablecoin protocol. What’s even more crucial is the expansion of stablecoin use cases. At present, the stride of decentralized stablecoins in proliferating use cases has been somewhat languid, with most pigeonholed into a singular role: indispensable tools for liquidity mining. But the lure of mining incentives isn’t eternal.
In a provocative turn of events, PayPal’s foray with PYUSD has sounded a clarion call for crypto stablecoin protocols: well-known Web2 entities are venturing into the stablecoin arena, suggesting that the window of opportunity for current stablecoins may be narrowing. Indeed, when we discuss the centralization risks of custodial stablecoins, our concerns largely revolve around the potential unreliability of custodians and issuing entities. Silicon Valley Bank, for instance, ranked only 16th among U.S. banks, while Tether and Circle, albeit influential, remain quintessentially “crypto-native”. If a “too big to fail” entity from the traditional financial world, such as JP Morgan, were to issue a stablecoin, the near-sovereign credit backing it could eclipse competitors like Tether and Circle instantly. It also introduces a quandary: if centralized entities can offer robust stability, does the call for decentralization become muted?
With that said, we hope to see decentralized stablecoins diversify their application spectrum, cementing themselves as the Schelling point in the ever-evolving stablecoin narrative. Realizing this, however, is a difficult task.

A Quick Look at the Pioneer of Restaking: Understanding EigenLayer’s Business Logic and Valuation

By Alex Xu, Research Partner at Mint Ventures
Preface
With the completion of the Ethereum Shanghai upgrade, a multitude of liquid staking derivative (LSD) projects have witnessed exponential growth, and the user base and net worth of LSD assets have increased substantially. As the crypto space anticipates the forthcoming Cancun upgrade and the open-sourcing of the OP Stack, 2023 is increasingly heralded as the “Year of Rollups.” Pioneering services anchored to Rollup infrastructure—including the Data Availability (DA) layer, shared sequencers, and Rollups-as-a-Service (RaaS) offerings—are carving out their niche in the market. One standout in this arena is EigenLayer. This avant-garde entity floated the innovative “Restaking” paradigm grounded in LSD assets, with a vision to cater to a diverse array of Rollups and middlewares. Throughout this year, the buzz around EigenLayer has been palpable. Their funding round in March fetched an impressive $50 million at a valuation of $500 million, yet the grapevine suggests that their over-the-counter (OTC) token pricing has climbed to a staggering $2 billion valuation, comparable to the level of public-chain projects.
In the following analysis, we will delve deep into EigenLayer’s business model and provide an exploratory valuation of the project, attempting to answer the following questions:
What are Restaking services, who is their target audience, and what problems do they aim to solve?
What are the challenges to the widespread adoption of the Restaking model?
Is the $500 million—or the speculated $2 billion—valuation for EigenLayer a tad ambitious?
The insights and opinions presented in this article reflect my views as of the publication date primarily through a business lens, with a limited delve into the technical details of EigenLayer, and may contain factual inaccuracies or biases. This article is intended for discussion purposes only, and feedback is welcomed.
The Business Logic of EigenLayer
Before we plunge into the detailed operations of EigenLayer, let’s introduce a few frequently used terms that will appear throughout the text:
Middleware: In the realm of Web3, this term refers to the services that bridge the gap between the core blockchain infrastructure and decentralized applications (Dapps). Quintessential middleware components in this space include oracles, cross-chain bridges, sequencers, Decentralized Identity (DID), and the DA layer, to name a few.
LSD: Stands for Liquid Staking Derivatives, an example being Lido’s stETH.
AVS: Refers to Actively Validated Services. At its core, AVS is a decentralized node system that bestows projects with enhanced security and a guarantee of decentralization. The most iconic representation of AVS is the Proof-of-Stake (PoS) mechanism inherent to public blockchains.
DA: An abbreviation for Data Availability. Projects (like Rollups) can back up their transaction data on the DA layer, ensuring that if required in the future, they can access and restore all historical transaction records.
Business Scope
EigenLayer offers a matching market grounded in cryptoeconomic security.
Cryptoeconomic security is the assurance provided by various Web3 projects. To ensure smooth operation and maintain a permissionless and decentralized nature, the primary service providers of the network, known as validators, are mandated to stake tokens as a gesture of their commitment. If these validators fail to fulfill their obligations, their staked tokens are subject to be slashed.
EigenLayer, as a platform, performs a dual role. At its core, it serves as a nexus for LSD asset holders, pooling their assets; concurrently, it leverages these aggregated LSD assets as collateral to offer convenient and cost-effective AVS services to middlewares, side chains, and Rollups. EigenLayer thus positions itself between LSD providers and those with AVS demands, facilitating a matchmaking service, while dedicated collateral services ensure the safekeeping and security of the staked assets.

Beyond the core services, the parent company behind EigenLayer has offered data availability services to Rollups or application chains that require DA layer services. This product is named “EigenDA”. There’s a synergistic relationship between EigenDA and EigenLayer in their operations.
EigenLayer aims to address the following pain points:
1. For Diverse Blockchain Initiatives: EigenLayer helps reduce the high costs of independently building a trustless network. Instead of doing it all in-house, projects can directly purchase staked assets and node operator services on the EigenLayer platform.

Source: EigenLayer Whitepaper
2. For the Ethereum Ecosystem: EigenLayer enhances the utility of LSD. By positioning $ETH as a preferred security collateral option for multiple projects, EigenLayer bolsters the demand for $ETH within the broader market.
3. For LSD Asset Holders: EigenLayer enhances the capital efficiency of LSD assets, ensuring holders receive better returns on their investments.
The Clientele of EigenLayer
EigenLayer offers a range of services tailored to different parties in the blockchain ecosystem. Let’s break down the primary users and their specific needs:
LSD Asset Providers: Their main goal is to earn more yield from their LSD assets beyond the standard PoS rewards. They are willing to provide their LSD assets as collateral to node operators, even though they face potential slashing.
Node Operators: They obtain LSD assets through EigenLayer to provide node services to projects in need of AVS services. Their revenue is derived from node rewards and transaction fees provided by these projects.
AVS Demanders: These are typically projects that require AVS to ensure their security and are looking to reduce associated costs. Examples might include a particular Rollup or cross-chain bridge that utilizes LSD assets as collateral for node operation. These projects can conveniently purchase AVS services from EigenLayer without the complexities of setting it up themselves.
The primary demand for EigenDA stems from diverse Rollups and application chains.
Operation Mechanism in EigenLayer
EigenLayer provides users with the ability to restake their tokens, which were originally staked on the Ethereum network. This includes a variety of tokens such as stETH, rETH, and cbETH. The staking service providers play a pivotal role in ensuring that these tokens are matched with the appropriate security network demanders, facilitating the provision of AVS services. The primary collateral for AVS consists of the tokens users have staked on EigenLayer. As a gesture of appreciation and compensation, the projects on the receiving end of these services distribute a “security fee” to these stakers.
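As a rough illustration of the restaker’s incentive described above, the sketch below simply stacks the base PoS yield with the security fees from the AVSs a restaked position serves. All rates, and the function itself, are hypothetical placeholders rather than figures published by EigenLayer.

```python
def restaker_gross_apr(base_staking_apr: float, avs_security_fee_aprs: list[float]) -> float:
    """Gross yield for a restaker: the base Ethereum PoS reward plus the
    security fees paid by each AVS the restaked collateral helps secure.
    Slashing risk and operator commissions are ignored in this simplification."""
    return base_staking_apr + sum(avs_security_fee_aprs)


# Hypothetical numbers: ~4% base staking yield plus fees from two AVS engagements.
print(restaker_gross_apr(0.04, [0.015, 0.01]))  # ~0.065 -> about 6.5% gross APR
```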
Advancements in Product Offerings
At present, EigenLayer has rolled out its restaking feature exclusively for LSDs. The platform is still developing node operator staking and LSD-based AVS services. During its two public LSD deposit events, there was significant user turnout, with deposits swiftly reaching their caps; a major driving force behind this overwhelming participation was speculation about potential airdrop rewards from EigenLayer. Additionally, users have the option to deposit in batches of 32 ETH to participate in restaking. Despite these deposit restrictions, EigenLayer has attracted approximately 150,000 staked ETH.

Source: https://app.eigenlayer.xyz/
According to EigenLayer’s officially released roadmap, the primary focus for the ongoing Q3 revolves around the development of the Operator testnet, which is geared towards node operators. The subsequent Q4 is earmarked for the initiation of the AVS service testnet development. 

https://docs.EigenLayer.xyz/overview/readme/protocol-features/roadmap
When it comes to EigenDA, its first confirmed clientele is the rollup project Mantle, which is based on a fork of Optimistic Virtual Machine. Mantle is currently using a test version of EigenDA as its DA.
Tokenomics
EigenLayer confirmed that it will issue tokens, but details regarding the tokenomics and related information remain undetermined and are yet to be unveiled to the public.
Team and Funding Background
The Core Team 

Founder&CEO: Sreeram Kannan
Renowned as an associate professor in the Department of Computer Engineering at the University of Washington, Dr. Kannan also wears the entrepreneurial hat as the driving force behind Layr Labs – the umbrella entity steering EigenLayer’s vision. With an impressive portfolio of over 20 academic papers in the blockchain sector, Dr. Kannan completed his undergraduate studies in telecommunications at the Indian Institute of Science, and later acquired a master’s degree in mathematics and a Ph.D. in information theory and wireless communication from the University of Illinois at Urbana-Champaign. After serving as a postdoctoral researcher at the University of California, Berkeley, he is currently at the helm of the University of Washington Blockchain Lab (UW-Blockchain-Lab), nurturing the next generation of blockchain enthusiasts.

Founder & Chief Strategy Officer: Calvin Liu
Liu graduated from Cornell University with a dual major in philosophy and economics and has built an impressive track record in data analysis, business consulting, and business strategy. He served as head of strategy at Compound for nearly four years before joining EigenLayer in 2022.

COO Chris Dury
A graduate with an MBA from the Stern School of Business at New York University, Chris brings with him extensive experience in cloud service product project management. Before joining EigenLayer, he held the position of Senior Vice President of Product at Domino Data Lab, a machine learning platform. He also served in various leadership roles at Amazon AWS, including General Manager and Director, where he led several cloud service projects aimed at game developers. Chris became a part of EigenLayer in early 2022.

Source: https://www.linkedin.com/company/eigenl/
The EigenLayer team is expanding rapidly, with a current headcount of over 30 employees, most of whom are based in Seattle, USA.
Founded in 2021 by Dr. Kannan, Layr Labs is the parent company behind EigenLayer.
However, Layr Labs is not a one-trick pony; its portfolio includes two other projects, EigenDA and Babylon. The latter also offers cryptoeconomic security services but primarily caters to the specific needs of the Cosmos ecosystem.
Funding Details
EigenLayer has conducted two public funding rounds to date. In 2022, they raised $14.5 million in a seed round (valuation undisclosed), followed by a Series A round in March 2023 where they secured $50 million at a valuation of $500 million.

Some of the well-known VCs are listed below:

Source: rootdata.com
Over the same period in 2023, Layr Labs also completed an equity financing round, raising close to $64.48 million. Detailed information can be found in the SEC Filing Document.
Market Size, Narrative, and Challenges of the Restaking Business
The Projected Market Size of the Restaking Business
EigenLayer has introduced the innovative concept of restaking, offering “cryptoeconomic security as a service.” Catering primarily to middleware such as oracles, bridges, and DA layers, EigenLayer also extends its offering to side chains, application chains, and Rollups. The primary pain point it addresses is the high security cost borne by decentralized networks, sparing individual projects from having to build their own trust networks.
Theoretically, any project that relies on staked tokens and game-theoretic principles to sustain consensus in a decentralized network is a potential client for EigenLayer’s services. It is challenging to precisely estimate the market size of the restaking business, but optimistically it might evolve into a market worth tens of billions of dollars within the next three years.
As of August 30, 2023, total staked ETH had reached a remarkable $42 billion, against a circulating market cap hovering around $200 billion and a total on-chain value under Ethereum estimated at $300-400 billion. Given that EigenLayer’s primary future clients will likely be smaller and newer projects than Ethereum itself, which holds a dominant position with roughly $40 billion in staking scale, EigenLayer’s total staking amount might range between $10 billion and $100 billion in the short term.
The Narratives Fueling EigenLayer’s Operations and Growth
Demand Side
With the integration of the much-anticipated Cancun upgrade and the open-sourcing of the OP Stack, smaller Rollups and application chains are developing rapidly, increasing the overall demand for cost-effective AVS.
The modularization trend in public chains, Rollups, and application chains has triggered a need for affordable DA layers that operate outside the Ethereum ecosystem. The strategic growth of EigenDA thus becomes instrumental in fueling demand for EigenLayer, showcasing a synergistic effect where the growth of one bolsters the expansion of the other.
Supply Side
Ethereum’s rising staking ratio and its expanding community of stakers have brought forth an abundance of LSD assets along with an extensive user base. These stakers are perpetually in pursuit of avenues to improve the capital efficiency and yield of their LSD holdings. In the future, EigenLayer also hopes to broaden its LSD offering beyond just ETH.
Issues and Challenges
For AVS demanders, one lingering ambiguity is how much they actually save by sourcing collateral assets and specialized validation node services through EigenLayer. The assumption that leveraging ETH LSDs as collateral automatically confers Ethereum’s massive security infrastructure, valued at tens of billions, doesn’t hold up to scrutiny. In essence, a project’s economic security is determined by the cumulative magnitude of the ETH LSDs it sources and the proficiency of the validation nodes’ operations. While EigenLayer offers a faster path than building an AVS from scratch, the tangible cost savings might fall short of expectations.
Introducing external assets as collateral for AVS might diminish the utility of a project’s native token. While EigenLayer offers a hybrid staking mechanism that integrates a project’s own tokens with EigenLayer restaking, this approach may still encounter notable hesitancy in its adoption.
Projects might become over-dependent on EigenLayer for constructing their AVS, leading to a passive long-term development strategy and potential “bottlenecks” in the future. As these projects grow and evolve, a strategic shift towards leveraging their own tokens as the primary security collateral seems likely.
When projects use LSDs as security collateral, they also need to consider the inherent credit and safety risks associated with the LSD platform, adding another layer of risk.
The Competitive Landscape
“Restaking” is a relatively new concept, pioneered by EigenLayer. Currently, the market landscape is relatively sparse when it comes to competitors imitating this model. However, the main challenge is for prospective clients to make the strategic decision of whether to invest in building a proprietary security network from the ground up or to delegate this intricate task to EigenLayer. EigenLayer still needs a robust portfolio of client implementations to demonstrate the superiority and efficiency of its solution.
Valuation Derivation
EigenLayer, as a novel business project, lacks clear benchmark projects and valuation metrics. Consequently, our approach hinges on extrapolating its valuation via forecasted annual protocol revenue and the Price-to-Sales (PS) ratio.
Before delving into the formal estimation, we need to make a few assumptions:
EigenLayer’s primary business model revolves around collecting a commission on the security fees paid by AVS service clients. Of the service fee, 90% goes to LSD depositors, 5% to node operators, and EigenLayer retains a 5% commission, a split aligned with Lido’s standards.
AVS service clients pay an average annual security fee equivalent to 10% of the total borrowed LSDs.
The rationale behind this 10% threshold is grounded in prevailing market dynamics. Contemporary mainstream PoS projects bestow annual rewards ranging between 3-8% for PoS stakers. Given the nascent phase of most EigenLayer clients, they’re incentivized at a marginally higher rate. Consequently, our chosen metric of 10% aptly reflects the average security fee ratio.

The reward rate of major Layer1 blockchains
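To make these revenue assumptions concrete, the sketch below applies the assumed 10% annual security fee and the 90/5/5 split to a hypothetical volume of borrowed LSDs; the $3 billion input is illustrative, not a forecast.

```python
def security_fee_split(borrowed_lsd_usd: float,
                       annual_fee_rate: float = 0.10,
                       depositor_share: float = 0.90,
                       operator_share: float = 0.05,
                       protocol_share: float = 0.05) -> dict:
    """Split the annual security fees among LSD depositors, node operators,
    and the protocol, following the assumptions stated above."""
    total_fees = borrowed_lsd_usd * annual_fee_rate
    return {
        "total_security_fees": total_fees,
        "lsd_depositors": total_fees * depositor_share,
        "node_operators": total_fees * operator_share,
        "protocol_commission": total_fees * protocol_share,
    }


# Hypothetical volume: $3B of borrowed LSDs -> $300M in annual fees,
# of which $15M would be retained as protocol commission.
print(security_fee_split(3_000_000_000))
```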
Based on the above assumptions, and considering the volume of borrowed LSDs channeled by EigenLayer with the respective PS, we can delineate specific valuation spectrums. The colored segments represent the valuation ranges we believe are more probable, with greener shades indicating more optimistic forecasts.

I would characterize the overlap of “annual borrowed LSDs with a volume of $2-5 billion and P/S at 20-40x” as a high probability valuation range. The reason is as follows:
At present, the aggregate PoS staking token market capitalization of the top ten public chains is around $73 billion. When we factor in projects like Aptos and Sui, this figure rises to nearly $82 billion. However, a significant portion of the staking for those two projects derives from unreleased tokens held by core teams and institutions, so out of caution I’ve excluded Sui and Aptos. I’ve assumed, albeit somewhat arbitrarily, that EigenLayer’s share of LSDs could account for 2.5%-6.5% of the total PoS staking market, corresponding to roughly $2-5 billion of borrowed LSDs. The viability of this market-share figure is open to interpretation and subject to individual perspectives.
The PS interval of 20x-40x is deeply influenced by Lido’s current 25x PS, a figure based on data as of August 30, 2023, and calculated from the fully diluted market cap. It’s noteworthy that emerging narratives, particularly in their inception stages, often warrant a premium valuation.
Based on the above calculations, a valuation range of $2-10 billion might be reasonable for EigenLayer. For primary investors participating in the project at a midpoint valuation of $5 billion, and taking into account potential constraints linked to token vesting, there might not be much of a safety margin left. If market whispers hold weight, and there exist players eager to buy EigenLayer tokens over the counter at a $20 billion valuation, they should proceed with utmost caution.
While we’ve explored EigenLayer’s potential valuation trajectory, it’s vital to differentiate between the overall project valuation and the intrinsic value of its native token. A spectrum of factors will affect the ability of value capture within the business ecosystem. For example:
What percentage of the protocol’s revenue will be distributed to token holders?
Beyond just buybacks or dividends, does the token have a robust utility in the business to increase its adoption?
Will EigenDA and EigenLayer utilize the same token, providing more scenarios and demand for the token?
If EigenLayer fails to adequately address the first two considerations, the inherent worth of its token may be compromised. Conversely, unforeseen developments related to the third point could bolster the token’s value.
Moreover, as EigenLayer makes its market debut, its valuation will inevitably be swayed by the overarching bullish or bearish sentiments prevalent in the crypto landscape.
The ultimate appraisal? Only the market will tell.
Thoughts on Tokenized Treasury Bond Operations: Finding The Definitive Solution For Mid-Term RWA

By Colin Lee, Researcher at Mint Ventures

In our previous analysis, we pinpointed Treasury RWAs as the sector poised for explosive growth in both market cap and user engagement in the mid term. According to data from rwa.xyz, projects dedicated to tokenized treasury bond assets, excluding the bonds within MakerDAO, have approached a market cap of nearly $700 million, representing growth of around 240% since the beginning of 2023. Additionally, the treasury bond holdings within MakerDAO have rapidly expanded to billions of dollars. This trajectory underscores the swift growth of tokenized treasuries.

Source: https://app.rwa.xyz/treasuries

Given this compelling backdrop, let’s pivot to an in-depth analysis of the dominant tokenized treasury products currently shaping the market.

1. The Strategic Importance of Tokenized Treasury Bonds

In our recent explorations, notably “An Exploration of Risk Free Rate in The Crypto World” and “The Outlook of The On-chain Bond Market,” we delved into the intricacies of benchmark interest rates in the crypto world, alongside the nascent contours of a potential bond market. Within this context, yields derived from Proof of Stake (PoS) protocols on public blockchains can be conceptually aligned with the crypto equivalent of a risk-free rate, laying the groundwork for a burgeoning bond market orbiting these very yields. While the immediate emergence of a crypto-native bond market mirroring the magnitude of its traditional counterpart remains a forward-looking conjecture, the inception of an “on-chain risk-free rate” (often embodied by LSD yields) carries profound implications for investors. This is particularly salient for investors who use public blockchain tokens (e.g., ETH) as their monetary standard, as they can secure low-risk returns even in bear markets. From this perspective, some investment strategies from traditional markets, such as balanced portfolios toggling between stocks and bonds, can transition smoothly to the crypto-native industry.

Tokenized treasury bonds, like LSDs, can enable USDT-denominated investors to implement traditional portfolio strategies once the risk-free rates from traditional financial markets are introduced into the on-chain world. There are several advantages to this:

USDT-denominated investors will have a relatively safe and stable income source even in bear markets. Take the stablecoin market as an example: after the market gradually turned bearish in mid-2021, the overall stablecoin market shrank from $188 billion to less than $130 billion, and this contraction has also impacted overall market liquidity.
It paves the way for the smoother introduction and market assimilation of commingled financial products consisting of stocks and bonds. In traditional markets, such commingled funds are familiar to most investors, and their introduction into the on-chain universe is a catalyst for innovation in DeFi asset management.

Source: https://defillama.com/stablecoins

The most typical example is MakerDAO. Amid a bear market and significant increases in U.S. Treasury yields, MakerDAO expanded its investment portfolio to include U.S. treasury bonds, and its profitability improved substantially after 2023.

Source: https://dune.com/SebVentures/maker—accounting_1

Hence, there is reason to believe that MakerDAO serves as a compelling case study within the DeFi landscape.
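The revenue effect described above boils down to simple interest on the portion of reserves redeployed into treasuries. The sketch below is purely illustrative; the reserve size, allocation ratio, and 5% yield are assumptions rather than MakerDAO’s actual figures.

```python
def annual_rwa_revenue(reserve_usd: float, allocation_ratio: float, treasury_yield: float) -> float:
    """Annual interest earned by allocating part of a stablecoin reserve
    (or a DAO treasury) to tokenized short-term treasury bonds."""
    return reserve_usd * allocation_ratio * treasury_yield


# Hypothetical: $1B of reserves, 50% allocated to T-bills yielding 5% a year.
print(annual_rwa_revenue(1_000_000_000, 0.50, 0.05))  # 25000000.0 -> $25M of annual revenue
```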
After witnessing MakerDAO’s enhanced profitability through its RWA strategy, it’s plausible to anticipate that other DeFi protocols will follow suit, seeking to bolster their financial robustness through more diverse strategies like RWA. Particularly during bear markets, RWA can function as a dependable revenue stream that undergirds a project’s longevity and operational stability.

2. Business Models for Tokenized Treasury Bonds

At present, tokenized treasury bonds have fostered the development of five business models: the Agency Model, the Platform Model, the Service Provider Model, the Proprietary Model, and the Hybrid Model.

The agency model neither directly participates in wrapping underlying assets nor provides KYC services to users. Its primary focus is customer onboarding via crypto-native methodologies, emphasizing business marketing, fund acquisition, and the expansion of the ecosystem and use cases. Projects like TProtocol adopt infrastructure similar to established platforms such as Aave and Compound. These initiatives typically secure liquidity by establishing pooled liquidity: funds aggregated from various users are allocated to a single borrower tasked with acquiring the underlying assets, a prominent example being U.S. treasury bonds.

The platform model — with notable projects like Desmo Labs — offers a gamut of services including on-chain integration, sales, and KYC, while maintaining a strategic distance from wrapping the assets directly. These projects generally offer three categories of services: the tokenization of assets or equities, on-chain verifiable information services, and comprehensive KYC services. In theory, they can assist in wrapping any type of asset/equity from traditional markets, so their focus extends beyond treasury bonds. Operationally, they resemble traditional internet platforms, and thriving in this competitive space depends on the usability of their all-in-one solution and their ability to acquire customers.

The service provider model specializes in the on-chain integration of RWAs, the procurement of assets, and comprehensive asset management, with representatives like Monetalis Group. However, these providers maintain a degree of separation from direct interactions with end users or institutional buyers of treasury bonds.

In the proprietary model, the project team actively sources the corresponding assets and collaborates with external partners to establish the business framework, ensure the risk isolation of the assets, and tokenize the assets/equities. Notable adherents to this paradigm include MakerDAO, the Franklin OnChain U.S. Government Money Fund, and Frax Finance. Compared with the first two models, this one involves a higher degree of complexity in off-chain operations, requiring significant effort in legal navigation, the erection of corporate structures, and the meticulous selection of assets and partners. Despite these demands, the model’s strength lies in its complexity: the underlying assets are relatively controllable, allowing the project team to manage risks proactively.

The hybrid model can combine the four models mentioned above. Projects under this model, such as Fortunafi, Centrifuge, and ARKS Labs, offer a spectrum of services that includes on-chain integration and KYC processes, while maintaining an active role in asset acquisition and presenting direct investment avenues to their clients.
A closer examination of Fortunafi reveals four service categories: (1) Access Capital, which furnishes investees with crucial funding streams; (2) Earn Yield, featuring an array of wrapped assets available for direct investment after completing KYC; (3) Protocol Services, delivering governance, treasury management, and other protocol-centric services; and (4) Whitelabeled Products, providing end-to-end on-chain services for RWAs. It is worth noting that the RWA services offered by these projects are not limited to treasury bonds and could extend to the on-chain wrapping of a diverse range of assets.

Naturally, the landscape of blockchain’s intersection with real-world assets (RWAs) extends beyond the five models detailed above. There are also entities centered primarily on transactional operations, a notable example being platforms like DigiFT; further elaboration on this topic is not provided here.

3. Asset Side: The Underlying Assets and Their Framework

3.1 Underlying Assets

Several underlying assets are currently making waves in the market:

U.S. Treasury ETFs. Pioneers that utilize U.S. Treasury ETFs include Backed Finance, Swarm, MakerDAO, and ARKS Labs, among others. The advantage of this type of underlying asset is its simplicity: the management of the underlying assets, including liquidity and the rolling-over of bonds, is delegated to the ETF’s issuer and manager. Projects adopting these ETFs are thus spared the operational burden and sidestep significant risk issues, which have not notably surfaced so far; the operational risks synonymous with asset management are not a major concern. Instead, these ventures can focus on wrapping the most substantial and liquid assets the market has to offer.
U.S. Treasury Bonds. A distinct set of projects, with noteworthy mentions including OpenEden, TrueFi, and Matrixdock, opts for direct U.S. Treasury Bonds. These projects typically lean towards the shorter end of the maturity spectrum, ensuring an asset liquidity profile that mirrors cash. However, they must engage directly with counterparties and thus inherit the associated asset management risks, so the careful selection of suitable partners is crucial.
A combination of U.S. Treasury Bonds, U.S. Government Agency Bonds, and Repurchase Agreements. Certain projects, such as the Franklin OnChain U.S. Government Money Fund, Superstate Trust, TProtocol, Arca Labs, and Maple Finance, have adopted this diversified portfolio. These projects likewise delegate the management of underlying assets to professional managers, and issues related to the rollover and liquidity of the underlying assets bear directly on the project. At the operational level, problems may arise if the project fails to select a high-quality manager.

3.2 Fee Structure

The diversity in underlying assets, as dissected in the prior section, inherently gives rise to distinct fee structures. Without considering gas fees from on-chain transactions, the primary fee structures are as shown in the following diagram. Since the management of U.S. Treasury Bond ETFs is entrusted to ETF custodians, the primary cost arises from the minting and redemption processes, with fees typically ranging from 0.05% to 0.5%. For the other two types, which involve managing the underlying assets themselves, additional management and transaction fees come into play.
The range of management fees is approximately 0.3% to 0.5%, while transaction fees encompass various aspects like bank transfer fees, and their rates are also around 0.2%. 3.3 The Business Structure The choice of underlying assets inevitably shapes the business structure within the projects. Currently, the market features several categories: 1. Trust: Projects adopting this approach include MakerDAO, among others. Source: https://forum.makerdao.com/t/mip65-clydesdale-governance-framework-setup/16565 This framework operates by having a sponsor allocate assets to a Special Purpose Vehicle (SPV), effectively creating a trust. The sponsor, in turn, gains rights to the trust income, which are then passed on to investors as trust beneficiaries. In the context of MakerDAO’s integration with the U.S. Treasury Bonds, the structure engages diverse roles like managers and auditors. A key participant in this arrangement is Monetalis Group, responsible for managing substantial aspects of the off-chain operations — ranging from the acquisition of underlying assets to regular reporting and the facilitation of on-chain activities. Meanwhile, MakerDAO maintains a significant influence, directing factors such as operational scale and the choice of underlying assets through its established governance protocols. 2. Limited Partnership SPV: Another noteworthy framework being leveraged by blockchain projects like Maple Finance and Matrixdock. This model is distinctive for its proactive engagement in asset selection and liquidity acquisition. The SPV operates as a separate entity, primarily established to pool investor capital for the purpose of asset securitization or acquisition. One of the fundamental reasons for creating an SPV is to insulate stakeholders from bankruptcy risks. It’s essential to note that the trust structure, as previously discussed, is technically a form of an SPV. However, the evolution of SPVs has brought to the table more refined benefits. These include: Streamlining Financial Management: By avoiding the convoluted departmental involvements and ambiguous operational flows common in conventional corporate frameworks, SPVs bring clarity and efficiency to financial processes.Enhancing Transparency: SPVs typically correspond to a single project or asset, which promotes management clarity. In traditional settings, like commercial banks, investors often face barriers in gaining penetrating insights of underlying assets due to limited disclosure. Such information may only be accessible at the management accounting level within the bank. Take individual housing loans as an example; in public financial statements and annual reports, the characteristics of such loans are not detailed, let alone information about individual borrowers. However, if individual housing loans are bundled into an SPV, it’s likely that comprehensive loan data—covering terms, interest rates, collateral details, and loan amounts—would be required for disclosure, offering investors a deeper insight into their investments.Tax and Fee Reduction: SPVs can also be structured to enjoy more favorable tax treatments, depending on the nature of the underlying assets. Source: https://downloads.eth.maple.finance/docs/legal/abe08ded-5d07-42cf-b435-a0d8d8156ca5/Cash_Mngt_T&C.pdf In this business model, two primary layers can be discerned: The User-SPV Interaction: Here, users essentially hold debt obligations against the SPV. 
The security of user returns hinges significantly on the SPV’s punctuality and consistency in meeting its obligations. The SPV-Bank Dynamics: In this layer, SPVs immerse themselves in the treasury bond market while also engaging in repurchase agreements within the commercial banks. In this process, defaults in interbank repurchase agreements potentially pose a more substantial risk compared to direct holdings of U.S. treasury bonds. Beyond these two layers, users stand exposed to an additional risk concerning the SPV itself. ARKS Labs has expanded upon this business structure by nesting smaller SPVs within a more expansive business framework. This nested configuration allows for the scalability of business operations, making it convenient to add new underlying assets in the future. Interestingly, this strategic architecture is very similar to the MakerDAO’s architecture, as  mentioned in our analysis Ramblings on RWAs: Underlying Assets, Business Structure, and Their Evolution in Crypto World. Source: ARKS Labs 3. Lending Platform + SPV: TProtocol stands out for implementing this distinctive business structure. This model diverges from the previously discussed SPV framework primarily in the role and relationship dynamics around the SPV. In the conventional SPV setup, the project team plays an active part in uncovering and wrapping assets. TProtocol breaks from this norm; here, the SPV is independent, tied not to TProtocol but to the originators of the RWA assets. Taking the diagram below as an example, the SPV initiators can vary, spanning a range of institutions. This variability extends to subsequent on-chain service providers and asset intermediaries, infusing TProtocol’s operational structure with a higher degree of flexibility. However, this flexibility isn’t without its trade-offs. As partners increase, the control over the SPV’s subsequent management, including the ability to inspect and manage service providers, may decrease to some extent. 4. The On-chainization of Fund Shares: Similar to traditional fund purchases, it requires detailed information about the purchasers, aligned with their respective blockchain addresses. Franklin OnChain U.S. Government Money Fund has adopted such a business structure. Projects of this type are more akin to what has often been called ‘blockchainization,’ where the project team brings offline assets and purchaser information onto the blockchain. Future transaction information will also be recorded on the blockchain in a ledger-like method. While the RWA sector is in its early phase, current user demand and capital requirements for such business structures remain moderate. Nonetheless, as the intrinsic value of treasury bond-backed RWAs gradually gains recognition from investors, the scalability of these structures is thrust into the spotlight. The ability to seamlessly integrate new assets and engage additional off-chain service providers could be a determining factor during the rapid development phase of the sector. 4. User End: KYC and Other Requirements The diversity in underlying assets and operational structures inevitably leads to different prerequisites at the user end. Presently, these can be broadly segmented into three aspects: 1. Minimum Investment Threshold: Projects like MakerDAO, ARKS Labs, and TProtocol do not impose a minimum investment amount limit on users. However, projects like Maple Finance, TrueFi, Arca Labs, Backed Finance, and others have set explicit minimum investment amount limits. 
The ‘no-limit’ approach aligns more with the habits of current DeFi users, while projects with a minimum investment amount exceeding $100,000 target high-net-worth individuals primarily. 2. KYC Requirements: Based on the complexity of KYC, it can be categorized into three types: no-KYC projects, such as Flux Finance, ARKS Labs, and TProtocol; basic KYC, like Desmo Labs, where users only need to upload documentation such as passport details; strict KYC, such as OpenEden, Ondo Finance, Maple Finance, Matrixdock, etc., requiring KYC information on par with the traditional financial industry. Elevated KYC requirements not only signify higher thresholds in the traditional financial sector but are also less palatable for DeFi users. 3. Other Requirements: Certain projects impose geographical constraints on their clientele, limiting their services to non-U.S. participants or barring users from specific jurisdictions like the U.S., Singapore, or Hong Kong, typically employing IP filtering mechanisms. It’s noteworthy that many projects delegate KYC verifications and region-specific restrictions to specialized third-party agencies, thereby abstaining from direct involvement in the KYC verification process. 5. Revenue Allocation Strategy and Composability 5.1 Revenue Allocation Strategy Currently, profit-sharing strategies predominantly fall into two primary categories: The first and most common strategy involves direct profit allocation through debt obligation relationships. Here, users, whether they’re holding debt obligations in an SPV or accessing treasury bonds — through ETFs or other instruments — stand to receive the lion’s share of the yields these bonds generate. Users can expect to receive approximately a net profit of around 4%, excluding profits earned in minting and burning processes and profits earned by agencies. This revenue allocation method bears striking resemblance to LSD projects: the bulk of the staking rewards are funneled back to the users, with a nominal percentage appropriated as fees. The second strategy is currently exclusive to the MakerDAO project, employing a saving rate mechanism. Since user funds do not directly correlate with underlying assets, MakerDAO utilizes an approach similar to traditional banking’s interest rate spread model. On the asset side, capital is channeled into higher-yield ventures like RWAs. Conversely, on the liability side, user returns are governed by the Dai Savings Rate (DSR). The DSR hasn’t been static; it has experienced four noteworthy recalibrations to date: 1) An escalation from 1% to 3.49%. 2) A reduction to 3.19% from 3.49%. 3) A significant leap to 8% from 3.19%. 4) A moderation to 5% from the peak of 8%. This strategy has granted team members greater flexibility. However, its drawbacks may become evident. Users are thrust into a scenario with no concrete predictive framework for anticipating future returns. In the case of RWA backed by the U.S. Treasury Bonds, users might reasonably expect yields akin to those of the bonds themselves. However, due to monetary policy adjustments, such as the recent decision by MakerDAO to distribute excess earnings to depositors, rates have surged to 8%. In the future, if the number of depositors increases significantly, yields could fall back closer to U.S. Treasury bond rates. Such fluctuations may not be favorable for investors seeking stable yield levels. For treasury bond-backed RWAs, having a precise and predictable yield is of utmost importance.  
In this context, the first distribution strategy may be preferred over the second. However, should a project following the second strategy stabilize its yield, aligning closely with treasury bond rates, the differential in investor returns between the two methodologies could become negligible. 5.2 Composability Composability within the RWA tokens, specifically those backed by treasury bonds, encounters fragmentation, primarily due to prevailing KYC requirements: Several projects, particularly those enforcing strict KYC protocols — including Ondo Finance, Matrixdock, Franklin OnChain U.S. Government Money Fund, among others — impose allowlist constraints on address accessibility. This regulatory measure dictates that, despite the existence of pertinent token liquidity pools, transactional activities are permissible solely under the conditions of allow-listing and user accreditation.  Relevant token liquidity pools are present on-chain, unrestricted trading remains an unattainable luxury without requisite access permissions. Unless they can significantly expand the scale of the underlying assets, these projects face challenges in gaining support from various DeFi projects for enhanced composability. Conversely, projects without KYC requirements currently do not face composability challenges. The only limitations to composability for these projects are their business resources, prowess in business expansion, and the overall magnitude of the project itself. 6. Summary Analyzing the landscape of RWA projects centered on tokenized treasury bonds, certain business models emerge as potentially successful strategies in the short to medium term: Underlying Assets: Choosing U.S. Treasury Bond ETFs as the underlying asset appears to be a judicious move, delegating the intricacies of liquidity management to stalwarts of the traditional financial arena. For projects venturing into the direct acquisition of U.S. Treasury bonds or a blend of assets, the prowess in forging alliances with adept partners will be under scrutiny. Business Structure: Utilizing established and adaptable models is preferable, especially those with solid scalability, to facilitate rapid expansion in scale and integrate new asset classes in the future. User Base: In the medium to short term, projects with no KYC requirements and minimum investment thresholds tend to attract a broader user base. Moving forward, should regulatory bodies mandate KYC procedures, the adoption of a more streamlined, or basic KYC process could gain prominence as a mainstream solution.  Yield Distribution: In order to provide investors in the U.S. Treasury Bond RWAs with a more stable and assured expectation of yield, the optimal solution is for the project to provide users with a yield that is consistent with the ratio of Treasury yields. Composability: Until regulatory restrictions limit access to on-chain treasury bond RWAs, expanding the utility of RWA tokens held by users becomes an essential strategy. Projects must focus on this aspect to spur substantial business traction in the medium to long term. Peering into the longer horizon, as regulatory frameworks become more pronounced and intricate, projects that can efficiently navigate and comply with basic KYC protocols may find themselves at an advantageous point, poised for sustained success.

Thoughts on Tokenized Treasury Bond Operations: Finding The Definitive Solution For Mid-Term RWA

By Colin Lee, Researcher at Mint Ventures
In our previous analysis, we pinpointed treasury RWAs as the sector poised for explosive growth in both market cap and user engagement over the mid term. According to data from rwa.xyz, projects dedicated to tokenized treasury bond assets, excluding the bonds held within MakerDAO, have approached a combined market cap of nearly $700 million, a growth of around 240% since the beginning of 2023. Additionally, MakerDAO's own treasury bond holdings have rapidly expanded to billions of dollars. This trajectory underscores the swift growth of tokenized treasuries.
Source: https://app.rwa.xyz/treasuries
Given this compelling backdrop, let’s pivot to an in-depth analysis of the dominant tokenized treasury products currently shaping the market.
1. The Strategic Importance of Tokenized Treasury Bonds
In our recent explorations, notably “An Exploration of Risk Free Rate in The Crypto World” and “The Outlook of The On-chain Bond Market,” we delved into the intricacies of benchmark interest rates in the crypto world, alongside the nascent contours of a potential bond market. Within this context, yields derived from Proof of Stake (PoS) protocols on public blockchains can be conceptually aligned with the crypto equivalent of a risk-free rate, laying the groundwork for a burgeoning bond market orbiting these very yields.
While the immediate emergence of a crypto-native bond market mirroring the magnitude of its traditional counterpart remains a forward-looking conjecture, the inception of an 'on-chain risk-free rate,' delivered largely through liquid staking derivatives (LSD), carries profound implications for investors. This is particularly salient for investors who use public blockchain tokens (e.g., ETH) as their monetary standard, as they can now secure low-risk returns even in bear markets. From this perspective, some investment strategies from traditional markets can smoothly transition to the crypto-native industry, such as balanced portfolios toggling between stocks and bonds.
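To make the portfolio analogy concrete, here is a minimal sketch of a stock/bond-style rebalancing strategy that toggles between a volatile crypto asset and a tokenized T-bill position. It is illustrative only: the 60/40 split, the ~4% T-bill yield, and the return path are hypothetical assumptions, not recommendations or figures from any project discussed here.

```python
# Illustrative sketch: a 60/40-style rebalance between a volatile crypto asset
# and a tokenized T-bill position. All numbers are hypothetical.

def rebalance(crypto_value: float, tbill_value: float, target_crypto_weight: float = 0.6) -> float:
    """Return the trade (positive = buy crypto) needed to restore the target weight."""
    total = crypto_value + tbill_value
    return total * target_crypto_weight - crypto_value

def simulate(years: int, crypto_returns: list[float], tbill_yield: float = 0.04) -> float:
    crypto, tbill = 60.0, 40.0  # starting allocation, e.g. in thousands of USDT
    for r in crypto_returns[:years]:
        crypto *= (1 + r)           # crypto leg follows the market
        tbill *= (1 + tbill_yield)  # T-bill leg accrues a steady ~4% yield
        trade = rebalance(crypto, tbill)
        crypto += trade
        tbill -= trade
    return crypto + tbill

# Example: a bear year (-50%) followed by a recovery year (+80%)
print(round(simulate(2, [-0.5, 0.8]), 1))
```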
Tokenized treasury bonds, like LSDs, enable USDT-denominated investors to implement traditional portfolio strategies once the risk-free rates of traditional financial markets are introduced into the on-chain world. There are several advantages to this:
USDT-denominated investors gain a relatively safe and stable income source even in bear markets. Take the stablecoin market as an example: after the market gradually turned bearish from mid-2021, the overall stablecoin market has shrunk from $188 billion to less than $130 billion, and this contraction has also weighed on overall market liquidity.
It paves the way for the smoother introduction and market assimilation of commingled financial products consisting of stocks and bonds. In traditional markets, these commingled funds are familiar to most investors, and their introduction into the on-chain universe is a catalyst for innovation in DeFi asset management.
Source: https://defillama.com/stablecoins
The most typical example is MakerDAO. Amid the bear market and the sharp rise in U.S. Treasury yields, MakerDAO expanded its investment portfolio to include U.S. treasury bonds, and its profitability improved substantially from 2023 onward.
Source: https://dune.com/SebVentures/maker—accounting_1
Hence, there is reason to believe that MakerDAO serves as a compelling case study within the DeFi landscape. After witnessing MakerDAO’s enhanced profitability through its RWA strategy, it’s plausible to anticipate that other DeFi protocols will follow suit, seeking to bolster their financial robustness through more diverse strategies like RWA. Particularly during bear markets, RWA can function as a dependable and robust revenue stream that undergirds a project’s longevity and operational stability.
2. Business Model for Tokenized Treasury Bonds
At present, tokenized treasury bonds have fostered the development of five business models: the Agency Model, Platform Model, Service Provider Model, Proprietary Model, and the Hybrid Model.
The agency model neither directly participates in wrapping underlying assets nor provides KYC services to users. Its primary focus is customer onboarding via crypto-native methods, emphasizing business marketing, fund acquisition, and the expansion of the ecosystem and use cases. Projects like TProtocol adopt infrastructure similar to that of established platforms such as Aave and Compound: they secure liquidity by establishing pooled capital, aggregating funds from various users and allocating them to a single borrower tasked with acquiring the underlying assets, a prominent example being U.S. treasury bonds.
The platform model — with notable projects like Desmo Labs — offers a gamut of services including on-chain integration, sales, and KYC. However, it maintains a strategic distance from directly wrapping assets. These projects generally offer three categories of services: the tokenization of assets or equities, on-chain verifiable information services, and comprehensive KYC services. In theory, they can assist in wrapping any type of asset or equity from traditional markets, so their focus extends beyond treasury bonds. Operationally, they resonate more with traditional internet platforms, and thriving in this competitive space depends on the usability of their all-in-one solutions and their ability to acquire customers.
The Service Provider Model specializes in the on-chain integration of RWAs, procurement of assets, and comprehensive asset management, with representatives like Monetalis Group. However, they maintain a degree of separation from direct interactions with end-users or institutional buyers of treasury bonds. 
In the proprietary model, the project team actively sources the corresponding assets, collaborates with external partners to establish the business framework, ensures the risk isolation of the assets, and tokenizes the assets or equities itself. Notable adherents to this paradigm include MakerDAO, Franklin OnChain U.S. Government Money Fund, and Frax Finance. Compared with the first two models, this model involves a higher degree of complexity in off-chain operations, requiring significant effort in legal navigation, the establishment of corporate structures, and the careful selection of assets and partners. Despite these demands, the model's strength lies in that very complexity: the underlying assets are relatively controllable, allowing the project team to manage risks proactively.
The hybrid model combines elements of the four models mentioned above. Projects under this model, such as Fortunafi, Centrifuge, and ARKS Labs, offer a spectrum of services that include on-chain integration and KYC processes, whilst maintaining an active role in asset acquisition and presenting direct investment avenues to their clients. A closer examination of Fortunafi reveals four service categories: (1) Access Capital, which furnishes investees with crucial funding streams; (2) Earn Yield, featuring an array of wrapped assets available for direct investment after completing KYC; (3) Protocol Services, delivering governance, treasury management, and additional protocol-centric services; and (4) Whitelabeled products, providing end-to-end on-chain services for RWA. It is worth noting that the RWA services offered by these projects are not limited to treasury bonds and can potentially extend to the on-chain wrapping of a diverse range of assets.
Naturally, the landscape of blockchain’s intersection with real-world assets (RWAs) extends beyond the five detailed models. There exist entities primarily centered around transactional operations, a notable example being platforms like DigiFT. Further elaboration on this topic is not provided here.

3. Asset Side: The Underlying Assets and Their Framework
3.1 Underlying Assets 
Currently, several types of underlying assets are making waves in the market:
U.S. Treasury ETFs. Pioneers utilizing U.S. Treasury ETFs include Backed Finance, Swarm, MakerDAO, and ARKS Labs, among others. The advantage of this type of underlying asset is its simplicity: the management of the underlying assets, including liquidity and the rolling-over of bonds, is delegated to the issuer and manager of the ETF. Consequently, projects adopting these ETFs are spared the operational rigmarole and sidestep risk issues that, so far, have not notably surfaced; the operational risks synonymous with asset management are therefore not a significant concern, and these ventures can focus on wrapping the most substantial and liquid assets the market has to offer.
U.S. Treasury Bonds. A distinct set of projects, with noteworthy mentions including OpenEden, TrueFi, and Matrixdock, opt for direct holdings of U.S. Treasury bonds. These projects typically lean towards the shorter end of the maturity spectrum, ensuring an asset liquidity profile that mirrors cash. However, they must engage directly with counterparties and thus inherit the associated asset management risks, making the careful selection of suitable partners crucial.
A combination of U.S. Treasury Bonds, U.S. Government Agency Bonds, and Repurchase Agreements. Certain projects, such as Franklin OnChain U.S. Government Money Fund, Superstate Trust, TProtocol, Arca Labs, and Maple Finance, have adopted this diversified portfolio. These projects likewise delegate the management of the underlying assets to professional managers, yet issues related to the rollover and liquidity of those assets bear directly on the project; at the operational level, failing to select a high-quality manager can create problems.
3.2 Fee Structure
The diversity in underlying assets, as dissected in the prior section, inherently gives rise to distinct fee structures. Without considering gas fees resulting from on-chain transactions, the primary fee structures are as shown in the following diagram:

Since the management of U.S. Treasury Bond ETFs is entrusted to the ETF custodians, the primary cost arises from the minting and redemption processes, with fees typically ranging from 0.05% to 0.5%. The other two types involve managing the underlying assets directly, so additional management and transaction fees come into play: management fees run approximately 0.3% to 0.5%, while transaction fees, which cover items such as bank transfer charges, are around 0.2%.
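As a rough illustration of how these fee layers affect what reaches the end investor, the sketch below computes a one-year net yield under the two setups described above. The gross 5% yield and the specific fee levels are hypothetical figures taken from the ranges in this section, not any project's published fee schedule.

```python
# Illustrative sketch: net yield on a tokenized treasury position after fees.
# Fee levels are hypothetical, taken from the ranges discussed above.

def net_yield(gross_yield: float,
              mint_redeem_fee: float = 0.002,   # charged on entry and exit combined
              management_fee: float = 0.004,    # annual, for directly managed structures
              transaction_fee: float = 0.002) -> float:
    """Approximate one-year net yield to the investor."""
    return gross_yield - mint_redeem_fee - management_fee - transaction_fee

# ETF-based wrapper: mostly mint/redeem fees, no separate management layer
print(f"ETF wrapper:     {net_yield(0.05, mint_redeem_fee=0.003, management_fee=0.0, transaction_fee=0.0):.2%}")
# Directly managed bonds: management and transaction fees on top
print(f"Direct holdings: {net_yield(0.05):.2%}")
```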
3.3 The Business Structure
The choice of underlying assets inevitably shapes the business structure within the projects. Currently, the market features several categories:
1. Trust: Projects adopting this approach include MakerDAO, among others.
Source: https://forum.makerdao.com/t/mip65-clydesdale-governance-framework-setup/16565
This framework operates by having a sponsor allocate assets to a Special Purpose Vehicle (SPV), effectively creating a trust. The sponsor, in turn, gains rights to the trust income, which are then passed on to investors as trust beneficiaries. In the context of MakerDAO’s integration with the U.S. Treasury Bonds, the structure engages diverse roles like managers and auditors. A key participant in this arrangement is Monetalis Group, responsible for managing substantial aspects of the off-chain operations — ranging from the acquisition of underlying assets to regular reporting and the facilitation of on-chain activities. Meanwhile, MakerDAO maintains a significant influence, directing factors such as operational scale and the choice of underlying assets through its established governance protocols.
2. Limited Partnership SPV: This is another noteworthy framework, leveraged by projects like Maple Finance and Matrixdock. The model is distinctive for its proactive engagement in asset selection and liquidity acquisition.
The SPV operates as a separate entity, primarily established to pool investor capital for the purpose of asset securitization or acquisition. One of the fundamental reasons for creating an SPV is to insulate stakeholders from bankruptcy risks. It’s essential to note that the trust structure, as previously discussed, is technically a form of an SPV. However, the evolution of SPVs has brought to the table more refined benefits. These include:
Streamlining Financial Management: By avoiding the convoluted departmental involvement and ambiguous operational flows common in conventional corporate frameworks, SPVs bring clarity and efficiency to financial processes.
Enhancing Transparency: SPVs typically correspond to a single project or asset, which promotes management clarity. In traditional settings, such as commercial banks, investors often face barriers to gaining penetrating insight into underlying assets due to limited disclosure; such information may only be accessible at the management accounting level within the bank. Take individual housing loans as an example: public financial statements and annual reports do not detail the characteristics of such loans, let alone information about individual borrowers. If individual housing loans are instead bundled into an SPV, comprehensive loan data—covering terms, interest rates, collateral details, and loan amounts—would likely be required for disclosure, offering investors deeper insight into their investments.
Tax and Fee Reduction: SPVs can also be structured to enjoy more favorable tax treatment, depending on the nature of the underlying assets.
Source: https://downloads.eth.maple.finance/docs/legal/abe08ded-5d07-42cf-b435-a0d8d8156ca5/Cash_Mngt_T&C.pdf
In this business model, two primary layers can be discerned:
The User-SPV Interaction: Here, users essentially hold debt obligations against the SPV. The security of user returns hinges significantly on the SPV's punctuality and consistency in meeting its obligations.
The SPV-Bank Dynamics: In this layer, SPVs immerse themselves in the treasury bond market while also engaging in repurchase agreements with commercial banks. In this process, defaults on interbank repurchase agreements potentially pose a more substantial risk than direct holdings of U.S. treasury bonds.
Beyond these two layers, users stand exposed to an additional risk concerning the SPV itself.
ARKS Labs has expanded upon this business structure by nesting smaller SPVs within a more expansive business framework. This nested configuration allows for the scalability of business operations, making it convenient to add new underlying assets in the future. Interestingly, this strategic architecture is very similar to MakerDAO's, as mentioned in our analysis "Ramblings on RWAs: Underlying Assets, Business Structure, and Their Evolution in Crypto World".
Source: ARKS Labs
3. Lending Platform + SPV: TProtocol stands out for implementing this distinctive business structure. This model diverges from the previously discussed SPV framework primarily in the role and relationship dynamics around the SPV. In the conventional SPV setup, the project team plays an active part in uncovering and wrapping assets. TProtocol breaks from this norm; here, the SPV is independent, tied not to TProtocol but to the originators of the RWA assets.
Taking the diagram below as an example, the SPV initiators can vary, spanning a range of institutions. This variability extends to subsequent on-chain service providers and asset intermediaries, infusing TProtocol’s operational structure with a higher degree of flexibility. However, this flexibility isn’t without its trade-offs. As partners increase, the control over the SPV’s subsequent management, including the ability to inspect and manage service providers, may decrease to some extent.

4. The On-chainization of Fund Shares: Similar to traditional fund purchases, this structure requires detailed information about purchasers, mapped to their respective blockchain addresses. Franklin OnChain U.S. Government Money Fund has adopted such a business structure. Projects of this type are closer to what has often been called 'blockchainization,' where the project team brings offline assets and purchaser information onto the blockchain, with future transaction information also recorded on-chain in a ledger-like manner.
While the RWA sector is in its early phase, current user demand and capital requirements for such business structures remain moderate. Nonetheless, as the intrinsic value of treasury bond-backed RWAs gradually gains recognition from investors, the scalability of these structures is thrust into the spotlight. The ability to seamlessly integrate new assets and engage additional off-chain service providers could be a determining factor during the rapid development phase of the sector.
4. User End: KYC and Other Requirements
The diversity in underlying assets and operational structures inevitably leads to different prerequisites at the user end. Presently, these can be broadly segmented into three aspects:
1. Minimum Investment Threshold: Projects like MakerDAO, ARKS Labs, and TProtocol do not impose a minimum investment amount limit on users. However, projects like Maple Finance, TrueFi, Arca Labs, Backed Finance, and others have set explicit minimum investment amount limits. The ‘no-limit’ approach aligns more with the habits of current DeFi users, while projects with a minimum investment amount exceeding $100,000 target high-net-worth individuals primarily.
2. KYC Requirements: Based on the complexity of KYC, it can be categorized into three types: no-KYC projects, such as Flux Finance, ARKS Labs, and TProtocol; basic KYC, like Desmo Labs, where users only need to upload documentation such as passport details; strict KYC, such as OpenEden, Ondo Finance, Maple Finance, Matrixdock, etc., requiring KYC information on par with the traditional financial industry. Elevated KYC requirements not only signify higher thresholds in the traditional financial sector but are also less palatable for DeFi users.
3. Other Requirements: Certain projects impose geographical constraints on their clientele, limiting their services to non-U.S. participants or barring users from specific jurisdictions like the U.S., Singapore, or Hong Kong, typically employing IP filtering mechanisms.
It’s noteworthy that many projects delegate KYC verifications and region-specific restrictions to specialized third-party agencies, thereby abstaining from direct involvement in the KYC verification process.
5. Revenue Allocation Strategy and Composability
5.1 Revenue Allocation Strategy
Currently, profit-sharing strategies predominantly fall into two primary categories:
The first and most common strategy involves direct profit allocation through debt obligation relationships. Here, users, whether they hold debt obligations against an SPV or access treasury bonds through ETFs or other instruments, stand to receive the lion's share of the yields these bonds generate. After deducting the fees taken in the minting and burning processes and the margins earned by agencies, users can expect a net yield of around 4%.
This revenue allocation method bears striking resemblance to LSD projects: the bulk of the staking rewards are funneled back to the users, with a nominal percentage appropriated as fees.
The second strategy is currently exclusive to the MakerDAO project, employing a saving rate mechanism. Since user funds do not directly correlate with underlying assets, MakerDAO utilizes an approach similar to traditional banking’s interest rate spread model. On the asset side, capital is channeled into higher-yield ventures like RWAs. Conversely, on the liability side, user returns are governed by the Dai Savings Rate (DSR). The DSR hasn’t been static; it has experienced four noteworthy recalibrations to date: 1) An escalation from 1% to 3.49%. 2) A reduction to 3.19% from 3.49%. 3) A significant leap to 8% from 3.19%. 4) A moderation to 5% from the peak of 8%.
This strategy has granted the team greater flexibility, but its drawbacks may become evident: users have no concrete framework for anticipating future returns. In the case of RWAs backed by U.S. Treasury Bonds, users might reasonably expect yields akin to those of the bonds themselves. However, due to policy adjustments, such as the recent decision by MakerDAO to distribute excess earnings to depositors, the rate has surged to 8%. In the future, if the number of depositors increases significantly, yields could fall back closer to U.S. Treasury bond rates. Such fluctuations may not be favorable for investors seeking stable yield levels.
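The dilution effect described above can be sketched as follows. This is a simplified model with hypothetical figures; the actual DSR is set through Maker governance rather than by any formula, so the sketch only illustrates why an enhanced rate tends to fall back toward Treasury yields as deposits grow.

```python
# Simplified sketch of a savings-rate spread model with hypothetical numbers.
# The actual DSR is set by governance; this only illustrates the dilution effect.

def sustainable_rate(rwa_holdings: float, rwa_yield: float,
                     dsr_deposits: float, rate_cap: float = 0.08) -> float:
    """Highest rate payable to depositors from RWA income alone, capped by governance."""
    if dsr_deposits == 0:
        return rate_cap
    return min(rate_cap, rwa_holdings * rwa_yield / dsr_deposits)

rwa = 2_000_000_000  # hypothetical $2B of treasury-backed RWA earning ~5%
for deposits in (1e9, 2e9, 4e9):
    print(f"DSR deposits ${deposits/1e9:.0f}B -> sustainable rate "
          f"{sustainable_rate(rwa, 0.05, deposits):.2%}")
```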
For treasury bond-backed RWAs, having a precise and predictable yield is of utmost importance.  In this context, the first distribution strategy may be preferred over the second. However, should a project following the second strategy stabilize its yield, aligning closely with treasury bond rates, the differential in investor returns between the two methodologies could become negligible.
5.2 Composability
Composability within the RWA tokens, specifically those backed by treasury bonds, encounters fragmentation, primarily due to prevailing KYC requirements:
Several projects, particularly those enforcing strict KYC protocols — including Ondo Finance, Matrixdock, Franklin OnChain U.S. Government Money Fund, among others — impose allowlist constraints on address accessibility. This regulatory measure dictates that, despite the existence of pertinent token liquidity pools, transactional activities are permissible solely under the conditions of allow-listing and user accreditation. 
Even where the relevant token liquidity pools exist on-chain, unrestricted trading remains an unattainable luxury without the requisite access permissions.
Unless they can significantly expand the scale of the underlying assets, these projects face challenges in gaining support from various DeFi projects for enhanced composability.
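To make the composability constraint concrete, here is a minimal sketch of allowlist-gated transfer logic, written as plain Python rather than an actual token contract of any project named above. It shows why a permissioned RWA token cannot simply be deposited into an arbitrary, non-allowlisted DeFi pool.

```python
# Minimal sketch of allowlist-gated transfers; illustrative only, not any
# specific project's token contract.

class AllowlistedRWAToken:
    def __init__(self, allowlist: set[str]):
        self.allowlist = allowlist            # KYC-approved addresses
        self.balances: dict[str, int] = {}

    def mint(self, to: str, amount: int) -> None:
        assert to in self.allowlist, "recipient not KYC-approved"
        self.balances[to] = self.balances.get(to, 0) + amount

    def transfer(self, sender: str, recipient: str, amount: int) -> None:
        # Both sides must be allowlisted, so a permissionless AMM pool contract
        # that was never KYC-approved cannot receive the token at all.
        assert sender in self.allowlist and recipient in self.allowlist, "transfer blocked"
        assert self.balances.get(sender, 0) >= amount, "insufficient balance"
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

token = AllowlistedRWAToken(allowlist={"alice", "approved_pool"})
token.mint("alice", 100)
token.transfer("alice", "approved_pool", 50)           # works: pool is allowlisted
try:
    token.transfer("alice", "permissionless_amm", 10)  # blocked: not allowlisted
except AssertionError as e:
    print(e)
```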
Conversely, projects without KYC requirements currently do not face composability challenges. The only limitations to composability for these projects are their business resources, prowess in business expansion, and the overall magnitude of the project itself.
6. Summary
Analyzing the landscape of RWA projects centered on tokenized treasury bonds, certain business models emerge as potentially successful strategies in the short to medium term:
Underlying Assets: Choosing U.S. Treasury Bond ETFs as the underlying asset appears to be a judicious move, delegating the intricacies of liquidity management to stalwarts of the traditional financial arena. For projects venturing into the direct acquisition of U.S. Treasury bonds or a blend of assets, the prowess in forging alliances with adept partners will be under scrutiny.
Business Structure: Utilizing established and adaptable models is preferable, especially those with solid scalability, to facilitate rapid expansion in scale and integrate new asset classes in the future.
User Base: In the short to medium term, projects with no KYC requirements and no minimum investment thresholds tend to attract a broader user base. Moving forward, should regulatory bodies mandate KYC procedures, the adoption of a more streamlined, basic KYC process could gain prominence as a mainstream solution.
Yield Distribution: In order to provide investors in U.S. Treasury bond RWAs with a more stable and assured yield expectation, the optimal approach is for the project to pass through to users a yield that closely tracks Treasury yields.
Composability: Until regulatory restrictions limit access to on-chain treasury bond RWAs, expanding the utility of RWA tokens held by users becomes an essential strategy. Projects must focus on this aspect to spur substantial business traction in the medium to long term.
Peering into the longer horizon, as regulatory frameworks become more pronounced and intricate, projects that can efficiently navigate and comply with basic KYC protocols may find themselves at an advantageous point, poised for sustained success.
OP vs ARB: Which Stands Out As The Better Investment Target Post-Cancun Upgrade?
By Alex Xu, Research Partner at Mint Ventures
In 2023, the Cancun upgrade has firmly established itself as one of the most pivotal events in the crypto world since the Shanghai upgrade, and the Layer2 projects that stand to gain from it are central to our focus for the year. The Cancun upgrade, centered on EIP-4844, is projected to be implemented between October 2023 and January 2024. Meanwhile, the tokens of the forefront Layer2 solutions, ARB (Arbitrum) and OP (Optimism), underwent retracements after marking fresh highs earlier this year. This suggests that the current climate might still present an opportune window for strategic engagement. Notably, when evaluated by market capitalization, OP has been consistently breaking records since 2023, whereas ARB has largely been trading in a consolidation phase at lower levels.
Through this article, we aim to unpack the following aspects:
The inherent value and business model behind Layer2.
OP vs. ARB: a comprehensive analysis of their competitiveness and key performance indicators.
The transformative impact the Cancun upgrade could have on Layer2 fundamentals.
The potential risks associated with Optimism.
The insights and opinions presented in this article reflect my views as of the publication date, primarily through a business lens with a limited delve into the technical intricacies of Layer 2, and may contain factual inaccuracies or biases. This article is intended for discussion purposes only, and feedback is welcomed.
1. Unpacking the Inherent Value and Business Model of L2
1.1 Source of L2's Value and Its Defensive Moat
Layer 2 (L2) solutions can be seen as the natural evolution of Layer 1 (L1) protocols, echoing their fundamental promise: delivering robust, censorship-resistant, and universally accessible block space. A fitting analogy is to view L2 as a specialized on-chain cloud service. L2's competitive advantage primarily emerges from its economic efficiency; as a case in point, Optimism's gas expenditures are just 1.56% of those on Ethereum.
However, one must recognize the niche nature of "block space as a specialized cloud service": not all online services demand the distinctive functionalities L1 or L2 platforms provide. In a traditional world fraught with restrictions and obscured financial operations, blockchain emerges as a beacon, creating an environment for myriad practical applications. L2's block space valuation is intrinsically tethered to its demand, driven by both service providers and users.
Much like L1, L2 can cultivate a robust defensive moat through the power of network effects. Within the L2 paradigm, as the user base broadens and diversifies, collaboration becomes more frictionless, streamlining interactions within the ecosystem. This not only nurtures path-breaking service innovations but also further expands the user pool, and every newcomer that establishes a presence on the L2 network amplifies the intrinsic value for existing users. In the Web3 landscape, the network effect of L1 & L2 platforms is eclipsed only by stablecoins, with USDT standing as a prime exemplar. The leading L1 & L2 platforms inherently present higher entry barriers and, as a result, often command a more elevated valuation premium.
1.2 Profit Model of Layer2
L2's revenue structure is straightforward.
On one side, L2 secures its data by procuring storage capacity from reliable Data Availability (DA) layers. This serves as an insurance policy, ensuring that if any disruption occurs on L2, the data remains safeguarded and can be readily restored from these backup layers. On the flip side, L2 offers users affordable block space and charges them accordingly. The resultant profit margin is primarily a function of the fees garnered on L2 (base fees plus MEV earnings), offset by the costs remitted to the DA service providers.
Taking Optimism and Arbitrum as illustrative examples, both platforms rely on Ethereum for their Data Availability needs, capitalizing on Ethereum's peerless decentralized stature and its gold-standard reputation in the L1 realm. They pay gas to Ethereum to archive their compressed L2 data within the Ethereum framework. Their revenue stream is predominantly anchored in the gas and MEV fees accrued when their user base—spanning everyday users to sophisticated developers—interacts on their L2 platforms. Deducting these operational costs from revenue gives a clear picture of their gross profit. It's pivotal to emphasize that "gross profit" here does not encompass subsequent project-level financial commitments, such as salaries, ecosystem incentives, promotional activities, and other overheads.
The Function of Sequencers within L2 Operations
Fee collection on L2, as well as the payment of expenses to L1, are both executed by the L2 sequencer, and profits from these operations flow directly to the sequencer. In the current landscape, both Optimism and Arbitrum have their sequencers operated by the official team, with the resulting profits enriching their treasuries. Naturally, a centralized sequencer poses an elevated single-point vulnerability, so both Optimism and Arbitrum have committed in their roadmaps to transition towards a more decentralized sequencer framework.
The decentralization of sequencers is poised to adopt a PoS (Proof of Stake) framework. Under this model, decentralized sequencers would be required to stake native L2 tokens like ARB or OP as collateral; if they fail to uphold their obligations, these staked tokens risk being slashed. Users could either stake on their own as sequencers or leverage staking services provided by entities like Lido. In that arrangement, users supply the staked tokens while specialized, dispersed sequencer operators oversee the sequencing and data-uploading tasks, and staking users earn a substantial portion of the L2 fees and MEV rewards (Lido's model, for reference, earmarks 90% of staking rewards for stakers). As this narrative unfolds, both ARB and OP will be infused with economic utility beyond their foundational governance role.
1.3 ARB VS OP
The Competitive Advantage of Optimism
Since its inception, ARB has consistently outperformed OP on several business-centric metrics, reinforcing its superior positioning in the market. Leveraging the inherent strength of its network effects, Arbitrum seemed poised not just for market dominance but also for commanding a higher valuation premium. However, the trend began to shift following Optimism's introduction of the Superchain strategy in February 2023 and the strong push behind the OP Stack. The OP Stack is an open-source L2 technology suite offering a streamlined solution for emerging projects seeking to leverage L2 capabilities.
By utilizing the OP Stack, these projects can expediently roll out their own personalized L2 solutions, significantly slashing both development and trial costs.
The "Superchain" stands as Optimism's visionary blueprint for the future. By embracing L2s built upon the OP Stack, a uniform technical architecture is achieved, facilitating seamless, highly secure, and high-speed atomic-level communication and interaction of both information and assets across different chains. Drawing parallels with the Cosmos Interchain concept, this framework has been named the "Superchain."
Following the introduction of the OP Stack and Superchain, Coinbase was among the early adopters, unveiling its Layer2 Base chain built on the OP Stack in February; by August 10th, the platform had officially gone live. Coinbase's pioneering approach served as a catalyst, setting the stage for a surge in OP Stack adoption throughout the crypto ecosystem. This ripple effect saw major players like Binance introducing opBNB. Other prominent entrants to the OP Stack community included the Paradigm-backed NFT project ZORA and the Loot ecosystem's Adventure Gold DAO; the Gitcoin-affiliated public-goods endeavor Public Goods Network (PGN), the options trading platform Lyra, and the on-chain analytics provider DeBank also aligned themselves with this trend. Notably, Celo, traditionally an L1 solution, has embraced the OP Stack for its L2 strategy.
Historically, L2 solutions primarily catered to end users, treating block space as a unique domain for their operations. The introduction of the Superchain and OP Stack has revolutionized this perspective, expanding the definition of 'users' to include L2 operators themselves. As a result, L2's domain, traditionally a B2C model (with L2 developers also counted as consumers), has morphed into an inclusive B2B2C framework. This evolutionary step has carved out new value pathways for Optimism and solidified its competitive defenses:
Network effects in the multi-chain era. Expanding the traditional understanding of a "network" from a single chain to a multi-chain ecosystem, seamless integration of funds and information across diverse chains is achieved via the uniform OP Stack. Entrusted with user acquisition and engagement, L2 operators strive to bolster the cumulative user base of this multi-chain ecosystem; as this collective user population grows, the intrinsic value of each L2, and of every individual user within the network, surges.
Economies of scale. Optimism bears the fixed costs of maintaining and updating the OP Stack, while the constant feedback and refinements offered by its diverse user base elevate its overall quality. This cost-efficient strategy not only diminishes the expenses linked to single-chain maintenance and updates, sequencers, and indexing incentives but also amplifies its allure for prospective L2 solution seekers.
Synergistic ecosystem. By weaving major Web3 players into the fabric of the Optimism ecosystem, a shared vision and mutual interests emerge. This alignment paves the way for robust support in areas like technological improvements, user acquisition, developer engagement, and investment drives.
Evolving from a single-chain ecosystem to a cross-chain ecosystem, Optimism not only reaps the advantages of anticipated growth in users and developers across the full spectrum, but its core metrics on the OP Mainnet are also steadily closing in on, and in places surpassing, the once-distant leader, Arbitrum, as shown by the following metrics:
a. Monthly Active Addresses: The ratio of Optimism's monthly active addresses to Arbitrum's has surged from a previous low of 32.1% to a current 73.6%.
Source: tokenterminal
b. Monthly L2 Profits: The L2 profit ratio of Optimism relative to Arbitrum has surged from a mere 16.4% to 100.2%, now surpassing Arbitrum.
Source: tokenterminal
c. Monthly Interaction Counts: The interaction count ratio of Optimism compared to Arbitrum has swelled from a previous 22.4% to 106.5%.
Source: tokenterminal
d. TVL: The TVL ratio of Optimism relative to Arbitrum has grown from a trough of roughly 1/3 to the present level of around 1/2. The TVL on the OP Mainnet was approximately $2 billion in March and has since risen to an estimated $3 billion, while the TVL on Arbitrum was approximately $6 billion in March (reaching a high of about $7 billion) and continues to hover around $6 billion.
Source: https://l2beat.com/
Comparison of Valuations: OP vs. ARB
As Optimism's operational metrics continue their swift ascent, the valuation of the OP Mainnet relative to Arbitrum looks increasingly attractive. P/E ratio (circulating market cap / annualized L2 profit): based on revenue data from the most recent week, Optimism's P/E ratio has moderated to slightly under 80, compared to Arbitrum's 113. This is particularly noteworthy considering OP's strong price performance and the continuous vesting and growth of its circulating supply over the past few months.
Source: tokenterminal
The Vigorous Expansion of the Optimism Ecosystem
While OP Mainnet metrics progressively advance against Arbitrum, driven in part by a rejuvenation within its own ecosystem, it is the contributions from new entrants to the Optimism community that have had the most pronounced impact. A case in point is the list of projects contributing the most transaction volume to the OP Mainnet over the last 30 days, where Gnosis Safe contract operations take the top spot and Worldcoin secures fourth position.
Source: https://dune.com/optimismfnd/Optimism
Intriguingly, a substantial portion of the Gnosis Safe transactions are initiated by the Worldcoin team. By the end of June 2023, World App had deployed in excess of 300,000 Gnosis Safe accounts, a surge primarily attributed to the migration of World App accounts to the OP Mainnet. According to data released on Worldcoin's official site on August 11th, the platform claims a user base exceeding 2.2 million, with 257,000 new accounts added in the previous week alone. World App's daily transaction count reaches an impressive 126,000, constituting approximately 21% of total daily transfers across the OP Mainnet and Arbitrum combined.
Source: https://worldcoin.org/
At present, Worldcoin has migrated its ID system and tokens to the OP Mainnet, with future plans to develop an application chain built on the OP Stack. This strategic move is anticipated to usher in a wave of active users and developers. Beyond Worldcoin, the momentum in the wake of the Base launch has been nothing short of extraordinary.
Coinbase, standing as the inaugural and paramount proponent of the OP Stack, registered a noteworthy 136,000 active addresses on Base as of August 10th. This figure closely mirrors that of the L2 market frontrunner, Arbitrum, which hosts 147,000 addresses.
Source: https://dune.com/tk-research/base
Among all L1 & L2 smart contract platforms, these figures are eclipsed only by Tron (1.5M), BNB Chain (1.04M), Polygon (0.37M), and Arbitrum (0.14M). Interestingly, following the official debut of Base on August 10th, the breakout application wasn't rooted in the conventional domains of DeFi or meme tokens; rather, it was a socially oriented application named friend.tech, adding an unexpected twist to the narrative.
Arbitrum's Dilemma
Arbitrum finds itself in a complex situation owing to its strategic positioning. Although it features a robust L2 mainnet, marked by the stellar performance of Arbitrum One alongside Arbitrum Nova, it concurrently launched its L3 stack, Orbit, setting it in competition with the OP Stack. However, in an environment where L2 is still in its ascendancy, many projects are reluctant to pigeonhole themselves into the L3 category and use Arbitrum One as their primary DA layer. Significantly, projects with substantial industry assets—whether in terms of users, developers, or intellectual property—are often inclined towards building on L2, a preference that translates to higher valuation potential and broader user outreach.
Indeed, the emergence of platforms like AltLayer, offering Rollups-as-a-Service (RaaS) solutions, represents a shift in how smaller projects and developers approach the implementation of rollups. AltLayer offers solutions that simplify the process of building and operating rollups, essentially providing a low-threshold, low-code avenue for integration: it assists users in seamlessly combining the various rollup modules available in the market, allowing for a Lego-like building experience.
The RaaS modular solution provided by AltLayer
Within the diverse array of rollup options that RaaS platforms offer, Arbitrum's Orbit is but one choice among many. Smaller entities, upon evaluating the breadth of available offerings, may opt for more cost-effective L2 solutions rather than constraining themselves within an L3 classification. In this unfolding landscape, despite the marginal lead that Arbitrum One, as a standalone L2, holds over its Layer 2 counterparts in transaction volume, it is experiencing a swift decline in market share, predominantly because both new and existing users are transitioning to Optimism-aligned and other Layer 2 ecosystems.
Overall, Optimism leverages its open-source L2 toolkit to create a network effect, attracting users via a B2B2C model. This approach appears to hold a competitive advantage over Arbitrum's solid but single-chain methodology. If Arbitrum doesn't reassess and adjust its strategic direction promptly, its position as the preeminent single-chain L2 leader could be at serious risk.
2. How the Cancun Upgrade Enhances L2 Project Fundamentals
2.1 Current Project Valuations for Arbitrum and Optimism
To accurately assess the prevailing valuation metrics of Arbitrum and Optimism, we have leveraged revenue data from the most recent quarter and contrasted it against their current market valuations.
Maintaining a constant P/E ratio and assuming the projected roughly 90% reduction in L1 costs for Arbitrum and Optimism post-Cancun upgrade (based on the conservative end of the estimates outlined for EIP-4844, which forecasts a 90-99% decrease in L2 projects' L1 expenses), the adjusted valuations for ARB and OP under a consistent L2 pricing model are detailed below: the Cancun upgrade results in a substantial decrease in L1 costs, thereby directly contributing to improved profitability and an ensuing increase in project valuations.
2.2 The Impact of the Cancun Upgrade on L2 Valuation
Indeed, the Cancun upgrade's reduction in L1 costs necessitates a re-evaluation of the L2 fee structures of both Arbitrum and Optimism, as maintaining current fees would be untenable. Consequently, our valuation assessment needs to incorporate two crucial variables:
To what extent will Arbitrum and Optimism pass the savings on to users by lowering L2 fees?
As L2 fees decrease, what magnitude of increase in L2 transactional activity can be anticipated?
Based on the assumption that the P/E multiple remains unchanged, the analysis endeavors to deduce the prices of ARB and OP after the Cancun upgrade from changes in the "Ratio of Cost Savings Transferred to Fee Savings" and the "Ratio of Fee Savings to Transaction Increase". The fundamental rationale behind the two token price projection tables is: the smaller the share of the post-Cancun cost reductions passed on to L2 users, the higher the L2's operating profit; and the greater the increase in transactional activity triggered by lower fees, the higher the L2's operating profit.
Furthermore, given that current gas fees on Optimism are about 30-50% lower than those on Arbitrum, Optimism has more flexibility as L1 costs decline, giving it greater room to decide how much of the savings it retains. Consequently, it is assumed that Optimism will redistribute between 60-100% of its cost savings to users, while for Arbitrum this redistribution is likely to be in the range of 70-100%. If we only consider the impact of the Cancun upgrade on the mainnet chains of Optimism and Arbitrum, their potential for price appreciation seems closely aligned.
Certainly, the analysis outlined above regarding the price sensitivity of Arbitrum and Optimism post-Cancun upgrade follows a relatively linear logic, and there are at least two factors not incorporated in these projections:
The calculations are predicated on the present project P/E ratios, which have likely already priced in expectations about the Cancun upgrade.
Post-Cancun upgrade, Optimism is anticipated to release a greater quantity of tokens into circulation than it currently has. If we assume the projected circulating market capitalization remains steady, this influx would naturally imply a decline in the token price.
Despite these caveats, a fundamental axiom persists: as the operating profits of an L2 rise, so does the intrinsic value of its token, paving the way for higher market valuations. Whether it manifests as a reduction in operational costs or a boost in on-chain engagement, the Cancun upgrade offers tangible enhancements to L2 projects.
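The sensitivity logic above can be sketched as follows. All inputs (annual fees, DA costs, pass-through ratio, activity growth) are hypothetical placeholders rather than the actual figures behind the projection tables referenced in this section; the sketch only shows how pass-through and activity growth interact under a constant P/E assumption.

```python
# Illustrative sketch of the post-Cancun sensitivity logic with hypothetical inputs.
# Not the actual figures behind the projection tables referenced above.

def post_upgrade_profit(l2_fees: float, l1_costs: float,
                        l1_cost_cut: float = 0.9,     # EIP-4844: ~90% lower DA costs
                        pass_through: float = 0.8,    # share of savings returned as lower fees
                        activity_growth: float = 0.3  # extra transactions from cheaper fees
                        ) -> float:
    """Annualized L2 operating profit after the upgrade (fees minus DA costs)."""
    savings = l1_costs * l1_cost_cut
    new_fees = (l2_fees - savings * pass_through) * (1 + activity_growth)
    new_costs = (l1_costs - savings) * (1 + activity_growth)
    return new_fees - new_costs

def implied_token_price(current_price: float, current_profit: float, new_profit: float) -> float:
    """Token price at a constant P/E, i.e. price scales with operating profit."""
    return current_price * new_profit / current_profit

# Hypothetical example: $60M annual L2 fees, $35M paid to L1 for data availability
current_profit = 60.0 - 35.0
new_profit = post_upgrade_profit(60.0, 35.0)
print(f"profit: ${current_profit:.1f}M -> ${new_profit:.1f}M, "
      f"implied price x{new_profit / current_profit:.2f}")
```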
The Potential Risks of Optimism Drawing from earlier discussions, Optimism has seamlessly transitioned from being a single-chain L2 solution to establishing itself as a nexus in the inte-rchain L2 ecosystem, bolstered by the Superchain narrative and the extensive integration of the OP Stack. By leveraging a B2B2C model in collaboration with OP Stack partners, it has onboarded a wider user base. Over the long haul, this positioning grants Optimism more robust network effects, economies of scale, and a coalition of stakeholders sharing mutual interests, presenting a more lucrative business model than Arbitrum. Furthermore, recent metrics suggest that the primary transaction data on the OP Mainnet is progressively narrowing the gap with, if not surpassing, that of Arbitrum. Concurrently, emerging L2 platforms within the OP Stack ecosystems, such as BASE, are witnessing swift growth, intensifying the competition for Arbitrum’s market position. Given the similar upside potential for the token prices of OP and $ARB, spurred by the Cancun upgrade, the added allure of Optimism Superchain storyline arguably positions it as a more attractive investment target. Nevertheless, the L2 landscape remains highly competitive. Investors should be aware of the ensuing risks associated with Optimism. 3.1 Arbitrum Mulls Over Opening Its L2 License, Echoing Optimism’s Strategy to Capture the User Base At present, Arbitrum operates under a Business Source License (BSL), constraining partners who wish to utilize the Arbitrum stack for developing their Rollup ecosystems to either secure formal authorization from Arbitrum DAO or Offchain Labs—the driving force behind Arbitrum—or resort to developing on L3 using Arbitrum One. However, with the OP Stack witnessing rapid expansion and significant user adoption in recent months, a sense of urgency is discernible within the Arbitrum community. On August 8th, an Arbitrum team representative, stonecoldpat, spurred a dialogue on the governance forum, inviting community reflections on the pivotal question of “When, and How, Should the Arbitrum Foundation Issue a License for the Arbitrum Technology Stack to a New Strategic Partner”, delineating specific areas of discussion: Assessing the community’s perspective on licensing Arbitrum’s code to external entities. Discuss the possibility of attaching specific conditions to code authorization licenses. Devise an evaluation framework to determine the eligibility of potential licensees. Sketching out Short-term and Medium-term Roadmaps for the Above Points: In the short term, the intent is to identify and green-light licenses to those partners who satisfy a set benchmark. In the medium term, there’s a move to streamline the licensing process. Any project aligned with the conditions can get the license. The main feedback can be summarized as the following: “It appears to be a strategic blunder that the Arbitrum Foundation or Offchain Labs have not already issued a license to large strategic partners to use the Arbitrum software stack. This dithering may actually be harming the Arbitrum ecosystem.” “We have not had any feedback that the Arbitrum Foundation should not issue a license for the Arbitrum technology stack to strategic partners. 
It has mostly focused on the criteria for doing so, conditions that should be attached to it and allowing the DAO to give its initial input on that process.” Given the current dynamics, Arbitrum will shift towards an Optimism-like strategy is undeniably on the horizon, positioning itself for a foray into the competitive landscape of the “L2 interchains.” This strategic shift potentially challenges the currently flourishing ecosystem of OP Stacks. On August 9th, Andre Cronje, the co-founder and chief architect of the Fantom Foundation, during an interview with The Block, revealed that they’re actively evaluating the Optimism L2 solutions. Their scope of evaluation spans both the OP Stack and Arbitrum stack.Given Fantom’s esteemed standing as a foremost L1, it remains improbable that they would acquiesce to functioning as an L3 within Arbitrum. Consequently, Cronje’s mention of the “Arbitrum stack” seemingly refers to the L2 solutions. The real challenge lies in the timeline: how long will it take for the Arbitrum community and its partners to strike a consensus and subsequently roll out these licenses? By the time they do, what will the competitive landscape look like, and how many pivotal clients will remain up for grabs? The protraction of this process only serves to benefit the OP Stack ecosystem, as more collaborators might lean towards it, further complicating Arbitrum’s dilemma. 3.2 The Escalating Heat in the L2 Service Landscape Beyond Arbitrum and Optimism, the L2 landscape, particularly the ZK-based solutions, is witnessing swift advancements or is queued up for launch. There’s ZKsync, showcasing compelling operational metrics (though somewhat inflated due to airdrop enthusiasts), Linea, fortified by its affiliation with Consensys (with Metamask under its umbrella boasting 30 million monthly users and Infura catering to over 400,000 developers), and the eagerly awaited Scroll, among others. Furthermore, platforms exemplified by Altlayer, offering Rollup as a Service, are paving the way for a seamless integration and operation experience for Rollup developers via service aggregators. By diving directly into the OP Stack’s precursor stages, these platforms could indeed dilute the bargaining prowess within the Optimism ecosystem. Altlayer Ecosystem 3.3 Assessing Value Transmission in the Superchain Ecosystem: Ripple Effects on the Optimism Foundation and the OP Token Currently, OP lacks a direct mechanism for value capture. Among the myriad of OP Stack adopters, only BASE has explicitly committed to donating 10% of its L2 profits to the Optimism Foundation. Other collaborating projects haven’t provided analogous assurances. Validation of the value capture of OP may be on hold until the launch of its decentralized sequencer protocol and the acceptance degree by OP Stack adopters. Should the community rally behind and integrate a decentralized  sequencer protocol collateralized by OP, it would undeniably boost direct demand for OP, culminating in effective value transfer. On the flip side, if other L2 projects persist in maintaining their own sequencer standards and node infrastructure, it might not only hinder the value capture of OP but also weaken the synergies between L2s within the Optimism ecosystem. 
3.4 Valuation Risk In the previous section on the valuation of OP, it was highlighted that the forecasted price elevation of OP, attributed to the Cancun upgrade, operates on the assumption that “the PE of Optimism after the upgrade remains consistent with the current level.” Considering that the Cancun upgrade is one of the most anticipated events in the market this year, the current PE valuation of Optimism has more or less priced in this expectation. For those who hold a pessimistic view, they may even believe that the current PE has already priced in the positive impact of Cancun. Reference ASXNEIP-4844 Research Report The BlockFantom is exploring adding optimistic rollups to connect to Ethereum Supporting EIP-4844: Reducing Fees for Ethereum Layer 2 Rollups

OP vs ARB: Which Stands Out As The Better Investment Target Post-Cancun Upgrade?

By Alex Xu, Research Partner at Mint Ventures

In 2023, the Cancun upgrade has firmly established itself as one of the most pivotal events in the crypto world since the Shanghai upgrade. Layer2 projects that stand to gain from it are central to our focus for the year.

The Cancun upgrade, centered on EIP-4844, is projected to go live between October 2023 and January 2024. Meanwhile, the tokens of the leading Layer2 solutions, ARB (Arbitrum) and OP (Optimism), have retraced after marking fresh all-time highs earlier this year. This suggests that the current climate might still present an opportune window for strategic engagement.

Indeed, when evaluated by market capitalization, OP has been consistently breaking records throughout 2023, whereas ARB has largely been consolidating at lower levels. Through this article, we aim to unpack the following aspects:

The inherent value and business model behind Layer2.

OP vs. ARB: A comprehensive analysis of their competitiveness and key performance indicators.

The transformative impact the Cancun upgrade could have on Layer2 fundamentals.

Unveiling the potential risks associated with Optimism.

The insights and opinions presented in this article reflect my views as of the publication date. They are offered primarily through a business lens, with only a limited dive into the technical intricacies of Layer 2, and may contain factual inaccuracies or biases. This article is intended for discussion purposes only, and feedback is welcomed.

1. Unpacking the Inherent Value and Business Model of L2

1.1 Source of L2’s Value and Its Defensive Moat

Layer 2 (L2) solutions can be seen as the natural evolution of Layer 1 (L1) protocols, echoing their fundamental promises: delivering a robust, censorship-resistant, and universally accessible block space. A fitting analogy might be to view L2 as a specialized on-chain cloud service. L2’s competitive advantage primarily emerges in its economic efficiency. As a case in point, Optimism showcases gas expenditures that are just 1.56% of those on Ethereum. 

However, one must recognize the niche nature of “block space as a specialized cloud service”. Not all online services demand the distinctive functionalities L1 or L2 platforms provide. In the traditional world fraught with restrictions and obscured financial operations, blockchain emerges as a beacon, creating an environment for myriad practical applications. 

L2’s block space valuation is intrinsically tethered to its demand, driven by both service providers and users. 

Much like L1, L2 has the ability to cultivate a robust defensive moat through the power of network effects.

Within the L2 paradigm, as the user base broadens and diversifies, collaboration becomes more frictionless, streamlining interactions within this ecosystem. This not only nurtures path-breaking service innovations but also further expands the user pool. Every newcomer that joins and establishes a presence on the L2 network amplifies the intrinsic value for existing users. 

In the Web3 landscape, the network effect prowess of L1 & L2 platforms is eclipsed only by stablecoins, with USDT standing as a prime exemplar. The leading L1 & L2 platforms inherently present higher entry barriers, and as a result, often command a more elevated valuation premium.

1.2 Profit Model of Layer2

L2’s revenue structure is straightforward. On one side, L2 secures its data by procuring storage capacity from reliable Data Availability (DA) layers. This serves as an insurance policy—ensuring that if any disruptions occur on L2, the data remains safeguarded and can be readily restored using these backup layers. On the flip side, L2 offers users an affordable block space, and in return, charges them accordingly. The resultant profit margin is primarily a function of the fees garnered on L2 (base fees plus MEV earnings), offset by the costs remitted to the DA service providers.

Taking Optimism and Arbitrum as illustrative examples, both platforms rely on Ethereum for their Data Availability needs, capitalizing on Ethereum's peerless decentralization and its gold-standard reputation in the L1 realm. They pay gas to Ethereum to archive their compressed L2 data on it. Their revenue stream is predominantly anchored in the gas and MEV fees accrued when their user base, spanning everyday users to sophisticated developers, interacts on their L2 platforms. Deducting operational costs from this revenue gives a clear picture of their gross profit.

It’s pivotal to emphasize that “gross profit” here is a metric that doesn’t encompass subsequent project-associated financial commitments—like salaries, ecosystem incentives, promotional activities, and other overheads.
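As a rough illustration of this fee model, the sketch below computes an L2's gross profit from the components just described; the helper name and all figures are hypothetical placeholders rather than reported Optimism or Arbitrum data.

```python
# A minimal sketch of the L2 gross-profit model described above.
# All figures are hypothetical placeholders, not actual Optimism/Arbitrum data.

def l2_gross_profit(l2_base_fees: float, mev_revenue: float, l1_da_cost: float) -> float:
    """Gross profit = fees collected on L2 (base fees + MEV) minus the gas paid
    to the DA layer (Ethereum) for posting compressed L2 data."""
    return l2_base_fees + mev_revenue - l1_da_cost

# Example: hypothetical monthly figures in USD
revenue = 5_000_000   # base fees charged to L2 users
mev = 500_000         # MEV captured by the sequencer
da_cost = 3_500_000   # gas spent posting compressed data to Ethereum

print(l2_gross_profit(revenue, mev, da_cost))  # -> 2000000 (USD gross profit)
```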

The Function of Sequencers within L2 Operations

Fee collection on L2, as well as the payments made to L1, are both executed by the L2 sequencer, and the profits from these operations flow directly to it. In the current landscape, both Optimism and Arbitrum have their sequencers operated by the official team, with the consequent profits enriching their treasuries. Naturally, a centralized sequencer poses an elevated single-point-of-failure risk. Therefore, both Optimism and Arbitrum have committed in their roadmaps to transitioning towards a more decentralized sequencer framework.

The decentralization of sequencers is poised to adopt a PoS (Proof of Stake) framework. Under this model, decentralized sequencers would be required to stake native L2 tokens like ARB or OP as collateral. If they fail to uphold their obligations, these staked tokens risk being slashed. Users have the choice to either stake on their own as sequencers or leverage staking services provided by entities like Lido. In this arrangement, while users supply the staked tokens, specialized and dispersed sequencer operators oversee the sequencing and data uploading tasks. Users who participate in staking can then earn a substantial portion of the L2 fees and MEV rewards, with Lido’s model earmarking 90% of these rewards for stakers.
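To make the reward flow concrete, the snippet below sketches a Lido-style split of sequencer earnings between stakers and the staking-service operator; the 90% staker share comes from the paragraph above, while the function name and the sample amounts are assumptions for illustration.

```python
# Sketch of a Lido-style split of sequencer earnings between stakers and the
# staking-service operator. The 90% staker share is taken from the text above;
# the amounts are hypothetical.

def split_sequencer_rewards(l2_fees: float, mev: float, staker_share: float = 0.90):
    """Return (amount paid to stakers, amount retained by the operator)."""
    total = l2_fees + mev
    to_stakers = total * staker_share
    to_operator = total - to_stakers
    return to_stakers, to_operator

stakers, operator = split_sequencer_rewards(l2_fees=1_000_000, mev=100_000)
print(f"stakers: {stakers:,.0f}, operator: {operator:,.0f}")
# -> stakers: 990,000, operator: 110,000
```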

As this narrative unfolds, both ARB and OP will be infused with economic utility beyond their foundational governance role.

1.3 ARB vs. OP

The Competitive Advantage of Optimism

Since its inception, ARB has consistently outperformed OP on several business-centric metrics, reinforcing its superior positioning in the market. Leveraging its inherent strength in network effects, Arbitrum seemed poised not just for market dominance but also for commanding a higher valuation premium.

However, the trend began to shift following Optimism’s introduction of the Superchain strategy in February 2023 and the strong push behind the OP Stack.

OP Stack is an open-source L2 technology suite, offering a streamlined solution for emerging projects seeking to leverage L2 capabilities. By utilizing the OP Stack, they can expediently roll out their personalized L2 solutions, significantly slashing both development and trial costs. The “Superchain” stands as Optimism’s visionary blueprint for the future. By embracing the L2 built upon the OP Stack, a uniformity in technical architecture is achieved. This facilitates seamless, ultra-secure, and high-speed atomic-level communication and interaction of both information and assets across different platforms. Drawing parallels with the Cosmos Interchain concept, this innovative framework has been named the “Superchain.”

Following the introduction of the OP Stack and Superchain, Coinbase was among the early adopters, announcing its Layer 2, Base, built on the OP Stack in February; the platform officially went live on August 10th. Coinbase's pioneering approach served as a catalyst, setting the stage for a surge in OP Stack adoption throughout the crypto ecosystem. This ripple effect saw major players like Binance introducing opBNB. Other prominent entrants to the OP Stack community included the Paradigm-backed NFT project ZORA and the Loot ecosystem's Adventure Gold DAO. The Gitcoin-affiliated public-goods initiative Public Goods Network (PGN), the well-known options trading platform Lyra, and the on-chain analytics provider DeBank have also aligned with this trend. Notably, Celo, traditionally an L1 solution, has embraced the OP Stack for its L2 strategy.

Historically, L2 solutions primarily catered to users, treating block space as a unique domain for their operations. Yet, the introduction of the Superchain and OP Stack has revolutionized this perspective, expanding the definition of ‘users’ to include L2 operators themselves. As a result, L2’s domain, which was traditionally a B2C model (with L2 developers also counted as consumers), has morphed into an inclusive B2B2C framework. This evolutionary step has carved out new value pathways for Optimism and solidified its competitive defenses.

Network effects in the multi-chain era. By expanding the traditional understanding of a “network” from a single chain to a multi-chain ecosystem, the uniform OP Stack enables seamless integration of funds and information across diverse chains. Entrusted with user acquisition and engagement, L2 operators strive to bolster the cumulative user base of this multi-chain ecosystem. As this collective user population grows, the intrinsic value of each L2, as well as of every individual user within this network, experiences a surge.

Economies of scale. Optimism bears the fixed costs of the technical framework—maintaining and updating the OP Stack. However, the constant feedback and refinements offered by its diverse user base elevate its overall quality. This cost-efficient strategy not only diminishes the expenses linked to single-chain maintenance and updates, sequencers, and indexing incentives but also amplifies its allure for prospective L2 solution seekers.

Synergistic Ecosystem. By weaving major Web3 players into the fabric of the Optimism ecosystem, a shared vision and mutual interests emerge. This alignment paves the way for robust support in areas like technological improvements, user acquisition, developer engagement, and investment drives.

Evolving from a single-chain ecosystem to a cross-chain ecosystem, Optimism not only reaps the advantages of anticipated growth in users and developers across the full spectrum, but its core metrics on the OP Mainnet are also steadily closing in on, and in some cases surpassing, the once-distant leader, Arbitrum, as evidenced by the following metrics:

a. Monthly Active Addresses: The ratio of Optimism's monthly active addresses to Arbitrum's has surged from a previous low of 32.1% to the current 73.6%.

Source: tokenterminal

b. Monthly L2 Profits: The L2 profit ratio of Optimism relative to Arbitrum has surged from a mere 16.4% to 100.2%, now surpassing Arbitrum.

Source: Tokenterminal

c. Monthly Interaction Counts: The interaction count ratio of Optimism compared to Arbitrum has swelled from a previous 22.4% to 106.5%.

Source: Tokenterminal

d. TVL: The TVL ratio of Optimism relative to Arbitrum has grown from a trough of 1/3 to the present level of 1/2. 

The TVL on the OP Mainnet was approximately $2 billion in March and has since risen to an estimated $3 billion.

The TVL on Arbitrum was approximately $6 billion in March (reaching a high of about $7 billion) and continues to hover around $6 billion currently.

Source: https://l2beat.com/

Comparison of Valuations: OP vs. ARB

As Optimism’s operational metrics continue their swift ascent, the valuation of OP Mainnet relative to Arbitrum is increasingly attractive.

P/E Ratio (circulating market cap / L2's annualized profit): Based on the revenue data of the most recent week, Optimism's P/E ratio has moderated to slightly under 80, compared to Arbitrum's, which stands at 113. This is particularly noteworthy considering OP's strong price performance and the continuous vesting and growth of its circulating supply over the past few months.

Source: tokenterminal
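As a point of reference, the P/E figure defined above can be reproduced with a few lines of Python; the circulating-cap and weekly-profit inputs below are hypothetical placeholders rather than the Token Terminal figures cited here.

```python
# P/E as defined above: circulating market cap / annualized L2 profit,
# annualizing from the most recent week's profit. Inputs are placeholders.

def l2_pe_ratio(circulating_market_cap: float, weekly_profit: float) -> float:
    annualized_profit = weekly_profit * 52
    return circulating_market_cap / annualized_profit

# Hypothetical example: $1.2B circulating cap, $300k profit over the last week
print(round(l2_pe_ratio(1_200_000_000, 300_000), 1))  # -> ~76.9
```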

The Vigorous Expansion of the Optimism Ecosystem

While OP Mainnet metrics progressively advance against Arbitrum, driven in part by a rejuvenation within its own ecosystem, it is the contributions from new entrants to the Optimism community that have had the most pronounced impact. A case in point: among the projects that have contributed the most transaction volume to the OP Mainnet over the last 30 days, Gnosis Safe contract operations take the top spot, with Worldcoin in fourth position.

Source: https://dune.com/optimismfnd/Optimism

Intriguingly, a substantial portion of the transactions on Gnosis Safe are initiated by the Worldcoin team. By the end of June 2023, World App had successfully deployed in excess of 300,000 Gnosis Safe accounts. This surge is primarily attributed to the migration of World App accounts to the OP Mainnet.

According to data released on Worldcoin's official site on August 11th, the platform claims a user base exceeding 2.2 million, with 257,000 new accounts added in the previous week alone. World App's daily transaction count reaches an impressive 126,000, constituting approximately 21% of the total daily transfers across the OP Mainnet and Arbitrum combined.

Source: https://worldcoin.org/

At present, Worldcoin has migrated its ID system and tokens to the OP Mainnet, with future plans to develop an application chain based on the OP Stack. This strategic move is anticipated to usher in a wave of active users and developers.

Beyond Worldcoin, the momentum in the wake of the Base launch has been nothing short of extraordinary. Base, whose backer Coinbase stands as the inaugural and most prominent proponent of the OP Stack, registered a noteworthy 136,000 active addresses as of August 10th. This figure closely mirrors that of the L2 market frontrunner, Arbitrum, which hosts 147,000 addresses.

Source: https://dune.com/tk-research/base

Among all L1 & L2 smart contract platforms, these figures are eclipsed only by Tron (1.5M), BNB Chain (1.04M), Polygon (0.37M), and Arbitrum (0.14M). Interestingly, following the official debut of Base on August 10th, the breakout application wasn't rooted in the conventional domains of DeFi or meme tokens. Rather, it was a socially oriented application named friend.tech, adding an unexpected twist to the narrative.

Arbitrum’s Dilemma

Arbitrum finds itself in a complex situation owing to its strategic positioning. Although it features a robust L2 mainnet, marked by the strong performance of Arbitrum One and the higher-performance Arbitrum Nova, it concurrently launched its L3 stack, Orbit, setting it in competition with the OP Stack. However, in an environment where L2 is still in its ascendancy, many projects are reluctant to pigeonhole themselves into the L3 category and use Arbitrum One as their primary DA layer. Significantly, projects with substantial industry assets, whether in terms of users, developers, or intellectual property, are often inclined towards building on L2. This preference translates to higher valuation potential and broader user outreach.

Indeed, the emergence of platforms like AltLayer, offering Rollup-as-a-Service (RaaS) solutions, represents a shift in how smaller projects and developers approach the implementation of rollups. AltLayer offers solutions that simplify the process of building and operating rollups, essentially providing a low-threshold, low-code avenue for integration. It helps users seamlessly integrate the various rollup modules available in the market into their programs, allowing for a Lego-like building experience.

The RaaS modular solution provided by AltLayer

Within the diverse array of rollup options that RaaS platforms offer, Arbitrum's Orbit is but one choice among many. Smaller entities, upon evaluating the breadth of available offerings, may opt for more cost-effective L2 solutions rather than constraining themselves within an L3 classification.

In this unfolding landscape, despite the marginal lead that Arbitrum One, a standalone L2, holds over its Layer 2 counterparts in transaction volume, it is experiencing a swift decline in market share. This decline is predominantly attributed to both new and existing users transitioning to Optimism-aligned and other diverse Layer 2 ecosystems.

Overall, Optimism leverages its open-source L2 toolkit to create a network effect, attracting users via a B2B2C model. This approach seems to have a competitive advantage over Arbitrum’s solid, but single-chain methodology. If Arbitrum doesn’t reassess and adjust its strategic direction promptly, its position as the preeminent L2 single-chain leader could be at serious risk.

2. How the Cancun Upgrade Enhances L2 Project Fundamentals

2.1 Current Project Valuations for Arbitrum and Optimism

To accurately assess the prevailing valuation metrics of Arbitrum and Optimism, we have leveraged revenue data from the recent quarter and contrasted it against their current market valuations.

Maintaining a constant P/E ratio and assuming the projected 90% reduction in L1 costs for Arbitrum and Optimism post-Cancun upgrade (the conservative end of the estimates outlined in EIP-4844, which forecasts a 90-99% decrease in L2 projects' L1 expenses), the adjusted valuations for ARB and OP, assuming an unchanged L2 pricing model, are detailed below:

The Cancun upgrade results in a substantial decrease in L1 costs, thereby directly contributing to improved profitability and an ensuing increase in project valuations.
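To make the logic behind these adjusted valuations concrete, here is a minimal Python sketch under the stated assumptions (constant P/E, a 90% cut in L1 costs, L2 fee income initially unchanged); the post_cancun_valuation helper and all input figures are illustrative assumptions, not the data behind the tables.

```python
# Sketch: implied token valuation after Cancun under the section's assumptions:
# the P/E multiple stays constant, L1 (DA) costs fall by 90%, and L2 fee
# income is initially unchanged. All inputs are illustrative placeholders.

def post_cancun_valuation(current_market_cap: float,
                          l2_fee_revenue: float,
                          l1_cost: float,
                          l1_cost_reduction: float = 0.90) -> float:
    profit_before = l2_fee_revenue - l1_cost
    profit_after = l2_fee_revenue - l1_cost * (1 - l1_cost_reduction)
    pe = current_market_cap / profit_before   # P/E held constant
    return pe * profit_after                  # implied new circulating market cap

# Hypothetical quarterly figures (USD)
print(post_cancun_valuation(current_market_cap=1_200_000_000,
                            l2_fee_revenue=20_000_000,
                            l1_cost=14_000_000))
# -> 3,720,000,000 implied cap under these placeholder inputs
```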

2.2 The Impact of Cancun Upgrade on L2 Valuation

Indeed, the Cancun upgrade, which brings about a reduction in L1 costs, necessitates a re-evaluation of the L2 fee structure for both Arbitrum and Optimism, as maintaining the current fees is untenable. Consequently, our valuation assessment needs to incorporate two crucial variables:

To what extent will Arbitrum and Optimism transfer the savings to users by diminishing L2 fees?

As L2 fees decrease, what magnitude of surge in L2 transactional activity is anticipated?

Based on the assumption that the P/E multiple remains unchanged, the analysis endeavors to deduce the prices of ARB and OP after the Cancun upgrade from two variables: the ratio of cost savings passed on to users as fee reductions, and the increase in transaction activity induced by those fee reductions.

The fundamental rationale behind the two token price projection tables is:

The smaller the share of the post-Cancun cost savings that is passed on to L2 users as fee reductions, the higher the operational profit for the L2.

The greater the increase in transaction activity driven by the lower L2 fees, the higher the operational profit for the L2.

Furthermore, given that current gas fees on Optimism are about 30-50% lower than those on Arbitrum, Optimism has more flexibility as L1 costs decline, giving it greater room to decide how much of the savings it retains. Consequently, it is inferred that Optimism will pass on between 60-100% of its cost savings to users, while for Arbitrum this range is likely to be 70-100%.
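A minimal sketch of how such a sensitivity table can be built is shown below: pass_through is the share of L1 cost savings returned to users as lower fees, and volume_multiplier is the assumed growth in transaction activity at those lower fees. The iterated ranges mirror the assumptions above (60-100% pass-through for Optimism, 70-100% for Arbitrum); the baseline figures and the way DA cost scales with volume are simplifying assumptions.

```python
# Sketch of the two-variable sensitivity analysis described above.
# pass_through: share of the L1 cost savings handed back to users as fee cuts.
# volume_multiplier: assumed growth in L2 transaction activity at the lower fees.
# Baseline figures are hypothetical placeholders.

def projected_profit(fee_revenue: float, l1_cost: float,
                     pass_through: float, volume_multiplier: float,
                     l1_cost_reduction: float = 0.90) -> float:
    savings = l1_cost * l1_cost_reduction
    new_l1_cost = l1_cost - savings
    # Fees are cut by the portion of savings passed through, then scaled by volume growth.
    new_fee_revenue = (fee_revenue - savings * pass_through) * volume_multiplier
    # DA cost is assumed to scale roughly with posted activity as volume grows.
    return new_fee_revenue - new_l1_cost * volume_multiplier

baseline_fees, baseline_l1_cost = 20_000_000, 14_000_000
for pt in (0.6, 0.8, 1.0):          # e.g. Optimism's assumed 60-100% pass-through range
    for vm in (1.0, 1.5, 2.0):      # assumed transaction-growth scenarios
        p = projected_profit(baseline_fees, baseline_l1_cost, pt, vm)
        print(f"pass-through {pt:.0%}, volume x{vm}: profit {p:,.0f}")
```

Consistent with the rationale above, the printed grid shows profit rising as the pass-through ratio falls and as the volume multiplier grows.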

If we only consider the impact of the Cancun upgrade on the mainnet chain of Optimism and Arbitrum, their potential for price appreciation seems closely aligned.

Certainly, the analysis outlined above regarding the price sensitivity of Arbitrum and Optimism post-Cancun upgrade follows a relatively linear logic, and there are at least two factors not incorporated in these projections:

The calculations are predicated on the present project P/E ratios, which have likely already integrated expectations about the Cancun upgrade.

Post-Cancun upgrade, Optimism is anticipated to have a considerably larger circulating token supply than it does today. If we maintain the assumption that the projected circulating market capitalization remains steady, this influx would naturally imply a decline in the token price.

Despite these caveats, a fundamental axiom persists: as the operational profits of L2 soar, so does the intrinsic value of its tokens, paving the way for the possibility of higher market valuations. Whether it manifests as a reduction in operational costs or a boost in on-chain engagement, the Cancun upgrade offers tangible enhancements to L2 projects.

3. The Potential Risks of Optimism

Drawing from earlier discussions, Optimism has seamlessly transitioned from being a single-chain L2 solution to establishing itself as a nexus in the interchain L2 ecosystem, bolstered by the Superchain narrative and the extensive integration of the OP Stack. By leveraging a B2B2C model in collaboration with OP Stack partners, it has onboarded a wider user base. Over the long haul, this positioning grants Optimism more robust network effects, economies of scale, and a coalition of stakeholders sharing mutual interests, presenting a more lucrative business model than Arbitrum's. Furthermore, recent metrics suggest that the primary transaction data on the OP Mainnet is progressively narrowing the gap with, if not surpassing, that of Arbitrum. Concurrently, emerging L2 platforms within the OP Stack ecosystem, such as Base, are witnessing swift growth, intensifying the competition for Arbitrum's market position.

Given the similar upside potential for the token prices of OP and ARB spurred by the Cancun upgrade, the added allure of the Optimism Superchain storyline arguably positions OP as the more attractive investment target.

Nevertheless, the L2 landscape remains highly competitive, and investors should be aware of the following risks associated with Optimism.

3.1 Arbitrum Mulls Over Opening Its L2 License, Echoing Optimism’s Strategy to Capture the User Base

At present, Arbitrum operates under a Business Source License (BSL), constraining partners who wish to utilize the Arbitrum stack for developing their Rollup ecosystems to either secure formal authorization from Arbitrum DAO or Offchain Labs—the driving force behind Arbitrum—or resort to developing on L3 using Arbitrum One. However, with the OP Stack witnessing rapid expansion and significant user adoption in recent months, a sense of urgency is discernible within the Arbitrum community. On August 8th, an Arbitrum team representative, stonecoldpat, spurred a dialogue on the governance forum, inviting community reflections on the pivotal question of “When, and How, Should the Arbitrum Foundation Issue a License for the Arbitrum Technology Stack to a New Strategic Partner”, delineating specific areas of discussion:

Assessing the community’s perspective on licensing Arbitrum’s code to external entities.

Discussing the possibility of attaching specific conditions to code authorization licenses.

Devising an evaluation framework to determine the eligibility of potential licensees.

Sketching out short-term and medium-term roadmaps for the above points:

In the short term, the intent is to identify and green-light licenses to those partners who satisfy a set benchmark.

In the medium term, there is a move to streamline the licensing process so that any project meeting the conditions can obtain a license.

The main feedback can be summarized as follows:

“It appears to be a strategic blunder that the Arbitrum Foundation or Offchain Labs have not already issued a license to large strategic partners to use the Arbitrum software stack. This dithering may actually be harming the Arbitrum ecosystem.”

“We have not had any feedback that the Arbitrum Foundation should not issue a license for the Arbitrum technology stack to strategic partners. It has mostly focused on the criteria for doing so, conditions that should be attached to it and allowing the DAO to give its initial input on that process.”

Given the current dynamics, a shift by Arbitrum towards an Optimism-like strategy is undeniably on the horizon, positioning it for a foray into the competitive landscape of the “L2 interchains.” Such a strategic shift would potentially challenge the currently flourishing OP Stack ecosystem.

On August 9th, Andre Cronje, co-founder and chief architect of the Fantom Foundation, revealed in an interview with The Block that they are actively evaluating optimistic-rollup L2 solutions, with the scope of evaluation spanning both the OP Stack and the Arbitrum stack. Given Fantom's standing as a leading L1, it is improbable that it would acquiesce to functioning as an L3 within Arbitrum; consequently, Cronje's mention of the “Arbitrum stack” presumably refers to its L2 solution.

The real challenge lies in the timeline: how long will it take for the Arbitrum community and its partners to strike a consensus and subsequently roll out these licenses? By the time they do, what will the competitive landscape look like, and how many pivotal clients will remain up for grabs? The protraction of this process only serves to benefit the OP Stack ecosystem, as more collaborators might lean towards it, further complicating Arbitrum’s dilemma.

3.2 The Escalating Heat in the L2 Service Landscape

Beyond Arbitrum and Optimism, a wave of L2s, particularly ZK-based solutions, is either advancing swiftly or queued up for launch: zkSync, showcasing compelling operational metrics (though somewhat inflated by airdrop hunters); Linea, fortified by its affiliation with Consensys (whose MetaMask boasts 30 million monthly users and whose Infura caters to over 400,000 developers); and the eagerly awaited Scroll, among others. Furthermore, platforms such as AltLayer, offering Rollup-as-a-Service, are paving the way for a seamless integration and operation experience for rollup developers via service aggregation. By positioning themselves upstream of the OP Stack in the developer funnel, these platforms could dilute Optimism's bargaining power within its own ecosystem.

The AltLayer ecosystem

3.3 Assessing Value Transmission in the Superchain Ecosystem: Ripple Effects on the Optimism Foundation and the OP Token

Currently, OP lacks a direct mechanism for value capture. Among the many OP Stack adopters, only Base has explicitly committed to donating 10% of its L2 profits to the Optimism Foundation; other collaborating projects haven't provided analogous assurances. Validation of OP's value capture may have to wait for the launch of its decentralized sequencer protocol and will depend on the degree of acceptance among OP Stack adopters. Should the community rally behind and integrate a decentralized sequencer protocol collateralized by OP, it would undeniably boost direct demand for OP, culminating in effective value transfer. On the flip side, if other L2 projects persist in maintaining their own sequencer standards and node infrastructure, it might not only hinder OP's value capture but also weaken the synergies between L2s within the Optimism ecosystem.

3.4 Valuation Risk

In the previous section on the valuation of OP, it was highlighted that the forecasted price appreciation of OP attributed to the Cancun upgrade rests on the assumption that “the P/E of Optimism after the upgrade remains consistent with the current level.” Considering that the Cancun upgrade is one of the most anticipated events in the market this year, the current P/E valuation of Optimism has more or less priced in this expectation. Those who hold a pessimistic view may even believe that the current P/E has already fully priced in the positive impact of Cancun.

Reference

ASXNEIP-4844 Research Report

The BlockFantom is exploring adding optimistic rollups to connect to Ethereum

Supporting EIP-4844: Reducing Fees for Ethereum Layer 2 Rollups