Foresight Ventures: AI + Web3 =Author: Ian Xu@Foresight Ventures TL;DR This article discusses the intersection of AI and Web3, exploring how on-chain AI can bring significant value to the decentralized internet. It highlights several projects, including Worldcoin, Pragma, Lyra Finance, Giza, Zama.ai, and potential ML-as-a-service applications. The article emphasizes that AI in the Web3 context is in its early stages but holds great potential. On-chain AI, being transparent and verifiable, can significantly enhance efficiency and security, enabling new product forms. The ZKML is particularly promising, with ZK-rollup potentially serving as an entry point for AI into the Web3 world. While the current infrastructure can support models of a certain scale, there are still many uncertainties, particularly in terms of verifying models through Zero-Knowledge Proofs. This is seen as the inevitable path for AI on-chain but requires exponential improvements in proof systems to support increasingly large models. In terms of applications, on-chain AI could potentially participate in any aspect of Web3, including gaming, DeFi, DID, and tooling. Despite the scarcity of existing projects, the article remains optimistic about the potential of on-chain AI and its transformative impact on the Web3 space. 1. AI + Web3 = ? Developers’ obsessive dedication to infrastructure construction and the continuous updates of various rollup solutions have indeed made a breakthrough in the originally lagging computing power of web3. This has also made it possible for AI to be put on the blockchain. But you might want to say that instead of going to great lengths to implement on-chain AI, running models off-chain seems to be able to meet most needs. In fact, almost all AI models are currently running in a black-box, centralized mode and are creating irreplaceable value in various fields. 1.1 Let’s go back to the most basic question, what is AI on the blockchain? The mainstream understanding is to make AI models transparent + verifiable through Web3. To be more specific, AI on the blockchain means the complete verification of artificial intelligence models. That is to say, a model needs to disclose the following three points to the entire network (users or verifiers): Model architecture; Model parameters and weights: Disclosing parameters and weights may sometimes have a negative impact on product security. Therefore, for specific scenarios, such as risk control models, weights can be hidden to ensure security; Model input: In the context of web3, it is generally public data on the chain. When the above conditions are met, the whole model execution process is deterministic and no longer a black-box operation. Anyone can verify the model’s input and results on the blockchain, thereby preventing the model owner or related authorized person from manipulating the model. ï»ż 1.2 What is the driving force for on-chain AI? The significance of on-chain AI is not to replace the centralized operation mode of Web2 AI, but: Without sacrificing decentralization and trustlessness, it creates the next stage of value for the web3 world. The current Web3 is like the early stage of web2, and it does not yet have the ability to undertake broader applications or create greater value. Only after incorporating AI, can the imagination of Dapp truly leap to the next stage, and these on-chain applications may become closer to the level of web2 applications. 
This proximity is not in making the functions more similar, but in enhancing user experience and possibilities by leveraging the value of Web3. It provides a transparent, trustless solution for the black-box operation mode of web2 AI. Imagine the application scenarios of web3: Adding a recommendation algorithm to the NFT trading platform, recommending corresponding NFTs based on user preferences, and improving conversion; Adding AI opponents in games for a more transparent and fair gaming experience; 

 However, these applications are further improvements in efficiency or user experience through AI for existing functions. Is it valuable? Yes. Is the value big? It depends on the product and the scenario. The value that AI can create is not limited to optimizing from 99 to 100. What really excites me is the new applications from 0 to 1, some use cases that can only be achieved through transparent + verifiable on-chain models. However, these “exciting” use cases currently rely mainly on imagination, without mature applications. Here are some brainstorming ideas: Crypto trading based on neural network decision models: One product form could be more like an upgraded version of copy trading, or even a completely new way of trading. Users no longer need to trust or research other experienced traders but bet on completely open and transparent models and their performance. Essentially, AI trades faster and more decisively based on predictions of future crypto prices. However, without the “trustless autonomy” inherent in on-chain AI, such betting objects or standards simply do not exist. Users/investors can transparently see the reasons, processes, and even the exact probability of future rises/falls in the model’s decision-making. AI models acting as referees: A product could be a new form of the oracle, predicting the accuracy of data sources through AI models. Users no longer need to trust validators and don’t have to worry about nodes misbehaving. Oracle providers don’t even need to design complex node networks and reward-punishment mechanisms to achieve decentralization. Correspondingly, the on-chain transparent + verifiable AI is already sufficient to verify the confidence level of off-chain data sources. This new product form has the potential to dominate in terms of security, efficiency, and cost, and the object of decentralization jumps from humans to “trustless autonomy” AI tools, which is undoubtedly safer. Organization management/operating systems based on large models: The governance of DAOs should inherently be efficient, decentralized, and fair, but the current situation is quite the opposite, loose and bloated, lacking transparency and fairness. The introduction of on-chain AI can provide a very fitting solution, maximizing management mode and efficiency, and minimizing systemic and human risks in management. We can even imagine a new development and operation mode for web3 projects, where the entire framework and future development direction and proposals hardly depend on the decision-making of the development team or DAO voting. Instead, decisions are made based on the larger data acquisition and computational abilities of large models. But all this is premised on the model being on-chain. Without AI’s “trustless autonomy”, there is no transition from humans to tools in the decentralized world. 

 In summary, New product forms based on on-chain AI can be summarized as transitioning the subject of decentralization and trustlessness from humans to AI tools. This is in line with the evolution of productivity in the traditional world, where initially, efforts were made to upgrade and enhance human efficiency, and later, humans were replaced by intelligent tools, revolutionizing the original product design in terms of security and efficiency. The most critical point, and the premise of all the above, is to make AI transparent + verifiable through Web3. ï»ż 1.3 The Next Stage of Web3 Web3, as a phenomenal technological innovation, cannot just stay in its initial stage. Traffic and economic models are important, but users will not always stay in pursuit of traffic or spend a lot of resources to do X to earn, and Web3 will not onboard the next wave of new users because of this. But one thing is certain: the revolution of productivity and value in the crypto world must come from the addition of AI. I think it can be roughly divided into the following three stages: Start: The update and iteration of zero-knowledge proof algorithms and hardware provide the first possibility for the emergence of on-chain AI; (we are here) Development: Whether it’s the improvement of existing applications by AI or the new products based on on-chain AI, both are pushing the entire industry forward; Endgame: What is the ultimate direction of on-chain AI? The above discussions are all about exploring application scenarios bottom-up through the combination of AI and Web3. If we switch to a top-down approach to view the on-chain AI, could AI trace back to Web3 itself? AI + blockchain = adaptive blockchain Some public chains will take the lead in integrating on-chain AI, transforming from the level of public chains into a kind of adaptive one. The development direction no longer depends on project foundation decisions but is based on massive data decision-making, and the level of automation far exceeds traditional Web3, thus standing out from the current multi-chain prosperity. With the blessing of verifiable + transparent AI, where web3’s self-regulation is manifested can refer to a few examples mentioned by Modulus Lab: On-chain transaction markets can automatically adjust in a decentralized manner, such as adjusting the interest rate of stablecoins in real-time based on publicly available on-chain data, without the need for trust assumptions; Multimodal learning can allow on-chain protocol interactions to be completed through biometric recognition, providing secure KYC and achieving complete trustless identity management; Allow on-chain applications to maximize the value brought by on-chain data, supporting services such as customized content recommendation. From another perspective, zkrollup keeps iterating and optimizing, but it always lacks a real application that can only run on the zk ecosystem, ZKML exactly meets this point, and its imagination space is also large enough. ZK-rollup is likely to serve as the entry point for AI into web3 in the future, creating greater value, and the two complement each other. 2. Implementation and Feasibility 2.1 What Can Web3 Provide for AI? Infrastructure and ZK are undoubtedly the most fiercely competitive tracks in web3. 
Various ZK projects have made great efforts in circuit optimization and algorithm upgrading, whether it’s the exploration of multi-layer networks, the development of modularization and data availability layers, further customizing rollup as a service or even hardware acceleration
 These attempts are pushing the scalability, cost, and computing power of Web3 infrastructure to the next level. ï»ż It sounds good to put AI on the chain, but how exactly is it done? One approach is through the ZK-proof system. For example, create a customized circuit for machine learning, the process of generating a witness off-chain is the process of model execution, and generate a proof for the model prediction process (including model parameters and inputs), anyone can verify the proof on-chain. The AI model still runs on an efficient cluster, even with some hardware acceleration to further enhance computational speed, maximizing the use of computing power while ensuring that no centralized person or institution can tamper with or interfere with the model, that is, to ensure: Model prediction result certainty = verifiable (input + model architecture + parameters) Based on the above approach, we can further infer which infrastructures are crucial for AI on-chain: ZKP system, rollup: Rollups expand our imagination of on-chain computing capabilities, packaging a bunch of transactions, and even recursively generating proof of proof to further reduce costs. For the current large models, the first step to provide possibilities is the proof system and rollup; Hardware acceleration: ZK rollup provides a verifiable basis, but the generation speed of proof directly relates to the usability and user experience of the model. Waiting for several hours to generate a model’s proof is obviously not going to work, so hardware acceleration through FPGA is a great boost. Cryptography: Cryptography is the foundation of the crypto world, and on-chain models and sensitive data also need to ensure privacy. Supplement: The basis of large models is GPU. Without high parallel support, the efficiency of large models will be very low, and they cannot run. Therefore, for an on-chain zk ecosystem: GPU-friendly = AI-friendly Take Starknet as an example, Cario can only run on CPU, so only some small decision tree models can be deployed, which is not conducive to the deployment of large models in the long term. 2.2 Challenge: More Powerful Proof System The generation speed and memory usage of ZK Proof are crucial, one is related to user experience and feasibility, while the other pertains to cost and scalability. Is the current zkp system sufficient? Sufficient, but not good enough
 Modulus Lab has detailed the specific situation of models and computing power in the article “The Cost of Intelligence: Proving Machine Learning Inference with Zero-Knowledge”. When you have time, you can read this “Paper0” in the ZKML field: https://drive.google.com/file/d/1tylpowpaqcOhKQtYolPlqvx6R2Gv4IzE/view Below are the different proof systems mentioned in Paper 0. ï»ż Based on the above zk algorithms, Modulus Lab conducts tests from two dimensions: time consumption and memory occupancy and controls two core variables: parameters, and layers in these two dimensions. The following are benchmark suites. Such a design can roughly cover LeNet5’s 60k parameter volume, 0.5MFLOPs, to ResNet-34’s 22M parameter volume, 3.77 GFLOPs. ï»ż Time consumption test: ï»ż Memory consumption test: ï»ż Based on the above data, overall, the current zk algorithm and the potential to support the generation of large model proofs are available, but the corresponding costs are still high, requiring even more than 10 times optimization. Taking Gloth16 as an example, although it benefits from the optimization of computation time brought by high concurrency, as a trade-off, memory usage significantly increases. The performance of Plonky2 and zkCNN in time and space also verifies this point. So now the question has actually changed from whether the zkp system can support on-chain AI to is the cost worth supporting AI on-chain. And with the exponential rise in model parameters, the pressure on the proof system will also rapidly increase. Indeed, is there a trustless neural network now? No! It’s because the cost is too high. Therefore, creating an AI-customized proof system is of vital importance. At the same time, to implement AI logic, which is very complex in a single call, the gas consumption model also needs to be redesigned. A high-performance zkvm is essential. But now we can see many high-performance attempts, such as OlaVM, polygon Miden, etc. The continuous optimization of these infrastructures greatly improves the feasibility of on-chain AI. 3. Is the application worth looking forward to? Although on-chain AI is still in its early stages, it may be between the starting and development stages when viewed from the above layers. However, the AI direction never lacks excellent teams and innovative ideas. As mentioned above, looking at the development stage of AI in the web3 world, the current market is at the mid-stage from starting to developing, and the product attempt direction is still mainly based on user experience optimization based on existing functions. But the most valuable thing is to turn trustless subjects from people into tools through AI on the chain, subverting the original product form in terms of security and efficiency. Next, starting from some existing application attempts, analyze the long-term product development direction of on-chain AI 3.1 The Rockefeller Bot: The world’s first on-chain AI Rockefeller is the first on-chain AI product launched by the Modulus Lab team, with a strong “commemorative value”. This model is essentially a trading bot. Specifically, the training data of Rockefeller is a large amount of publicly available WEth-USDC price/exchange rate on the chain. It is a three-layer feed-forward neural network model, and the prediction target is the future WEth price rise and fall. 
Here is the process when the trading bot decides to trade: Rockefeller generates ZKP for the prediction results on ZK-rollup; ZKP is verified on L1 (funds are kept by L1 contract) and operations are executed; It can be seen that the prediction and fund operations of the trading bot are completely decentralized and trustless. As mentioned above, from a higher dimension, Rockefeller is more like a new type of Defi gameplay. Compared to trusting other traders, in this mode, users are actually betting on the transparent + verifiable + autonomous model. Users do not need to trust centralized institutions to ensure the legality of the model decision-making process. At the same time, AI can also eliminate the impact of human nature to the greatest extent and make decisions more decisively. ï»ż You might already want to invest some money in Rockefeller and give it a try, but can this really make money? No, it can’t, according to the Modulus team. Rather than being an application, Rockefeller is more like a Proof of Concept (POC) for on-chain AI. Due to limitations in cost, efficiency, and proof systems, Rockefeller’s primary purpose is to serve as a demo to show the feasibility of on-chain AI to the web3 world. (Rockefeller has completed its mission and is now offline T T) 3.2 Leela: The world’s first on-chain AI game Leela v.s. the World, recently released, is also from Modulus Lab. The game mechanism is simple, where human players form teams to battle against AI. In the game, players can stake their bets, and at the end of each match, the loser’s pool will be distributed to the winner according to the number of tokens staked. ï»ż Speaking of on-chain AI, this time Modulus Lab has deployed a larger deep neural network (with a number of Parameters > 3,700,000). Although Leela surpasses Rockefeller in terms of model scale and product content, it is essentially still a large-scale on-chain AI experiment. The mechanism and operation mode behind Leela is what need attention, which can help us better understand the operation mode and improvement space of on-chain AI. Here is the logic diagram given by the official: ï»ż Every move Leela makes, or every prediction, will generate a ZKP, and only after being verified by the contract will it take effect in the game. That is to say, thanks to the trustless autonomous AI, the funds bet by users and the fairness of the game are fully protected by cryptography, and there is no need to trust the game developer. Leela uses the Halo2 algorithm, mainly because its tools and flexible design can help design a more efficient proof system. The specific performance situation can refer to the test data above. But at the same time, during the operation of Leela, the Modulus team also found the drawbacks of Halo2, such as slow proof generation and unfriendliness to one-shot proving, etc. Therefore, it further confirms the conclusion drawn from the previous test data: if we need to bring larger models into web3, we need to develop a more powerful proof system. However, the value of Leela lies in bringing us a larger imagination space for AI + Web3 game, at this moment, King of Glory players should be extremely hopeful for the matchmaking algorithm to be fully on-chain:) Gamefi needs more high-quality content support and a fairer game system, and on-chain AI just provides this. For example, introducing AI-driven game scenes or NPCs into the game provides a huge imagination space for both the player’s game experience and the gameplay of the economic system. 
3.3 Worldcoin: AI + KYC Worldcoin is an on-chain identity system (Privacy-Preserving Proof-of-Personhood Protocol) that uses biometrics to establish an identity system and achieve derivative functions like payments. The goal is to combat Sybil attacks, and it now has more than 1.4 million registered users. ï»ż Users scan their iris with a hardware device called Orb, and personal information is added to a database. Worldcoin runs a CNN model in the computational environment of the Orb hardware to compress and validate the effectiveness of user iris data. It sounds powerful, but for true decentralized identity verification, the Worldcoin team is exploring model output verification through ZKPs. Challenges Worth mentioning is that the CNN model used by Worldcoin has a size: parameters = 1.8 million, layers = 50. Based on the test data shown above, the current proof system can handle this in terms of time, but the memory consumption is impossible to complete for consumer-grade hardware. 3.4 Other projects Pragma: Pragma is a ZK oracle developed from the Starkware ecosystem. The team is also exploring how to solve the problem of decentralized off-chain data verification through on-chain AI. Users no longer need to trust validators but can verify off-chain data sources through sufficiently accurate and verifiable on-chain AI, such as reading corresponding physical information as input and making decisions for actual asset or identity verification. Lyra finance: Lyra finance is an option AMM that provides a derivatives trading market. To improve capital utilization, the Lyra team and Modulus Lab are collaborating to develop an AMM based on a verifiable AI model. With a verifiable, fair AI model, Lyra finance has the opportunity to become a large-scale implementation experiment for on-chain AI, bringing fair matchmaking to web3 users for the first time, optimizing the on-chain market through AI, and providing higher returns. Giza: A ZKML platform that deploys models directly on-chain rather than off-chain verification. Nice try, but
 Due to the computational power and Cairo’s lack of support for CUDA-based proof generation, Giza can only support the deployment of small models. This is the most fatal problem. In the long run, large models that can have a disruptive impact on web3 will require powerful hardware support, such as GPUs. Zama-ai: Homomorphic encryption of models. Homomorphic encryption is a form of encryption where: f[E(x)] = E[f(x)], where f is an operation, E is a homomorphic encryption algorithm, and x is a variable, for example, E(a) + E(b) = E(a + b). It allows specific forms of algebraic operations on ciphertext to result in an encrypted result, and decrypting this result will yield the same result as performing the same operation on the plaintext. Model privacy has always been a hotspot and bottleneck of AI. Although zk is privacy-friendly, zk does not equate to privacy. Zama is committed to ensuring the privacy-preserving execution of models. ML-as-a-service: This is currently just a thought direction, without specific applications, but the goal is to solve the problems of malicious behavior by centralized ML service providers and user trust through ZKPs. Daniel Kang has a detailed description in the article “Trustless Verification of Machine Learning” (refer to the diagram in the article). ï»ż 4. Conclusion Overall, AI in the web3 world is in a very early stage, but there is no doubt that the maturation and popularization of on-chain AI will take the value of web3 to another level. Technically, web3 can provide a unique infrastructure for AI, and AI is an essential tool for changing the production relations of web3. The combination of the two can spark many possibilities, which is an exciting and imaginative place. From the perspective of AI’s motivation to go on-chain, on one hand, the transparent + verifiable on-chain AI transforms the decentralized and trustless entities from people to AI tools, greatly enhancing efficiency and security, and providing possibilities for creating entirely new product forms. On the other hand, as the infrastructure of web3 continues to iterate, web3 genuinely needs a killer application that can maximize the value of this infrastructure. ZKML fits this point, for example, ZK-rollup is likely to be the entry point for AI into web3 in the future. From a feasibility perspective, the current infrastructure can support models of a certain scale to some extent, but there are still many uncertainties. Using ZKP to create verifiable models currently appears to be the only path for AI to go on-chain and may also be the most deterministic technical path for bringing AI into web3 applications. However, in the long run, the current proof system needs to be exponentially improved to sufficiently support the increasingly large models. From the perspective of application scenarios, AI can almost perfectly participate in any direction of web3, whether it is gaming, DeFi, DID, or tooling
 Although the existing projects are very scarce and lack long-term value, they have not yet transitioned from a tool to improve efficiency to an application that changes production relations. But it’s exciting that someone has taken the first step, and we can see the earliest look at on-chain AI and its future possibilities. Reference https://worldcoin.org/blog/engineering/intro-to-zkml https://medium.com/coinmonks/chapter-1-how-to-put-your-ai-on-chain-8af2db013c6b https://medium.com/@ModulusLabs/chapter-2-why-put-your-ai-on-chain-33b7625147b7 https://medium.com/@ModulusLabs/chapter-3-the-worlds-first-on-chain-ai-trading-bot-c387afe8316c https://medium.com/@ModulusLabs/chapter-4-blockchains-that-self-improve-e9716c041f36 https://medium.com/@ModulusLabs/chapter-4-5-trustless-ai-for-living-breathing-games-cc9f24080e30 https://medium.com/@ModulusLabs/chapter-5-the-cost-of-intelligence-da26dbf93307 https://medium.com/@ModulusLabs/chapter-6-leela-vs-the-world-the-worlds-1st-on-chain-ai-game-17ea299a06b6 https://drive.google.com/file/d/1tylpowpaqcOhKQtYolPlqvx6R2Gv4IzE/view https://medium.com/@danieldkang/trustless-verification-of-machine-learning-6f648fd8ba88 About Foresight Ventures Foresight Ventures is dedicated to backing the disruptive innovation of blockchain for the next few decades. We manage multiple funds: a VC fund, an actively-managed secondary fund, a multi-strategy FOF, and a private market secondary fund, with AUM exceeding $400 million. Foresight Ventures adheres to the belief of a “Unique, Independent, Aggressive, Long-Term mindset” and provides extensive support for portfolio companies within a growing ecosystem. Our team is composed of veterans from top financial and technology companies like Sequoia Capital, CICC, Google, Bitmain, and many others. Website: https://www.foresightventures.com/ Disclaimer: All articles by Foresight Ventures are not intended to be investment advice. Individuals should assess their own risk tolerance and make investment decisions prudently.

Foresight Ventures: AI + Web3 = ?

Author: Ian Xu@Foresight Ventures

TL;DR

This article discusses the intersection of AI and Web3, exploring how on-chain AI can bring significant value to the decentralized internet. It highlights several projects, including Worldcoin, Pragma, Lyra Finance, Giza, Zama.ai, and potential ML-as-a-service applications.

The article emphasizes that AI in the Web3 context is in its early stages but holds great potential. On-chain AI, being transparent and verifiable, can significantly enhance efficiency and security, enabling new product forms. ZKML is particularly promising, with ZK-rollups potentially serving as the entry point for AI into the Web3 world.

While the current infrastructure can support models of a certain scale, there are still many uncertainties, particularly in terms of verifying models through Zero-Knowledge Proofs. This is seen as the inevitable path for AI on-chain but requires exponential improvements in proof systems to support increasingly large models.

In terms of applications, on-chain AI could potentially participate in any aspect of Web3, including gaming, DeFi, DID, and tooling. Despite the scarcity of existing projects, the article remains optimistic about the potential of on-chain AI and its transformative impact on the Web3 space.

1. AI + Web3 = ?

Developers’ relentless work on infrastructure and the continuous iteration of rollup solutions have finally broken through web3’s long-standing computing-power bottleneck, making it possible to put AI on the blockchain. You might object that instead of going to great lengths to implement on-chain AI, running models off-chain already meets most needs; indeed, almost all AI models today run in a black-box, centralized mode and are creating irreplaceable value across many fields.

1.1 Let’s go back to the most basic question: what is AI on the blockchain?

The mainstream understanding is to make AI models transparent + verifiable through Web3.

To be more specific, AI on the blockchain means the complete verification of artificial intelligence models. That is to say, a model needs to disclose the following three points to the entire network (users or verifiers):

Model architecture;

Model parameters and weights: Disclosing parameters and weights may sometimes have a negative impact on product security. Therefore, for specific scenarios, such as risk control models, weights can be hidden to ensure security;

Model input: In the context of web3, it is generally public data on the chain.

When the above conditions are met, the whole model execution process is deterministic and no longer a black-box operation. Anyone can verify the model’s input and results on the blockchain, thereby preventing the model owner or related authorized person from manipulating the model.
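
As a minimal illustration of what “transparent + verifiable” means, consider the naive case in which a verifier simply re-executes the disclosed model and checks it against published commitments. This is a hedged sketch: the helper names are invented, real ZKML systems quantize weights to integers for bit-exact execution, and no specific project’s protocol is implied.

```python
import hashlib
import json

def commit(obj) -> str:
    """Hash of a canonical JSON encoding; stands in for an on-chain commitment."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

# The three disclosures listed above (or commitments to them):
architecture = {"type": "linear", "in": 2, "out": 1}   # 1. model architecture
weights = {"w": [6, -2], "b": 1}                       # 2. parameters and weights
x = [3, 5]                                             # 3. public on-chain input

def predict(arch, params, x):
    """Deterministic inference: same (architecture, weights, input) -> same output."""
    assert arch["type"] == "linear"
    return sum(w * xi for w, xi in zip(params["w"], x)) + params["b"]

# The model owner publishes commitments plus the claimed output.
published = {
    "arch": commit(architecture),
    "weights": commit(weights),
    "input": commit(x),
    "output": predict(architecture, weights, x),
}

# Any verifier can re-derive everything from the disclosed artifacts.
assert commit(architecture) == published["arch"]
assert commit(weights) == published["weights"]
assert commit(x) == published["input"]
assert predict(architecture, weights, x) == published["output"]
print("verified:", published["output"])  # output follows deterministically
```

The drawback of this naive scheme is that every verifier must re-run the model; the ZK approach described in Section 2 replaces re-execution with checking a succinct proof.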


1.2 What is the driving force for on-chain AI?

The significance of on-chain AI is not to replace the centralized operation mode of Web2 AI, but:

Without sacrificing decentralization and trustlessness, it creates the next stage of value for the web3 world. The current Web3 is like the early stage of web2: it does not yet have the ability to support broader applications or create greater value. Only after incorporating AI can the imagination of Dapps truly leap to the next stage, and these on-chain applications may come closer to the level of web2 applications. This proximity lies not in making the functions more similar, but in enhancing user experience and possibilities by leveraging the value of Web3.

It provides a transparent, trustless solution for the black-box operation mode of web2 AI.

Imagine the application scenarios of web3:

Adding a recommendation algorithm to the NFT trading platform, recommending corresponding NFTs based on user preferences, and improving conversion;

Adding AI opponents in games for a more transparent and fair gaming experience;





However, these applications merely use AI to improve the efficiency or user experience of existing functions.

Is it valuable? Yes.

Is the value big? It depends on the product and the scenario.

The value that AI can create is not limited to optimizing from 99 to 100. What really excites me is the new applications from 0 to 1, some use cases that can only be achieved through transparent + verifiable on-chain models. However, these “exciting” use cases currently rely mainly on imagination, without mature applications. Here are some brainstorming ideas:

Crypto trading based on neural network decision models: One product form could be an upgraded version of copy trading, or even a completely new way of trading. Users no longer need to trust or research other experienced traders; instead, they bet on completely open and transparent models and their performance. Users/investors can transparently see the reasons, the process, and even the exact probability behind the model’s predictions of future rises and falls. Essentially, the AI trades faster and more decisively based on its predictions of future crypto prices. Without the “trustless autonomy” inherent in on-chain AI, however, such an object of betting simply could not exist.

AI models acting as referees: A product could be a new form of oracle that assesses the accuracy of data sources through AI models. Users no longer need to trust validators or worry about nodes misbehaving, and oracle providers no longer need to design complex node networks and reward-punishment mechanisms to achieve decentralization. A transparent + verifiable on-chain AI is already sufficient to score the confidence level of off-chain data sources. This new product form has the potential to dominate in security, efficiency, and cost, and the object of decentralization jumps from humans to trustlessly autonomous AI tools, which is undoubtedly safer.

Organization management/operating systems based on large models: The governance of DAOs should inherently be efficient, decentralized, and fair, but the current reality is quite the opposite: loose and bloated, lacking transparency and fairness. On-chain AI can provide a very fitting solution, optimizing the management model and its efficiency while minimizing systemic and human risks. We can even imagine a new development and operation mode for web3 projects in which the overall framework, future direction, and proposals hardly depend on the development team’s decisions or DAO voting; instead, decisions draw on the greater data-gathering and computational abilities of large models. But all of this is premised on the model being on-chain: without AI’s “trustless autonomy”, there is no handover from humans to tools in the decentralized world.





In summary,

New product forms based on on-chain AI can be summarized as transitioning the subject of decentralization and trustlessness from humans to AI tools. This is in line with the evolution of productivity in the traditional world, where initially, efforts were made to upgrade and enhance human efficiency, and later, humans were replaced by intelligent tools, revolutionizing the original product design in terms of security and efficiency.

The most critical point, and the premise of all the above, is to make AI transparent + verifiable through Web3.


1.3 The Next Stage of Web3

Web3, as a phenomenal technological innovation, cannot stay in its initial stage forever. Traffic and economic models matter, but users will not forever chase incentives or pour resources into X-to-earn schemes, and that alone will not onboard the next wave of Web3 users. One thing is certain: the revolution of productivity and value in the crypto world must come from the addition of AI.

I think it can be roughly divided into the following three stages:

Start: The update and iteration of zero-knowledge proof algorithms and hardware provide the first possibility for the emergence of on-chain AI; (we are here)

Development: Whether it’s the improvement of existing applications by AI or the new products based on on-chain AI, both are pushing the entire industry forward;

Endgame: What is the ultimate direction of on-chain AI?

The above discussions all explore application scenarios bottom-up through the combination of AI and Web3. If we instead take a top-down view of on-chain AI, could AI feed back into Web3 itself? AI + blockchain = adaptive blockchain.

Some public chains will take the lead in integrating on-chain AI, evolving from ordinary public chains into adaptive ones. Their development direction will no longer depend on foundation decisions but on decision-making over massive data, with a level of automation far beyond traditional Web3, letting them stand out amid today’s multi-chain prosperity.

With the support of verifiable + transparent AI, Web3’s capacity for self-regulation can be seen in a few examples mentioned by Modulus Labs:

On-chain transaction markets can automatically adjust in a decentralized manner, such as adjusting the interest rate of stablecoins in real-time based on publicly available on-chain data, without the need for trust assumptions;

Multimodal learning can allow on-chain protocol interactions to be completed through biometric recognition, providing secure KYC and achieving complete trustless identity management;

Allow on-chain applications to maximize the value brought by on-chain data, supporting services such as customized content recommendation.

From another perspective, ZK-rollups keep iterating and optimizing, yet the ecosystem still lacks a real application that can only run on ZK infrastructure. ZKML fills exactly this gap, and its imagination space is large enough. ZK-rollups are likely to serve as the entry point for AI into web3, creating greater value; the two complement each other.

2. Implementation and Feasibility

2.1 What Can Web3 Provide for AI?

Infrastructure and ZK are undoubtedly the most fiercely competitive tracks in web3. Various ZK projects have made great efforts in circuit optimization and algorithm upgrades, whether exploring multi-layer networks, developing modularization and data availability layers, customizing rollup-as-a-service, or even accelerating with hardware… These attempts are pushing the scalability, cost, and computing power of Web3 infrastructure to the next level.


It sounds good to put AI on the chain, but how exactly is it done?

One approach is a ZK proof system. For example, build a customized circuit for the machine learning model: generating the witness off-chain is the model execution itself, a proof is produced for the prediction process (covering model parameters and inputs), and anyone can then verify that proof on-chain.

The AI model still runs on an efficient cluster, possibly with hardware acceleration to further boost computational speed, making full use of computing power while ensuring that no centralized person or institution can tamper with or interfere with the model. That is, the system guarantees:

certainty of the model’s prediction = verifiability of (input + model architecture + parameters)
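
The pipeline just described can be mocked up as follows. This is only a structural sketch: the cryptography is faked (the “proof” here proves nothing), the function names are invented rather than any real ZKML library’s API, and only the division of labor between off-chain proving and cheap on-chain verification is illustrated.

```python
import hashlib, json

def h(obj):
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

def compile_circuit(architecture):
    """One-time setup: fix the circuit for this architecture and derive
    a verifying key that gets published on-chain."""
    vk = h({"arch": architecture})
    return architecture, vk

def run_model(weights, x):
    """Model execution doubles as witness generation: every intermediate
    value becomes part of the witness."""
    y = sum(w * xi for w, xi in zip(weights, x))
    witness = {"weights": weights, "x": x, "y": y}
    return witness, y

def prove(vk, witness):
    """Stand-in for the expensive off-chain proving step."""
    return {"vk": vk, "claim": h({"x": witness["x"], "y": witness["y"]})}

def verify(vk, proof, x, y):
    """Stand-in for the cheap on-chain check: constant-size work,
    no model re-execution."""
    return proof["vk"] == vk and proof["claim"] == h({"x": x, "y": y})

arch = {"type": "linear", "in": 3}
weights = [0.5, -1.0, 2.0]          # public (or committed) parameters
x = [1.0, 2.0, 3.0]                 # public on-chain input

_, vk = compile_circuit(arch)       # off-chain, once per architecture
witness, y = run_model(weights, x)  # off-chain, e.g. on a GPU cluster
proof = prove(vk, witness)          # off-chain, the heavy step
assert verify(vk, proof, x, y)      # on-chain: the contract accepts y
print("prediction", y, "accepted with proof")
```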

Based on the above approach, we can further infer which infrastructures are crucial for AI on-chain:

ZKP system, rollup: Rollups expand our imagination of on-chain computing capabilities, batching transactions and even recursively generating proofs of proofs to further reduce costs. For today’s large models, the proof system and the rollup are the first step toward making things possible;

Hardware acceleration: ZK rollups provide the verifiable basis, but proof-generation speed directly determines the model’s usability and user experience. Waiting several hours for a model’s proof is obviously not going to work, so hardware acceleration through FPGAs is a great boost.

Cryptography: Cryptography is the foundation of the crypto world, and on-chain models and sensitive data also need to ensure privacy.

Supplement:

Large models are built on GPUs. Without highly parallel hardware, large models run far too slowly to be practical. Therefore, for an on-chain ZK ecosystem:

GPU-friendly = AI-friendly

Take Starknet as an example: Cairo currently runs only on CPUs, so only small decision-tree models can be deployed, which is unfavorable for deploying large models in the long term.

2.2 Challenge: More Powerful Proof System

The generation speed and memory usage of ZK proofs are both crucial: the former determines user experience and feasibility, while the latter determines cost and scalability.

Is the current zkp system sufficient?

Sufficient, but not good enough.

Modulus Labs details the costs of models and computing power in “The Cost of Intelligence: Proving Machine Learning Inference with Zero-Knowledge”. When you have time, it is worth reading this “Paper 0” of the ZKML field: https://drive.google.com/file/d/1tylpowpaqcOhKQtYolPlqvx6R2Gv4IzE/view

Paper 0 benchmarks several different proof systems, including Groth16, Plonky2, and zkCNN (comparison table omitted here).

Using these ZK algorithms, Modulus Labs runs tests along two dimensions, time consumption and memory occupancy, while controlling two core variables, parameter count and layer count. The benchmark suites are designed to roughly span from LeNet-5 (60k parameters, 0.5 MFLOPs) to ResNet-34 (22M parameters, 3.77 GFLOPs).
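
As a sanity check on the lower end of that range, the ~60k figure for LeNet-5 can be recomputed from its standard layer shapes (a quick back-of-the-envelope, assuming the classic 32x32-input LeNet-5 configuration):

```python
# Parameter count of LeNet-5, the small end of the benchmark range.
# conv: (kh*kw*in_ch + 1) * out_ch ; fc: (in + 1) * out
layers = [
    (5*5*1 + 1) * 6,     # conv1: 1 -> 6 channels, 5x5 kernels
    (5*5*6 + 1) * 16,    # conv2: 6 -> 16 channels
    (16*5*5 + 1) * 120,  # fc1: flattened 16x5x5 -> 120
    (120 + 1) * 84,      # fc2
    (84 + 1) * 10,       # fc3: 10 output classes
]
print(sum(layers))       # 61706 -- the ~60k parameters cited above
```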

Time consumption test: (benchmark chart in Paper 0, omitted here)

Memory consumption test: (benchmark chart in Paper 0, omitted here)

Based on the above data, current ZK algorithms do have the potential to support proof generation for large models, but the costs remain high, in some cases calling for more than 10x optimization. Take Groth16 as an example: although it benefits from the computation-time savings of high concurrency, the trade-off is a significant increase in memory usage. The time and space behavior of Plonky2 and zkCNN confirms the same trade-off.

So the question has actually shifted from whether the ZKP system can support on-chain AI to whether the cost of supporting AI on-chain is worth it. And with the exponential rise in model parameters, the pressure on the proof system will also rapidly increase. Is there a trustless neural network today? No, because the cost is still too high.

Therefore, creating an AI-customized proof system is of vital importance. At the same time, because AI logic is extremely complex within a single call, the gas model also needs to be redesigned, and a high-performance zkVM is essential. We can already see many high-performance attempts, such as OlaVM and Polygon Miden, and the continuous optimization of this infrastructure greatly improves the feasibility of on-chain AI.

3. Is the application worth looking forward to?

Although on-chain AI is still in its early days, judged by the layers discussed above it sits somewhere between the start and development stages. However, the AI field never lacks excellent teams and innovative ideas.

As mentioned above, judged by its development stage, AI in the web3 world is midway between starting and developing, and product attempts still focus mainly on optimizing the user experience of existing functions. But the most valuable direction is to shift the trustless subject from people to tools through on-chain AI, overturning existing product forms in terms of security and efficiency.

Next, let’s start from some existing application attempts and analyze the long-term product direction of on-chain AI.

3.1 The Rockefeller Bot: The world’s first on-chain AI

Rockefeller, the first on-chain AI product launched by the Modulus Labs team, carries strong “commemorative value”. The model is essentially a trading bot: a three-layer feed-forward neural network trained on a large amount of publicly available on-chain WETH-USDC price data, with the future rise or fall of the WETH price as its prediction target.

Here is the process when the trading bot decides to trade:

Rockefeller generates ZKP for the prediction results on ZK-rollup;

ZKP is verified on L1 (funds are kept by L1 contract) and operations are executed;

It can be seen that both the prediction and the fund operations of the trading bot are completely decentralized and trustless. As noted above, from a higher level Rockefeller is more like a new type of DeFi gameplay. Instead of trusting other traders, users in this mode are actually betting on a transparent + verifiable + autonomous model. They do not need to trust any centralized institution to vouch for the legitimacy of the model’s decision-making process, and at the same time the AI largely eliminates the influence of human nature and makes decisions more decisively.


You might already want to invest some money in Rockefeller and give it a try, but can this really make money?

No, it can’t, according to the Modulus team. Rather than being an application, Rockefeller is more like a Proof of Concept (POC) for on-chain AI. Due to limitations in cost, efficiency, and proof systems, Rockefeller’s primary purpose is to serve as a demo to show the feasibility of on-chain AI to the web3 world. (Rockefeller has completed its mission and is now offline T T)

3.2 Leela: The world’s first on-chain AI game

Leela vs. the World, recently released, also comes from Modulus Labs. The game mechanism is simple: human players team up to battle the AI, staking tokens as they play, and at the end of each match the losers’ pool is distributed to the winners in proportion to the tokens staked.


Speaking of on-chain AI: this time Modulus Labs has deployed a much larger deep neural network (over 3,700,000 parameters). Although Leela surpasses Rockefeller in model scale and product content, it is essentially still a large-scale on-chain AI experiment. What deserves attention is the mechanism and operating mode behind Leela, which helps us better understand how on-chain AI works and where it can improve. (The official logic diagram is omitted here.)

Every move Leela makes, that is, every prediction, generates a ZKP, and only after the contract verifies it does the move take effect in the game. In other words, thanks to the trustlessly autonomous AI, users’ staked funds and the fairness of the game are fully protected by cryptography, with no need to trust the game developer.
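
The pattern can be captured in a toy mock (illustrative only: the proof check is faked, all names are invented, and the real game verifies Halo2 proofs in an Ethereum contract rather than in Python):

```python
# Toy mock of a Leela-style game contract: an AI move only mutates game
# state if it arrives with a proof the on-chain verifier accepts.

class MockVerifier:
    def __init__(self, vk):
        self.vk = vk
    def verify(self, proof, move):
        # Stand-in for the real contract's cryptographic proof check.
        return proof == ("valid", self.vk, move)

class GameContract:
    def __init__(self, verifier):
        self.verifier = verifier
        self.moves = []                     # accepted AI moves
        self.pool = {"ai": 0, "world": 0}   # staked tokens per side

    def stake(self, side, amount):
        self.pool[side] += amount

    def submit_ai_move(self, move, proof):
        # Trustless gate: no valid proof, no state change.
        if not self.verifier.verify(proof, move):
            raise ValueError("proof rejected: move does not come from the model")
        self.moves.append(move)

    def settle(self, winner):
        # Losers' pool is paid out pro rata to the winning side.
        loser = "world" if winner == "ai" else "ai"
        payout, self.pool[loser] = self.pool[loser], 0
        return payout

game = GameContract(MockVerifier(vk="leela-vk"))
game.stake("world", 100)
game.stake("ai", 40)
game.submit_ai_move("e2e4", proof=("valid", "leela-vk", "e2e4"))  # accepted
print("payout to winners:", game.settle("ai"))
```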

Leela uses Halo2, mainly because its tooling and flexible design help in building a more efficient proof system; for its concrete performance, refer to the test data above. At the same time, while operating Leela the Modulus team also ran into Halo2’s drawbacks, such as slow proof generation and unfriendliness to one-shot proving. This further confirms the conclusion drawn from the earlier test data: bringing larger models into web3 requires developing a more powerful proof system.

The value of Leela, however, lies in opening up a much larger imagination space for AI + Web3 games; King of Glory players, for one, should be extremely hopeful for a fully on-chain matchmaking algorithm :) GameFi needs more high-quality content and a fairer game system, and on-chain AI provides exactly that. Introducing AI-driven game scenes or NPCs, for example, creates huge possibilities for both the player experience and the design of the economic system.

3.3 Worldcoin: AI + KYC

Worldcoin is an on-chain identity system (a Privacy-Preserving Proof-of-Personhood Protocol) that uses biometrics to establish identity and support derivative functions such as payments. Its goal is to combat Sybil attacks, and it now has more than 1.4 million registered users.


Users scan their iris with a hardware device called the Orb, and their record is added to a database. Worldcoin runs a CNN model inside the Orb’s compute environment to compress the iris scan and validate it. This sounds powerful, but for truly decentralized identity verification, the Worldcoin team is exploring verification of the model’s output through ZKPs.
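
At its core, proof-of-personhood reduces to a uniqueness check over compressed iris codes, which can be sketched roughly as follows (a simplification: the bit-length, threshold, and function names are illustrative inventions, and the real pipeline derives the codes from the CNN mentioned above):

```python
# Rough sketch of biometric Sybil resistance: a new iris code is accepted
# only if it is not too close (in Hamming distance) to any enrolled code.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two equal-length bit strings."""
    return bin(a ^ b).count("1")

MATCH_THRESHOLD = 10    # below this distance, treat as the same person (demo value)

enrolled = []           # database of previously enrolled iris codes

def enroll(iris_code: int) -> bool:
    """Register a person iff their code matches no existing enrollment."""
    for other in enrolled:
        if hamming(iris_code, other) < MATCH_THRESHOLD:
            return False            # likely a duplicate: reject
    enrolled.append(iris_code)
    return True

assert enroll(0b1010_1100_0011_0101)          # first person: accepted
assert not enroll(0b1010_1100_0011_0111)      # near-duplicate: rejected
assert enroll(0b0101_0011_1100_1010 << 16)    # distinct code: accepted
```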

Challenges

It is worth mentioning that Worldcoin’s CNN model has roughly 1.8 million parameters across 50 layers. Based on the test data shown above, current proof systems can handle a model of this size in terms of proving time, but the memory consumption is beyond what consumer-grade hardware can complete.

3.4 Other projects

Pragma: Pragma is a ZK oracle from the Starkware ecosystem. The team is also exploring how to solve decentralized verification of off-chain data through on-chain AI: users no longer need to trust validators, but can verify off-chain data sources through a sufficiently accurate, verifiable on-chain AI, for example by reading the corresponding physical information as input and making decisions for real-world asset or identity verification.

Lyra Finance: Lyra Finance is an options AMM that provides a derivatives trading market. To improve capital efficiency, the Lyra team is collaborating with Modulus Labs on an AMM driven by a verifiable AI model. With a verifiable, fair AI model, Lyra Finance has the opportunity to become a large-scale deployment experiment for on-chain AI, bringing fair matching to web3 users for the first time, optimizing the on-chain market through AI, and offering higher returns.

Giza: A ZKML platform that deploys models directly on-chain rather than verifying them off-chain. Nice try, but… because of limited computing power and Cairo’s lack of support for CUDA-based proof generation, Giza can only deploy small models, which is its most fatal problem. In the long run, large models capable of disrupting web3 will require powerful hardware support, such as GPUs.

Zama-ai: Homomorphic encryption of models. Homomorphic encryption is a form of encryption where f[E(x)] = E[f(x)], with f an operation, E a homomorphic encryption algorithm, and x a variable; for example, E(a) + E(b) = E(a + b). It allows specific algebraic operations to be performed on ciphertext such that decrypting the result yields the same answer as performing the operation on the plaintext. Model privacy has always been both a hotspot and a bottleneck of AI, and although ZK is privacy-friendly, ZK does not equal privacy. Zama is committed to privacy-preserving execution of models.
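
The additive property can be demonstrated end-to-end with the textbook Paillier cryptosystem (a toy demo: the parameters are tiny and utterly insecure, real keys are 2048+ bits, and Zama’s own work centers on fully homomorphic schemes such as TFHE rather than Paillier):

```python
# Textbook Paillier: additively homomorphic encryption.
# Demo-sized primes only -- insecure, for illustration.
from math import gcd
import random

p, q = 17, 19
n = p * q                                       # public modulus
n2 = n * n
g = n + 1                                       # standard generator choice
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)    # lcm(p-1, q-1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)             # modular inverse (Python 3.8+)

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:                       # r must be coprime to n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 42, 57
ca, cb = encrypt(a), encrypt(b)
# Multiplying ciphertexts adds the plaintexts: E(a) * E(b) = E(a + b)
assert decrypt((ca * cb) % n2) == (a + b) % n
print("E(42)*E(57) decrypts to", decrypt((ca * cb) % n2))
```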

ML-as-a-service: This is currently a direction of thought rather than a concrete application; the goal is to use ZKPs to address malicious behavior by centralized ML service providers and the problem of user trust. Daniel Kang gives a detailed description in “Trustless Verification of Machine Learning” (refer to the diagram in that article).


4. Conclusion

Overall, AI in the web3 world is at a very early stage, but there is no doubt that the maturation and popularization of on-chain AI will take the value of web3 to another level. Technically, web3 can provide a unique infrastructure for AI, and AI is an essential tool for changing web3’s relations of production. The combination of the two can spark many possibilities, which is exciting and leaves plenty of room for imagination.

From the perspective of AI’s motivation to go on-chain: on one hand, transparent + verifiable on-chain AI shifts the decentralized, trustless entity from people to AI tools, greatly enhancing efficiency and security and enabling entirely new product forms. On the other hand, as web3 infrastructure continues to iterate, web3 genuinely needs a killer application that maximizes the value of that infrastructure. ZKML fits this point, and ZK-rollups, for example, are likely to be the entry point for AI into web3.

From a feasibility perspective, the current infrastructure can support models of a certain scale to some extent, but there are still many uncertainties. Using ZKP to create verifiable models currently appears to be the only path for AI to go on-chain and may also be the most deterministic technical path for bringing AI into web3 applications. However, in the long run, the current proof system needs to be exponentially improved to sufficiently support the increasingly large models.

From the perspective of application scenarios, AI can participate in almost any direction of web3, whether gaming, DeFi, DID, or tooling… Existing projects are still scarce and, in terms of long-term value, have not yet made the leap from efficiency-improving tools to applications that change the relations of production. But it is exciting that someone has taken the first step, giving us the earliest look at on-chain AI and its future possibilities.

References

https://worldcoin.org/blog/engineering/intro-to-zkml

https://medium.com/coinmonks/chapter-1-how-to-put-your-ai-on-chain-8af2db013c6b

https://medium.com/@ModulusLabs/chapter-2-why-put-your-ai-on-chain-33b7625147b7

https://medium.com/@ModulusLabs/chapter-3-the-worlds-first-on-chain-ai-trading-bot-c387afe8316c

https://medium.com/@ModulusLabs/chapter-4-blockchains-that-self-improve-e9716c041f36

https://medium.com/@ModulusLabs/chapter-4-5-trustless-ai-for-living-breathing-games-cc9f24080e30

https://medium.com/@ModulusLabs/chapter-5-the-cost-of-intelligence-da26dbf93307

https://medium.com/@ModulusLabs/chapter-6-leela-vs-the-world-the-worlds-1st-on-chain-ai-game-17ea299a06b6

https://drive.google.com/file/d/1tylpowpaqcOhKQtYolPlqvx6R2Gv4IzE/view

https://medium.com/@danieldkang/trustless-verification-of-machine-learning-6f648fd8ba88

About Foresight Ventures

Foresight Ventures is dedicated to backing the disruptive innovation of blockchain for the next few decades. We manage multiple funds: a VC fund, an actively-managed secondary fund, a multi-strategy FOF, and a private market secondary fund, with AUM exceeding $400 million. Foresight Ventures adheres to the belief of a “Unique, Independent, Aggressive, Long-Term mindset” and provides extensive support for portfolio companies within a growing ecosystem. Our team is composed of veterans from top financial and technology companies like Sequoia Capital, CICC, Google, Bitmain, and many others.

Website: https://www.foresightventures.com/

Disclaimer: Articles by Foresight Ventures are not intended as investment advice. Individuals should assess their own risk tolerance and make investment decisions prudently.

Foresight Ventures: What is RaaS? Which Type of RaaS Will Win The Market?

Author: Ian Xu@Foresight Ventures

TL;DR

This article introduces Rollup-as-a-Service (RaaS), which caters to the growing and diverse needs of Dapps. RaaS enables Dapps to maintain high performance, reduce costs, and deliver better user experiences without compromising their interaction with the broader ecosystem.

While RaaS has potential, its primary advantage appears to be customization; overall, demand for RaaS is currently limited and its value needs further exploration.

While Optimistic rollups currently lead thanks to better compatibility and a lower barrier to entry, Zero-Knowledge rollups may eventually dominate thanks to superior performance and customization.

This article compares L2- and L3-based RaaS, as well as RaaS versus app chains, arguing that RaaS may provide better interoperability and security if a strong ecosystem is built.

This article analyzes various RaaS projects to determine the most promising approaches. It covers ZK-based projects (like StarkWare and Opside), Optimistic-based projects (such as Caldera), and Modular Blockchain solutions (like Celestia).

1. RaaS Introduction

1.1 Rollup: The Most Promising Scalability Solution

Layer 2 was originally intended to alleviate mainnet congestion and serve Dapps with lower costs and higher TPS without sacrificing security. A rollup moves costly transaction execution to L2 and posts batched transactions to L1 for verification, while ensuring the complete transaction content remains verifiable. Because it inherits Ethereum's security while delivering stronger overall performance, rollup has emerged from among the various Layer 2 designs as undoubtedly the most promising off-chain scaling solution today.
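To make that division of labor concrete, below is a minimal TypeScript sketch (using ethers.js v6) of a sequencer posting a compressed batch to L1 as calldata. The inbox address, `BATCH_INBOX_ABI`, and the `appendBatch` method are hypothetical placeholders, not any specific rollup's real interface.

```typescript
// Minimal sketch: a rollup sequencer posts a compressed transaction batch
// to L1 as calldata. Execution already happened off-chain on L2; L1 only
// receives the data needed so anyone can re-verify (or challenge) the batch.
import { ethers } from "ethers";

// Hypothetical inbox interface; real rollups each define their own.
const BATCH_INBOX_ABI = ["function appendBatch(bytes compressedTxs) external"];

async function postBatch(
  l1RpcUrl: string,
  inboxAddress: string,
  sequencerKey: string,
  compressedTxs: Uint8Array // L2 transactions, executed and then compressed
): Promise<string> {
  const provider = new ethers.JsonRpcProvider(l1RpcUrl);
  const sequencer = new ethers.Wallet(sequencerKey, provider);
  const inbox = new ethers.Contract(inboxAddress, BATCH_INBOX_ABI, sequencer);

  const tx = await inbox.appendBatch(compressedTxs);
  await tx.wait(); // batch data is now on L1 and inherits its security
  return tx.hash;
}
```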

1.2 Rollup-as-a-service: One form of App-specific chain

As some Dapps gradually grow and new kinds of applications expand, general-purpose rollups cannot fully meet these projects' requirements for user experience and cost structure. Applications with high traffic and high performance demands (such as AAA games centered on player interaction) need more customized scalability solutions.

App-specific chain is one of the best solutions for these Dapps.

The concept of an app chain is not new. Projects can customize a blockchain's design to their own application scenarios and needs, letting a Dapp enjoy the resources of a chain exclusively. Without breaking away from other ecosystems, they can achieve lower operating costs and higher performance, bringing a better user experience.

For example, Cosmos, based on Tendermint consensus, provides Dapps with a low-cost environment for building sovereign L1 public chains. At the same time, the IBC communication protocol lets different app chains exchange assets and information across chains more easily. You can refer to the IBC packet lifecycle in the official Cosmos documentation👇

[Figure: IBC packet lifecycle, from the official Cosmos documentation]
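To make the lifecycle concrete, here is the shape of an IBC packet as defined in the ICS-4 specification, written as an illustrative TypeScript type (a description of the data structure, not a client implementation):

```typescript
// IBC packet fields per ICS-4. A packet is committed on the source chain,
// relayed with a Merkle proof to the destination, and then either
// acknowledged back or refunded once its timeout passes.
interface IBCPacket {
  sequence: bigint;            // per-channel, monotonically increasing
  sourcePort: string;          // e.g. "transfer" for fungible tokens (ICS-20)
  sourceChannel: string;
  destinationPort: string;
  destinationChannel: string;
  data: Uint8Array;            // opaque payload, interpreted by the app module
  timeoutHeight: { revisionNumber: bigint; revisionHeight: bigint };
  timeoutTimestamp: bigint;    // in nanoseconds; refund is possible afterwards
}
```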

Talking about scalability without considering the ecosystem is meaningless.

The feasibility of the app chain approach rests on strong interoperability and ecosystem support. Cosmos, for example, gradually strengthens its ecosystem through sovereign L1 public chains and the cross-chain advantages IBC brings.

Building on this understanding, another way to realize an app-specific chain is a customized rollup, which satisfies a Dapp's pursuit of customized features, high performance, and low cost. RaaS built on an existing Layer 2 network can also make a project's interactions more convenient and contribute positively to the ecosystem's layout.

2. The Value of RaaS

The trend toward multiple chains and rollups in the crypto world seems inevitable, and the rapid emergence of RaaS projects has laid the foundation for new forms of Dapps. But against this consensus, I still want to ask a realistic question from the opposite direction:

It is indeed attractive to let anyone quickly launch a rollup, but beyond being directionally correct and cool, does it really create enough value for those who need it? This question can be divided into two parts:

Whether there are enough projects in the market that have sufficient motivation to use RaaS;

Whether RaaS has created considerable value for the project parties.

This question is essentially about demand and value: are there enough projects with real demand, and can RaaS deliver improvements attractive enough to win them over?

In terms of demand, as some Dapps continue to grow, project parties do indeed urgently need to seek:

Lower cost

Higher performance

Special functions

Cost

Referring to the data from L2fees, L2 rollups have already optimized costs dramatically, a huge improvement over the Ethereum mainnet. Looking at the test data for Caldera Chains, however, there is no qualitative change in cost; it is more like a 99-to-100 optimization. Meanwhile, EIP-4844 and danksharding will further reduce L2 rollup costs, narrowing the cost and efficiency gap RaaS can offer even further.

A solution that significantly reduces transaction fees would be attractive, but most RaaS cannot deliver that. Considering migration, overall ecosystem, interoperability, security, and other costs, do project teams really have enough motivation to use RaaS? For most conventional Dapps, or for users who are not that sensitive to performance and cost, general-purpose scaling is probably enough.
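A back-of-the-envelope model shows why rollup fees are already low and why the remaining headroom for RaaS is small. The 16-gas figure is Ethereum's post-EIP-2028 cost per non-zero calldata byte; every other number below is an illustrative assumption, not live market data:

```typescript
// Rough per-transaction fee model for a rollup that posts batches as L1
// calldata. Treats all bytes as non-zero (16 gas each) for simplicity.
const CALLDATA_GAS_PER_BYTE = 16; // EIP-2028 cost for non-zero bytes

function feePerTxEth(
  bytesPerTx: number,        // compressed size of one tx in the batch
  txsPerBatch: number,
  l1GasPriceGwei: number,
  batchOverheadGas = 21_000  // base cost of the batch-posting L1 tx
): number {
  const gasPerTx =
    bytesPerTx * CALLDATA_GAS_PER_BYTE + batchOverheadGas / txsPerBatch;
  return (gasPerTx * l1GasPriceGwei) / 1e9; // gwei -> ETH
}

// ~112 bytes per compressed tx, 1,000 txs per batch, 20 gwei:
console.log(feePerTxEth(112, 1_000, 20)); // ≈ 0.000036 ETH per tx
```

Once EIP-4844 lets rollups publish this data as blobs instead of calldata, the dominant data term shrinks by roughly an order of magnitude for all rollups alike, which is exactly why a RaaS pitch built purely on fees keeps getting weaker.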


Performance

L2 rollups can already deliver ultra-high TPS. Referring to Caldera's data, OP-based RaaS has almost no advantage in block time. ZK RaaS can offer more customized data storage and compression, but there is not much demand for such services yet. RaaS based on a Layer 2 network can indeed achieve faster speeds and lower costs by settling transactions on L2, improving user experience.

As mentioned above, facing an immature ecosystem and the additional costs of migration and development, do project teams still have enough motivation to use RaaS?


Customized Features

In terms of value creation, some RaaS can indeed provide features that general-purpose scaling currently implements only with difficulty or inefficiency. For example:

The first priority of current L2 ZK circuit design is compatibility: to serve all Dapps, the circuits sacrifice some efficiency and are not optimized for any specific Dapp. Here the value of RaaS is clear: customizing ZK circuit design for specific Dapps, or providing more efficient storage structures and data compression, to achieve higher performance;

Implementation of privacy features. Although ZK rollups are privacy-friendly, for decentralization and security reasons users' compressed transaction data must still be published to L1 as a history log so that everyone can verify it. Today's general-purpose rollups therefore cannot offer privacy. RaaS can implement customized privacy features on top of a rollup, or even a rollup of a rollup, creating value for projects with strong privacy needs.

Therefore, the current value of RaaS is customization > pure cost and efficiency (not excluding the cost and efficiency gains that customization itself brings).

To answer the initial question: Does RaaS really create enough value for people in need?

I think the current demand for RaaS is limited, and general-purpose scaling can meet more than 90% of needs. Although customized rollups have begun to play an irreplaceable role in some niche areas, they are not mainstream after all. The value created by RaaS is limited and needs to be further explored on the basis of considering the ecosystem, interoperability, and other comprehensive factors.

3. Exploring the Ultimate Form of RaaS

Since the emergence of L2 rollups, exploration of RaaS has never stopped, and a variety of Rollup-as-a-Service implementations have appeared on the market. The ecosystem map on Messari gives a rough picture of the different implementation paths. So the key questions are:

What solutions make sense?

What kind of RaaS will eventually win the market?

[Figure: RaaS ecosystem landscape, via Messari]

3.1 OP or ZK

The optimistic-versus-zero-knowledge debate has never stopped. Although ZK rollups theoretically offer stronger performance, much faster finality, and higher security than optimistic rollups, optimistic rollups have better compatibility and a lower barrier to entry.

In the existing RaaS projects, most projects are primarily based on optimistic rollups. I think the main reasons are:

Ecosystem always comes first. Optimistic-based RaaS has better compatibility, greatly lowering the threshold for projects to migrate or develop; this lets more projects deploy quickly, builds a more prosperous ecosystem faster, and captures the first-mover advantage.

Lower threshold, no dependence on computing power. Optimistic-based RaaS verifies transaction validity through fraud proofs, so the requirements on machine performance and hardware reserves are lower. This computing-power requirement is also what stops many RaaS providers from starting with ZK.

Easier to scale. The development threshold of optimistic-based RaaS is lower. ZK RaaS, which pursues performance and deeper underlying customization, requires providers to participate deeply in development; limited also by the computing power needed to generate ZKPs, ZK RaaS is hard to deploy at the same scale as optimistic RaaS.

Although optimistic rollups have a clear advantage in ecosystem building, ZK-based RaaS has its own obvious strengths.

True customization, better performance, and lower cost. In customized rollup design, ZK-based RaaS can bring a project greater value in features and performance, which general-purpose scaling struggles to match. It can be seen as a 0-to-1 change, whereas optimistic-based RaaS mostly makes 90-to-99 improvements in cost and efficiency.

Higher security. ZK-based RaaS can be trustless, while OP-based services require trusting that a challenger is operating normally to prevent the sequencer from acting maliciously.

Better interoperability and finality. OP-based RaaS must sit through a 7-day fraud-proof window, while ZK's trustlessness gives it much faster finality; the 7-day window also makes cross-rollup construction a challenge for OP-based RaaS. A toy comparison of the two finality models follows below.
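The finality gap is easy to see in a toy calculation. The 7-day wait is the standard optimistic challenge period; the one-hour ZK proving interval is purely an illustrative assumption:

```typescript
// Toy comparison of withdrawal finality. An optimistic withdrawal must
// outlast the fraud-proof window; a ZK withdrawal is final once the next
// validity proof for its batch is verified on L1.
const HOUR_MS = 60 * 60 * 1000;
const DAY_MS = 24 * HOUR_MS;

function withdrawalReadyAt(submittedAt: Date, kind: "optimistic" | "zk"): Date {
  const waitMs = kind === "optimistic"
    ? 7 * DAY_MS   // challenge period
    : 1 * HOUR_MS; // assumed interval until the next proof lands on L1
  return new Date(submittedAt.getTime() + waitMs);
}

const t0 = new Date("2023-06-01T00:00:00Z");
console.log(withdrawalReadyAt(t0, "optimistic").toISOString()); // 2023-06-08T00:00:00.000Z
console.log(withdrawalReadyAt(t0, "zk").toISOString());         // 2023-06-01T01:00:00.000Z
```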

Summary

In the short term, the ecosystem advantage of optimistic-based RaaS is unshakable, but from the perspective of long-term demand and value creation, I believe ZK-based RaaS is likely to win a larger market share in the future.

3.2 Layer 2 or Layer 3?

The most suitable implementation should be chosen according to each RaaS's use case and objectives. In my opinion, the biggest differences lie in cost and user experience (interoperability).

By positioning Layer 2 (L2) as the settlement layer and arranging RaaS as Layer 3 (L3), lower transaction costs and faster cross-rollup interactions can be achieved, thereby enhancing the overall user experience. Although L2 RaaS based on Ethereum has successfully inherited the security of the main network, its cross-chain cost and speed are far inferior to multi-layer network designs.

Therefore, L3 > L2

For more information about Layer 3, you can refer to an article I wrote previously:

Foresight Ventures: In-depth Explanation of Layer3

3.3 RaaS or L1 app chain: Balancing between Ecosystem and Cost

Cosmos and Polkadot were the first to propose app-specific chain solutions. So, between an app-specific chain and RaaS, which is better suited to providing customized services for dapps?


Interoperability

For L1 app chains: beyond the Cosmos ecosystem and its IBC communication protocol mentioned in the first section, applications can also launch parachains on Polkadot and exchange cross-chain messages via XCM. In practice, however, for reasons of security and cost, most projects only use the Tendermint or Substrate consensus engines to build customized L1 app chains and rarely use cross-chain communication. This leaves these ecosystems relatively isolated from one another, which to some extent contradicts my ultimate vision for app chains: different app chains should together form a prosperous, strongly interoperable ecosystem.

Architectures like StarkNet, which extend RaaS on top of a Layer 2 network, have a greater advantage in interoperability. Different dapps maintaining their own rollups can bridge at low cost, and because they settle on the Layer 2 network, speed and user experience are better. All of this interoperability, however, presupposes that the RaaS can build a sufficiently strong ecosystem.

Security

Depending on the design, RaaS whose data availability is based on Ethereum mostly inherits security equivalent to Ethereum L1, which is higher than the security and decentralization of L1 app chains. For RaaS built on a separate DA layer or a side chain, security is only as strong as those underlying networks.

Cost

For L1 app chains, transaction costs are denominated in the dapp project's own native token, enabling extremely low operating costs;

For RaaS, L2 RaaS has a relatively high cost because it needs to interact directly with the Ethereum mainnet, while L3 RaaS based on Polygon, StarkNet, etc., can settle on L2, hence, they have relatively lower costs.

4. RaaS Project Analysis: Who Will Win the RaaS Market

There are many RaaS projects currently being developed or already deployed, including but not limited to StarkNet L3, Opside, Caldera, Celestia, Dymension, Sovereign, Stackr, Eclipse, Altlayer, and Saga.


Below are some representative ones for analysis.

4.1 ZK Series

Including but not limited to Sovereign Labs, Fractal, StarkNet, Opside, ZKsync

StarkWare: Customized L3 Based on ZKRollup

The StarkWare team first proposed a multi-layer network design for Ethereum in the article “Fractal Scaling: From L2 to L3”. The point of introducing multi-layer networks is not just further scaling: it lets project owners control more chain resources by stacking customized rollups on top of general-purpose L2 scaling, delivering a user experience that L2 rollups alone cannot reach.

From a computational point of view, one ZKP can be generated to attest to a batch of ZKPs, proving their validity recursively; data, however, cannot be compressed and then compressed again. Because data availability must be ensured, so that anyone can verify a proof's validity, a rollup still needs to send the full or compressed transaction content to L1.
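The asymmetry can be made concrete with a toy model: recursive aggregation keeps verification work at a single proof, while the data each layer must publish only accumulates. The byte sizes below are illustrative constants, not benchmarks of any real proof system:

```typescript
// Recursion compresses *verification*, not *data availability*.
const PROOF_SIZE_BYTES = 1_000; // a succinct proof stays roughly constant

function aggregate(layers: { proofBytes: number; dataBytes: number }[]) {
  return {
    // one recursive proof attests to all N underlying proofs
    proofBytes: PROOF_SIZE_BYTES,
    // every layer's (compressed) transaction data must still reach L1
    dataBytes: layers.reduce((sum, l) => sum + l.dataBytes, 0),
  };
}

const tenL3s = Array.from({ length: 10 }, () => ({
  proofBytes: PROOF_SIZE_BYTES,
  dataBytes: 50_000,
}));
console.log(aggregate(tenL3s)); // { proofBytes: 1000, dataBytes: 500000 }
```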

Therefore, the application scenarios for StarkWare's app-specific chains must be ones that pursue high performance or specific features.

High performance: High-performance-demanding games can exclusively use ZK circuit resources to provide a better user experience;

Privacy: For some projects with privacy needs, privacy functions can be implemented on a customized basis on top of rollup or rollup of rollup;

Compatibility expansion: Providing an EVM-compatible environment, or even compatibility with more programming languages, brings positive value to the ecosystem itself;

Low cost: greatly reduce operating costs through Validium, at the price of a certain degree of decentralization and security.

An L3 built as a Validium on StarkNet can, in theory, reduce costs markedly, while interoperability remains guaranteed.

From the customization perspective, however, it can be inferred that such ZK-rollup-based app-specific chains, while providing considerable performance gains, also raise development costs and the barrier to participation for project teams. RaaS providers therefore need to be deeply involved in development, which limits the speed and scale of commercial expansion.

Opside: Another Three-Layer Network Structure Designed for App-specific Chains

Compared with the L2-rollup-based app-specific L3 designs of StarkWare and ZKsync, Opside proposes a three-layer network designed specifically for high-TPS applications, as shown in the diagram below: a sidechain based on PoS+PoW consensus serves as L2, and app-specific chains connect to it as L3.

[Figure: Opside's three-layer network design]

Opside exchanges data through its self-developed ZK-bridge; unlike traditional sidechain bridges, validity is proven with ZKPs rather than multisig, giving it higher security. Meanwhile, Opside integrates app-specific rollups into the consensus of the L2 sidechain through “native rollup”, that is, it incentivizes third parties at the consensus level to maintain rollups on the L2 sidechain.

Interoperability is crucial to RaaS, and Opside's native rollups share a world state tree and a global message queue. Interaction of assets and information between app-specific rollups is therefore efficient and cheap: a cross-rollup asset transfer only needs an L3 rollup contract to directly call a contract method on the target rollup, as sketched below. Compatibility and ecosystem development, however, remain challenges for ZK-based rollups.
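As a hedged sketch, such a call might look like the following from the client side (using ethers.js); the bridge address, ABI, and `transferToRollup` method are invented for illustration and are not Opside's actual API:

```typescript
// Because native rollups share a world state tree and a global message
// queue, the transfer below is a direct contract call on the local L3
// rather than a lock-and-mint hop through an external bridge.
import { ethers } from "ethers";

const TARGET_ROLLUP_ABI = [
  "function transferToRollup(uint256 targetRollupId, address to, uint256 amount) external",
];

async function crossRollupTransfer(rpcUrl: string, key: string) {
  const provider = new ethers.JsonRpcProvider(rpcUrl);
  const signer = new ethers.Wallet(key, provider);
  const bridge = new ethers.Contract(
    "0x0000000000000000000000000000000000000000", // placeholder address
    TARGET_ROLLUP_ABI,
    signer
  );
  await bridge.transferToRollup(42n, signer.address, ethers.parseEther("1"));
}
```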

The trade-off for ZK's trustlessness and faster finality is that the commercial scale of such RaaS is limited by computing power, since generating ZKPs requires hardware support; this is one reason most RaaS do not adopt ZK. In addition, using a sidechain as L2 poses a security challenge for RaaS providers.

4.2 Optimistic Series

Including but not limited to Caldera, Eclipse

Caldera: Maximizing User Experience Based on Op Stack

Caldera is an OP Stack-based RaaS that provides project teams with high-throughput, low-latency, customizable L2 rollups. Its current testnet lets anyone create an L2 rollup in a very short time, and the user experience is very smooth; you can try it out here: https://dashboard.caldera.xyz/


The OP Stack-based design gives Caldera a great advantage in compatibility. Full EVM compatibility, together with the team's user-experience optimizations, significantly lowers the barriers to migration and development. Caldera's RaaS is also not limited by underlying hardware computing power, enabling more project teams to deploy quickly and build a more prosperous ecosystem.

Referring to the architecture diagram in Caldera's official documentation, Caldera Chains can launch L2 rollups-as-a-service not only on Ethereum but on any EVM-compatible L1, ensuring transaction validity by sending fraud proofs to that L1. Caldera has also innovated at the data availability layer by decoupling it from the settlement layer: customized rollups can send transaction contents to Ethereum or to a dedicated DA layer such as EigenLayer or Celestia. This design further improves Caldera's scalability and transaction costs.
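The decoupling can be pictured as a deployment descriptor in which settlement and data availability are chosen independently. The schema below is invented for illustration and is not Caldera's actual configuration format:

```typescript
// Hypothetical rollup descriptor: settlement and DA picked separately.
type RollupConfig = {
  name: string;
  settlementLayer: "ethereum" | "polygon" | "other-evm-l1";
  dataAvailability: "settlement-calldata" | "celestia" | "eigenlayer";
  challengePeriodDays: number; // fraud-proof window for optimistic rollups
};

const gameChain: RollupConfig = {
  name: "my-game-rollup",
  settlementLayer: "ethereum",
  // Cheaper DA, but security for the data now rests on the chosen DA
  // layer rather than on Ethereum itself.
  dataAvailability: "celestia",
  challengePeriodDays: 7,
};
```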

Interoperability within Caldera's ecosystem is realized by an internal cross-chain bridge, which enables cross-chain assets and data by deploying contracts on the corresponding L1 and on the app-specific rollups. Caldera also provides a high-level JavaScript SDK to help developers add cross-chain functionality to customized rollups more efficiently.


Although Caldera has done a lot of work on interoperability and cross-chain bridges, optimistic rollups require a 7-day fraud-proof window, which makes interoperability among rollups a challenge. At the same time, optimistic RaaS cannot be trustless: one must trust that at least one challenger exists to prevent the sequencer from misbehaving.

Additionally, on customization, Caldera and other optimistic RaaS focus more on low cost and high TPS, and it is hard for them to bring as much value in functionality and performance as ZK-based RaaS. Current general-purpose rollups already achieve quite respectable block times, TPS, and transaction costs; the figures are not significantly different from RaaS offerings, making this an incremental optimization rather than a 0-to-1 improvement. It is therefore worth questioning whether the cost and throughput improvements of OP-based RaaS are what the market currently needs.

4.3 Modular Blockchain

Including but not limited to Celestia, Dymension

Celestia: Building Modular Blockchain Based on DA Layer

Celestia is essentially a data availability layer: a scalable blockchain hierarchy built on a DA layer with Tendermint consensus. Through rollmint (an implementation of the application blockchain interface, ABCI), dapps can build their own rollup and deploy it to Celestia, with data stored in the DA layer and the state root and proof uploaded to L1 for verification. Celestia scales the DA layer through data availability sampling (DAS): each light node in the network only samples and downloads a small portion of each block's data, so the more nodes there are, the more transactions each block can safely contain.
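The security intuition behind DAS can be sketched numerically. With a 2D Reed-Solomon extension, a block can only be made unrecoverable by withholding roughly a quarter of the extended data, so each uniformly random sample detects withholding with probability of about 0.25; the constants below follow that standard argument and are approximations, not Celestia's exact parameters:

```typescript
// Probability that a light node taking k random samples sees no problem
// even though the block is actually unrecoverable.
function missProbability(samples: number, withheldFraction = 0.25): number {
  return (1 - withheldFraction) ** samples;
}

for (const k of [10, 20, 30]) {
  console.log(k, missProbability(k).toExponential(2));
  // 10 -> ~5.6e-2, 20 -> ~3.2e-3, 30 -> ~1.8e-4
}
```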


This brings to mind the familiar Validium: a scalability solution that verifies computation results with ZK proofs but does not upload data to L1, relying instead on validators for data custody. Since the data lives off-chain rather than being published to Layer 1, Validium reduces gas costs; but because data availability depends on a third-party committee, Validiums are not widely used from a decentralization and security standpoint.

From an implementation perspective, each dapp in the ecosystem is essentially building its own Validium, maintaining its own sequencer and prover, with Celestia providing a unified data storage space. Like Validium, this lowers dapps' operating costs but sacrifices some decentralization and security: compared with solutions that inherit Ethereum's security, the security of dapp chains on Celestia relies on its nodes and the DA layer.

Additionally, Celestia does not yet support fraud proofs, so nodes must pessimistically re-execute all transactions to ensure their validity. And rollmint currently supports only a single sequencer, leaving much room for improvement in efficiency and decentralization.

However, as a DA layer, Celestia’s potential extends far beyond this. For instance, the optimistic RaaS solution Eclipse uses Celestia as its consensus and DA layer.

5. Conclusion and Outlook

RaaS can bring visible improvements in cost and performance, but judging from the performance data, these optimizations alone are not a strong draw; the greater value still has to come from customized features. Market demand is limited today, but as crypto develops, larger traffic will drive dapps' pursuit of low cost and high performance, and customized rollup services are clearly a viable answer.

To answer the question raised at the very beginning, what is my understanding of the ultimate form of RaaS? What kind of RaaS will capture the market?

From the product itself

The advantage of OP-based RaaS lies in rapidly building an ecosystem and forming barriers, but the marginal improvements in cost and efficiency alone are not enough to attract projects, so there is little long-term value there. ZK-based RaaS, on the other hand, can solve real pain points with customized features, but that demand is not yet mainstream.

The design of a multi-layer network structure enables L3 RaaS to have lower costs and stronger interoperability. Moreover, strong interoperability is the foundation for building a thriving RaaS ecosystem. Therefore, a ZK-based multi-layer network design can combine the advantages of customization and low cost, and we can see its longer-term value.

I believe that in the long run, ZK-based multi-layer network RaaS will become the market’s ultimate choice.

Market and Demand

A RaaS with sufficient scalability can meet the needs of all projects for customized rollups while ensuring performance. At the same time, the real rise of RaaS heavily depends on the construction of the ecosystem. Therefore, a pattern where multiple RaaS coexist clearly doesn’t make sense.

I believe that the endgame will definitely be one or very few RaaS dominating the entire market.

Reference

https://ethresear.ch/t/rollup-as-a-service-opportunities-and-challenges/13051

https://ibcprotocol.org/

https://messari.io/report/the-rollups-as-a-service-ecosystem

About Foresight Ventures

Foresight Ventures is dedicated to backing the disruptive innovation of blockchain for the next few decades. We manage multiple funds: a VC fund, an actively-managed secondary fund, a multi-strategy FOF, and a private market secondary fund, with AUM exceeding $400 million. Foresight Ventures adheres to the belief of “Unique, Independent, Aggressive, Long-Term mindset” and provides extensive support for portfolio companies within a growing ecosystem. Our team is composed of veterans from top financial and technology companies like Sequoia Capital, CICC, Google, Bitmain, and many others.

Website: https://www.foresightventures.com/

Disclaimer: All articles by Foresight Ventures are not intended to be investment advice. Individuals should assess their own risk tolerance and make investment decisions prudently.
Foresight Ventures: Blockchain and DApp Storage

Maggie@Foresight Ventures

Key Insights

To achieve full decentralization in Web3 applications, we need technological advancements in four areas: data availability (blockchain scalability), decentralized file systems, decentralized databases, and decentralized computing.

Data retrieval speed, incentive model & tokenomics, and the guarantee algorithm for data availability are key factors that determine whether a file/database storage protocol will be widely used or not.

The main focus for improving decentralized file systems and database protocols will be on reducing retrieval times.

The data availability layer is a promising and important method for scaling blockchain. Celestia's technology still needs market validation, and ETH and Celestia may converge technically in the future.

App Architectures of Both Web2 and Web3 Applications

Compared to Web2 applications, which often consist of a frontend, a backend, and a data layer with a database and file system, Web3 DApps can be simpler: they only need a frontend and a smart contract that serves as both backend and database.

But since these DApps lack a file system to store files, their frontend pages, pictures, and other files are still hosted on centralized servers. To achieve full decentralization, developers are now using decentralized file systems to store the files DApps require, including frontend pages, NFT metadata, and images.

To improve structured data storage and backend computing capabilities, we use data availability technology to scale blockchain. Additionally, two types of products have emerged: decentralized databases and decentralized computing. With blockchain, developers can store financial data and other critical information related to DApps. Decentralized databases, in turn, can store structured data such as NFT metadata, DAO voting data, DEX order books, social data, and so on, while decentralized computing helps scale the backend.

Overall, to build fully decentralized, flexible, and rich Web3 DApps, four types of products and technological advancements are necessary:

Decentralized file system: store DApp frontend web pages, NFT pictures, videos, and other files.

Decentralized database: store structured data like NFT metadata, DAO votes, and DEX order books.

Data availability: scale blockchain and store financial and important data for DApps.

Decentralized computing tools: scale the backend of DApps.

1. Decentralized File System

Decentralized file storage serves as a substitute for centralized storage, enabling serverless DApps. DApps' demand for decentralized file systems is growing, and they will be a vital component of the Web3 technology stack. Compared to centralized storage, the main advantages are the removal of trusted third parties, increased redundancy, elimination of single-point-of-failure risks, and lower costs.

According to Messari's statistics, the market cap of the top 4 decentralized file storage protocols was nearly $1.6 billion, down 83% from $9.4 billion; total storage capacity exceeded 17 million terabytes (TB), up 2% YoY, with 532,500 TB of used storage, up 1,280% YoY.

Let's take a look at the current situation of several popular decentralized storage projects. Storing data with any of these decentralized storage protocols is significantly cheaper than AWS.
While AWS charges around $23/TB/month, these decentralized storage protocols range from $0.0002 to $20/TB/month.

IPFS: IPFS is currently the most widely used protocol for storing images and metadata for NFTs, and it is great for frequently accessed, “hot” data. However, IPFS has no built-in way to incentivize storage, prove data is stored correctly, or establish agreement among participants as blockchains do. This means there is a risk of losing data stored only on IPFS; for instance, Infura's IPFS service deletes data that hasn't been accessed in six months. If you want your data to stay available long-term, it's best to run your own IPFS node.

Filecoin: Filecoin offers low storage costs and is mainly used for “cold” data, such as archives. Filecoin has no built-in charging mechanism for data retrieval, and some miners accept low-quality data to earn rewards while refusing to facilitate retrieval. The Filecoin community is actively addressing this issue and taking measures to improve the overall quality of stored data.

Arweave: Arweave's idea of permanent storage is welcome for storing DApp data, and its ecosystem is developing well: there are decentralized database systems using Arweave to store database files, as well as second-layer scalability solutions built on it. In Arweave, pricing does not account for bandwidth, so some nodes only provide storage, not retrieval.

Swarm: Bandwidth fees are charged for both storage and retrieval in Swarm. The system is highly decentralized and places high bandwidth requirements on nodes.

StorJ: StorJ differs from the other protocols: it is partially decentralized, has good retrieval speed, and has proven effective for sharing large video files.

Sia: Skynet Labs closed due to a lack of new funding, which also led to a decline in Sia's usage.

We primarily evaluate the usability of a decentralized file storage protocol on three factors:

Data retrieval speed. This is crucially important because it determines how efficiently a storage system responds to DApp requests, directly affecting user experience. Factors that affect retrieval speed include whether data queries carry a fee, the degree of node decentralization, node quality, data forwarding logic, and accelerating facilities such as CDNs.

Incentive model and tokenomics. Incentive models and token economics shape storage nodes' participation and behavior. The mainstream pricing model today is storage fees plus bandwidth fees: users pay a storage fee when storing data and a bandwidth fee when accessing it. If data queries are free, nodes often lack the motivation to serve them. Incentives and tokenomics also affect miners' earnings, which in turn affect the number of nodes and the storage capacity of the service.

Data availability guarantee algorithm. This is the algorithm a decentralized network uses to ensure data remains continuously available and nodes provide proper service. The most widely used method today is Proof of Random Access.

Overall, we believe that:

The products and services that leverage decentralized storage protocols are still in their early stages.

The main focus for improving storage protocols will be on reducing retrieval times.
Data retrieval speed, incentive model & tokenomics, and the guarantee algorithm for data availability are key factors that determine whether a protocol will be widely used or not.

2. Decentralized Database

Databases are widely used in applications, and decentralized databases are a crucial technology for achieving full decentralization in DApps. They can replace centralized databases to store the structured hot data DApps require, such as NFT metadata, DAO voting, DEX order books, and social media data.

There are many decentralized database projects, and several innovative ones have emerged in the past two years.

Ceramic: Ceramic started in 2019. Data is stored and managed in units of streams, with formatted event logs appended to each stream; the log is made into a file and uploaded to IPFS. It provides GraphQL API queries. Like IPFS, Ceramic has no incentive model, and it supports data creation, reading, and updating (CRU).

OrbitDB: OrbitDB is an earlier project than Ceramic that also uses the IPFS file system for file storage. It supports storing both NoSQL databases and files.

Tableland: The project started in 2022 and is currently in public testing; the production version is slated for 2023. Writing data requires smart contracts, which define SQL statements and set usage permissions. Reads are performed off-chain and are free. The contract has currently been deployed on Ethereum and on L2s such as OP.

Polybase: Now live on the test network. It is a NoSQL database supporting CRUD operations, with each operation incurring fees. Polybase supports various file systems for storing database files, including local disk, IPFS, Filecoin, Polystore, and even AWS S3. It also uses payment channels for query payments, reducing the frequency of on-chain transactions and avoiding query delays caused by payment.

Web3Q: Also known as EthStorage. The project started in 2022 and the testnet is live. It proposed a new URL pattern, the web3:// access protocol, for accessing data.

Kwill: Kwill is a SQL database system based on Arweave, using smart contracts for payment.

KYVE: KYVE is a database system based on Arweave.

From a technical perspective:

Both SQL and NoSQL can serve as the database. SQL's data structures demand high consistency and offer stronger support for joint queries, making it more mature and efficient; NoSQL's KV format better fits Ethereum's design patterns, supports rich data types, and is flexible and easily scalable.

In terms of functionality, the best option is to support full CRUD, but supporting update and delete (UD) adds complexity to the system. If the system uses local storage, historical value queries may not be supported; if it uses IPFS or Arweave as the file system, the database needs to be append-only, otherwise multiple versions of the same data will multiply storage costs.

When choosing an underlying file system, there are two options: 1) store database files in decentralized file systems such as IPFS and Arweave; 2) store them locally on nodes or in S3 cloud storage. If a decentralized database project needs customized retrieval logic or optimization, local storage or S3 is the more flexible approach.

Overall, we believe that:

The field of decentralized databases is highly worth paying attention to, with urgent demand, but a widely accepted and used product has not yet emerged.
The maturity of decentralized databases is lower than that of decentralized file storage systems. Decentralized database tech is based on the decentralized file system and many projects are started in 2022. The main focus for improving storageData retrieval speed, incentive model & tokenomics, and the guarantee algorithm for data availability are key factors that determine whether a protocol will be widely used or not. protocols will be on reducing retrieval times. ï»ż ï»ż 3. Data Availability The concept of data availability is distinguished from decentralized file systems and databases, as elucidated on the websites of Ethereum and Celestia. Ethereum: Data availability is the guarantee that the block proposer published all transaction data for a block and that the transaction data is available to other network participants. Celestia: Data availability is concerned with whether the data published in the latest block is available. While, decentralized file systems and databases mainly ensure that the data stored by users is available, but do not specifically address transactional data. ï»ż ï»ż Currently, there are several data availability projects including: Ethereum. ETH serves as the DA (data availability) layer for Layer 2 Rollup. Celestia. Celestia is a specifically designed DA layer that only handles data availability and does not execute transactions. It sparked a trend of modular blockchains in 2022. EigenDA and other DA products. Ensuring data availability through committees. ï»ż Ethereum ETH Layer 2 creates and submits batches of transactions to the Ethereum network, and stores the data in an Ethereum smart contract on Layer 1. This ensures the guaranteed availability of L2 transaction data through the ETH network. Although rollups can extend the throughput of ETH through off-chain computation, their capacity is limited by the L1 ETH blockchain data throughput. Therefore, Ethereum needs to increase its data storage and processing capabilities. ï»ż To scale up Ethereum’s DA capacity, Danksharding has been included in ETH’s roadmap and is considered one of the most important and urgent updates currently. Danksharding is a sharding design, data availability is delegated to each shard, and each validator only needs to run a full node for its own shard, while running other shards with light client capacity. Proto-danksharding (EIP-4844) is a preliminary implementation of Dankshading, which is expected to be implemented in the second half of 2023. It introduces a data blob stored off-chain that is mounted on ETH via transactions, as well as precompiled code for validating Blob. Each blob is approximately 125kB in size, while a block is only 90kB. Currently, at most eight blobs can be mounted per block, resulting in additional storage of 1MB. In Proto-danksharding, the data has not been sharded, and validators still need to download and directly verify the availability of all Blob data. After the implementation of EIP4844, Blob can store 10 times more data than Calldata with the same gas consumption. The data of Rollup can be stored in Blob in the future, reducing transaction fees by an order of magnitude. Once fully implemented, Danksharding will become even cheaper. In summary, Danksharding can improve Ethereum’s data storage capacity, reduce the cost of ETH used as DA, and become a more powerful DA layer. ï»ż Celestia Celestia is a minimal blockchain that only orders and publishes transactions and does not execute them. 
By decoupling the consensus and application execution layers, Celestia modularizes the blockchain technology stack and unlocks new possibilities for decentralized application builders. Celestia is responsible for the DA layer, while ETH handles consensus and settlement, and the application chain is responsible for execution. Celestia is responsible for both the DA layer and the consensus layer, while settlement and execution are handled by the application chain. Alternatively, settlement can use Cevmos, with execution still being the responsibility of the application chain. ï»ż Celestia integrates a 2-dimensional Reed-Solomon encoding scheme and has designed a random sampling scheme to verify the availability of data and recover it, similar to the validation method used by ETH. And Celestia also has significant differences from ETH. Celestia focuses on the DA layer and consensus layer, while ETH also served as a settlement layer for Rollups Celestia does not have a Turing complete smart contract virtual machine, therefore, it does not support smart contracts. Celestia’s sovereign rollup can fork into multiple chains, while ETH’s Rollup cannot. Celestia doesn’t have smart contracts, bridges with sovereign rollups would mainly facilitate the movement of the DA layer token. ï»ż The ecosystem of Celestia is growing fast. ï»ż Off-chain DA Off-chain DA mainly include Data Availability Committees (DACs) are trusted parties that provide, or attest to, data availability. DACs are also used by some validiums. Proof-of-stake Data Availability Committees are considerably more secure than regular DACs because they directly incentivize honest behavior. Here, anyone can become a validator and store data off-chain. However, they must provide a “bond”, which is deposited in a smart contract. ï»ż The overview of data availability products. ETH: ETH currently serves as the data availability layer for L2 optimistic rollups and zk rollups. The adoption of EIP4844 (Proto-Danksharding) will provide additional benefits to L2. Although the storage capacity of ETH may not be as large as Celestia’s, it will become comparable once Danksharding is fully implemented. Celestia: Celestia is designed to function as a consensus and data availability layer. The Celestia testnet went online in June 2022 and its innovative modular design has made it increasingly popular since 2022. Celestia needs to establish its own ecosystem and exist in a competitive relationship with Ethereum. Many projects are built on Celestia. Avail: Avail was originally launched by Polygon in June of 2022. However, following the departure of its founder from Polygon, Avail has become an independent modular blockchain project and a testnet has been released. Avail is a standalone consensus and DA layer like Celestia. The Avail mainnet was planned to be bridged to Polygon and use MATIC as the base currency. Compared to Celestia tokens, MATIC is a more mature token. EigenDA: EigenDA is an Ethereum-based DA layer that incentivizes validators to maintain the network through ETH re-staking, eliminating the need for a startup burden like that required by Celestia. Other off-chain DA: Validium uses off-chain storage for data availability, Ethereum for consensus and settlement, and Validium rollup for execution. Validium may be phased out as Celestia and Danksharding gain widespread adoption. ï»ż In conclusion, we think, A data availability layer is a promising and important approach to scaling blockchains. 
The current DA products have their own advantages, and they all deserve continuous attention. Celestia’s technology still needs to be verified by the market, and ETH and Celestia may also converge technically in the future. ï»ż 4. Decentralized Computation Although we have observed a few decentralized computing projects, we believe that the development of decentralized computing is still in its nascent stages. One of the major challenges faced in this area is verifying the accuracy of computations. ï»ż More Explain Fully decentralization is not always necessary. Currently, there are three main types of DApp architectures available. Centralized services can be beneficial in situations requiring high performance and involving arbitrary complex computations. ï»ż It appears that some individuals may not have a complete understanding of the differences between the consensus layer and the settlement layer. To clarify, I will elaborate on the four functions in the blockchain using Ethereum’s ZK Rollup as an example. After transactions occur on Layer 2, they are submitted to the Sequencer who batches and rolls them up before submitting them to the smart contract on the ETH blockchain. As the rollup is added to the ETH chain, consensus on the order of transactions is confirmed and ETH becomes the consensus layer of the Rollup. As Layer 2 transactions are stored on the ETH blockchain, ETH also serves as the DA (Data Availability) layer for Layer 2. Layer 2 nodes perform transaction execution, alter the global state of Layer 2, and generate zero-knowledge proofs. Layer 2 serves as the execution layer. Layer 2 submits the ZKP to ETH, where the ETH contract verifies its validity. Once the proof is accepted, the new state of Layer 2 is confirmed. ETH serves as the settlement layer for Layer 2 zk rollup. ï»ż There are other types of data-related projects, such as: Projects that focus on indexing on-chain data, such as The Graph and Space and Time, or indexing IPFS data, such as Filecoin Indexer. DNS networks, including LivePeer, Meson Network, Media.network, and others. Storage node reputation markets like Filgram, Filrep, and Cidgravity, with UI/UX examples such as Web3.storage and NFT.storage. About Foresight Ventures Foresight Ventures is dedicated to backing the disruptive innovation of blockchain for the next few decades. We manage multiple funds: a VC fund, an actively-managed secondary fund, a multi-strategy FOF, and a private market secondary fund, with AUM exceeding $400 million. Foresight Ventures adheres to the belief of “Unique, Independent, Aggressive, Long-Term mindset” and provides extensive support for portfolio companies within a growing ecosystem. Our team is composed of veterans from top financial and technology companies like Sequoia Capital, CICC, Google, Bitmain, and many others. Website: https://www.foresightventures.com/ Twitter: https://twitter.com/ForesightVen Medium: https://foresightventures.medium.com Substack: https://foresightventures.substack.com Discord: https://discord.com/invite/maEG3hRdE3 Linktree: https://linktr.ee/foresightventures Disclaimer: All articles by Foresight Ventures are not intended to be investment advice. Individuals should assess their own risk tolerance and make investment decisions prudently.

Foresight Ventures: Blockchain and DApp Storage

Author: Maggie@Foresight Ventures


Key Insights

To achieve full decentralization in Web3 applications, we need technological advancements in four areas: data availability (blockchain scalability), decentralized file systems, decentralized databases, and decentralized computing.

Data retrieval speed, incentive model & tokenomics, and the guarantee algorithm for data availability are key factors that determine whether a file/database storage protocol will be widely used or not.

The main focus for improving decentralized file systems and database protocols will be on reducing retrieval times.

The data availability layer is a promising and important method for scaling blockchains. Celestia's technology still needs market validation, and ETH and Celestia may converge technically in the future.

App Architectures of Web2 and Web3 Applications

Compared to Web2 applications that often consist of a frontend, backend, and data layer with a database and file system, Web3 DApps can be simpler as they only need a frontend and a smart contract that serves as both the backend and database.
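To make the contrast concrete, here is a minimal sketch of that pattern using ethers.js; the RPC URL, contract address, and ABI are hypothetical placeholders, not a real deployment:

```typescript
import { ethers } from "ethers";

// Hypothetical on-chain "backend": a contract that stores user profiles.
const PROFILE_ABI = ["function getProfile(address user) view returns (string)"];

async function main() {
  // Any public RPC endpoint works; this URL is a placeholder.
  const provider = new ethers.JsonRpcProvider("https://rpc.example.org");
  const profiles = new ethers.Contract(
    "0x0000000000000000000000000000000000000000", // placeholder address
    PROFILE_ABI,
    provider
  );

  // The chain serves the read directly: no backend server, no database.
  console.log(await profiles.getProfile("0x0000000000000000000000000000000000000001"));
}

main().catch(console.error);
```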


However, because these DApps lack a file system for storing files, their frontend pages, images, and other assets are still hosted on centralized servers. To achieve full decentralization, developers are now using decentralized file systems to store the files DApps require, including frontend pages, NFT metadata, and images.


To improve structured data storage and backend computing capabilities, we utilize data availability technology to scale blockchain. Additionally, two types of products have emerged: decentralized databases and decentralized computing.

By utilizing blockchain, developers can store financial data and other critical information related to DApps. On the other hand, decentralized databases can be utilized for storing structured data such as NFT metadata, DAO voting data, DEX order books, social data, and so on. Additionally, decentralized computing can help in scaling the backend.


Overall, to build fully decentralized, flexible, and rich Web3 DApps, four types of products and technological advancements are necessary.

Decentralized file system: Store DApp frontend web pages, NFT images, videos, and other files.

Decentralized database: Store structured data like NFT metadata, DAO votes, and DEX order book.

Data Availability: Scale blockchain and store financial and important data for DApps.

Decentralized Computing Tools: Scale the backend of DApps.


1. Decentralized File System

Decentralized file storage serves as a substitute for centralized storage, enabling serverless DApps. DApps' demand for decentralized file systems is growing, and these systems will be a vital component of the Web3 technology stack.

Compared to centralized storage, the main advantages of decentralized storage are the removal of trusted third parties, increased redundancy, elimination of single-point-of-failure risk, and lower costs.


According to Messari's statistics, the combined market cap of the top 4 decentralized file storage protocols was nearly $1.6 billion, down 83% from $9.4 billion. Total storage capacity exceeds 17 million terabytes (TB), up 2% YoY, while used storage stands at 532,500 TB, up 1,280% YoY.


Let's take a look at the current state of several popular decentralized storage projects. Storing data with any of these protocols is significantly cheaper than AWS: while AWS charges around $23/TB/month, the decentralized storage protocols range from $0.0002 to $20/TB/month.

IPFS: IPFS is currently the most widely used protocol for storing images and metadata for NFTs. It's great for storing frequently accessed or “hot” data. However, IPFS doesn't have built-in ways to incentivize storage, prove data is stored correctly, or establish agreement among participants the way blockchains do. This means there's a risk of losing data if it's only stored on IPFS; for instance, Infura's IPFS service deletes data that hasn't been accessed in six months. So if you want to keep your data available long-term, it's best to run your own IPFS node (a minimal pinning sketch follows this list).

Filecoin: Filecoin provides low storage costs and is mainly used for storing “cold” data, such as archives. Because Filecoin has no built-in charging mechanism for data retrieval, some miners accept low-quality data to earn rewards while refusing to serve retrieval requests. The Filecoin community is actively addressing this issue and implementing measures to improve the overall quality of stored data.

Arweave: Arweave's permanent-storage model is welcomed for storing DApp data, and the ecosystem is developing well: there are decentralized database systems that use Arweave to store database files, as well as second-layer scalability solutions built on Arweave. However, Arweave's pricing does not account for bandwidth, so some nodes only provide storage and not retrieval.

Swarm: Swarm charges bandwidth fees for both storage and retrieval. The system is highly decentralized and places high bandwidth requirements on nodes.

StorJ: StorJ differs from the other protocols: it is only partially decentralized, but it offers good retrieval speed and has proven effective for sharing large video files.

Sia: Skynet Labs, the main company building on Sia, closed due to a lack of new funding, which has also led to a decline in Sia's usage.
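Returning to the IPFS persistence caveat above: if you run your own node, pinning is a one-call affair. A minimal sketch using the ipfs-http-client package against a local Kubo node (the metadata content here is purely illustrative):

```typescript
import { create } from "ipfs-http-client";

async function pinForever() {
  // Connect to a self-hosted IPFS node (default Kubo RPC endpoint).
  const ipfs = create({ url: "http://127.0.0.1:5001/api/v0" });

  // Add the content; the returned CID is its content address.
  const { cid } = await ipfs.add(
    JSON.stringify({ name: "My NFT", image: "ipfs://<image-cid>" })
  );

  // Pin it so the node's garbage collector never drops it.
  await ipfs.pin.add(cid);
  console.log(`pinned ${cid.toString()}`);
}

pinForever().catch(console.error);
```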


We primarily evaluate the usability of a decentralized file storage protocol based on three factors:

Data retrieval speed. This is critical because it determines how efficiently a storage system responds to requests from DApps, and it directly affects their user experience. Factors that affect retrieval speed include: whether there is a fee for data queries, the degree of node decentralization, node quality, data-forwarding logic, and facilities such as CDNs for accelerating queries.

Incentive model and tokenomics. Incentives and token economics shape the participation and behavior of storage nodes. The mainstream pricing model today is storage fees plus bandwidth fees: users pay a storage fee when storing data and a bandwidth fee when accessing it (a toy cost calculation follows this list). If data queries are free, nodes often lack the motivation to serve them. Incentives also determine miners' earnings, which in turn affect the number of nodes and the storage capacity of the service.

Data availability guarantee algorithm. This is the mechanism a decentralized network uses to ensure that data remains continuously available and that nodes provide proper service. Currently, the most widely used method is Proof of Random Access.
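As promised above, here is a toy illustration of the storage-plus-bandwidth pricing model; the rates are invented purely to show the shape of the calculation, not quoted from any protocol:

```typescript
// Hypothetical rates: $4 per TB-month stored, $8 per TB retrieved.
const STORAGE_USD_PER_TB_MONTH = 4;
const BANDWIDTH_USD_PER_TB = 8;

function monthlyCostUsd(storedTb: number, retrievedTb: number): number {
  return storedTb * STORAGE_USD_PER_TB_MONTH + retrievedTb * BANDWIDTH_USD_PER_TB;
}

// A DApp storing 2 TB of NFT media whose users download 10 TB per month:
console.log(monthlyCostUsd(2, 10)); // 2*4 + 10*8 = 88 USD
```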


Overall, we believe that,

The products and services that leverage decentralized storage protocols are still in their early stages.

The main focus for improving storage protocols will be on reducing retrieval times.

Data retrieval speed, incentive model & tokenomics, and the guarantee algorithm for data availability are key factors that determine whether a protocol will be widely used or not.


2. Decentralized Database

Databases are widely used in applications, and decentralized databases are a crucial technology for achieving full decentralization in DApps.

Decentralized databases can replace centralized databases to store structured hot data that DApps require, such as NFT metadata, DAO voting, DEX order books, social media data, etc.


There are many decentralized database projects, and several innovative ones have emerged in the past two years.

Ceramic: Ceramic started in 2019. Data is stored and managed in units of streams, to which formatted event logs are appended; the logs are packaged into files and uploaded to IPFS. It provides GraphQL API queries. Like IPFS, Ceramic has no incentive model, and it supports data creation, reading, and updating (CRU).

OrbitDB: OrbitDB is an earlier project than Ceramic that also uses the IPFS file system for storage. It supports storing both NoSQL databases and files.

Tableland: The project started in 2022 and is currently in public testing, with the production version expected in 2023. Writing data requires smart contracts, which define the SQL statements and set usage permissions; reads are performed off-chain and are free (see the sketch after this list). The contracts have been deployed on Ethereum and on L2s such as Optimism.

Polybase: The project is now live on the test network. It is a NoSQL database that supports CRUD operations, with each operation incurring fees. Additionally, Polybase offers support for various file systems to store database files, including local disk, IPFS, Filecoin, Polystore, and even AWS S3. Polybase also utilizes payment channels for data query payments, reducing the frequency of on-chain transactions and avoiding query delays caused by payments.

Web3Q: Also known as EthStorage. The project started in 2022 and its testnet is live. It proposes a new URL-style web3:// access protocol for reading data.

Kwill: Kwill is a SQL database system based on Arweave, using smart contracts for payment.

KYVE: KYVE is a database system based on Arweave.
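As a taste of the smart-contract-gated SQL model mentioned under Tableland above, here is a rough sketch based on Tableland's JavaScript SDK quickstart; exact method names and options may differ across SDK versions, and the wallet/signer setup is omitted:

```typescript
import { Database } from "@tableland/sdk";

async function demo() {
  // A signer (e.g., a browser wallet) is required for writes; omitted here.
  const db = new Database(/* { signer } */);

  // CREATE TABLE goes through the Tableland registry contract;
  // the returned name is suffixed with the chain id and table id.
  const { meta: create } = await db
    .prepare("CREATE TABLE todos (id integer primary key, task text)")
    .run();
  await create.txn?.wait();
  const tableName = create.txn?.name ?? "";

  // Writes are on-chain transactions...
  await db.prepare(`INSERT INTO ${tableName} (task) VALUES (?)`).bind("ship it").run();

  // ...but reads are served off-chain and are free.
  const { results } = await db.prepare(`SELECT * FROM ${tableName}`).all();
  console.log(results);
}

demo().catch(console.error);
```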

From a technical perspective:

Both SQL and NoSQL can serve as the database model. SQL's data structures demand high consistency and offer stronger support for join queries, making it more mature and efficient. NoSQL's key-value format is a better fit for Ethereum's design patterns, supports rich data types, and is flexible and easily scalable.

In terms of functionality, the best option is to support full CRUD, but supporting updates and deletes (the U and D) adds complexity to the system. If the system uses local storage, historical-value queries may not be supported. If IPFS or Arweave is used as the file system, the database needs to be append-only; otherwise, multiple versions of the same data will exist, multiplying storage costs.
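A toy illustration of the append-only pattern (the naming is our own, not any project's API): each update writes a new immutable file, and the "current" value is simply the newest entry in a version log.

```typescript
// Append-only version log for a single record. Updates never overwrite:
// every new value becomes a new immutable file with its own CID.
type Version = { cid: string; timestamp: number };

const log: Version[] = [];

function update(newCid: string): void {
  log.push({ cid: newCid, timestamp: Date.now() });
}

function latest(): Version | undefined {
  return log[log.length - 1]; // older versions stay queryable as history
}

update("bafy...v1"); // placeholder CIDs
update("bafy...v2");
console.log(latest()?.cid); // "bafy...v2"
```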

When choosing an underlying file system, there are two options: 1) Store database files in decentralized file systems such as IPFS and Arweave; 2) Store them locally on nodes or in the S3 cloud. If a decentralized database project requires customized retrieval logic or optimization, using local storage or S3 is a more flexible approach.


Overall, we believe that,

The field of decentralized databases deserves close attention: demand is urgent, yet a widely accepted and widely used product has not emerged.

The maturity of decentralized databases is lower than that of decentralized file storage systems: decentralized database technology builds on decentralized file systems, and many projects only started in 2022.

The main focus for improving storage protocols will be on reducing retrieval times.


3. Data Availability

The concept of data availability is distinct from decentralized file systems and databases, as explained on the websites of Ethereum and Celestia.

Ethereum: Data availability is the guarantee that the block proposer published all transaction data for a block and that the transaction data is available to other network participants.

Celestia: Data availability is concerned with whether the data published in the latest block is available.

Decentralized file systems and databases, by contrast, mainly ensure that the data users store remains available; they do not specifically address transaction data.


Currently, there are several data availability projects including:

Ethereum. ETH serves as the DA (data availability) layer for Layer 2 Rollup.

Celestia. Celestia is a specifically designed DA layer that only handles data availability and does not execute transactions. It sparked a trend of modular blockchains in 2022.

EigenDA and other off-chain DA products, which ensure data availability through committees.


Ethereum

ETH Layer 2 creates and submits batches of transactions to the Ethereum network, storing the data in an Ethereum smart contract on Layer 1. This guarantees the availability of L2 transaction data through the ETH network.

Although rollups can extend the throughput of ETH through off-chain computation, their capacity is limited by the L1 ETH blockchain data throughput. Therefore, Ethereum needs to increase its data storage and processing capabilities.


To scale up Ethereum’s DA capacity, Danksharding has been included in ETH’s roadmap and is considered one of the most important and urgent updates currently.

Danksharding is a sharding design in which data availability is delegated to each shard: each validator only needs to run a full node for its own shard, while following the other shards with light-client capacity.

Proto-danksharding (EIP-4844) is a preliminary implementation of Danksharding, expected to go live in the second half of 2023. It introduces data blobs, stored outside the execution layer and mounted on ETH via transactions, along with precompiles for validating blobs. Each blob is approximately 125 kB, while an average block today is only about 90 kB. Currently, at most eight blobs can be mounted per block, adding roughly 1 MB of extra storage. In Proto-danksharding the data is not yet sharded, and validators still need to download and directly verify the availability of all blob data. After EIP-4844, a blob can store roughly 10x more data than calldata for the same gas consumption, so storing rollup data in blobs should reduce transaction fees by an order of magnitude; once Danksharding is fully implemented, it will become cheaper still.
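A rough gas comparison shows why blobs matter. Posting data as calldata costs 16 gas per non-zero byte on the execution layer, while each blob consumes 2^17 units of blob gas priced in its own fee market. The sketch below is simplified (real calldata pricing distinguishes zero from non-zero bytes, and blob gas prices float with demand):

```typescript
const BLOB_BYTES = 125_000;        // ~125 kB of usable data per blob
const CALLDATA_GAS_PER_BYTE = 16;  // cost of a non-zero calldata byte

// Posting one blob's worth of rollup data as calldata:
const calldataGas = BLOB_BYTES * CALLDATA_GAS_PER_BYTE; // 2,000,000 gas

// The same data as a blob consumes blob gas in a separate fee market,
// typically far cheaper than calldata under equal demand.
const blobGas = 2 ** 17; // 131,072 blob-gas units per blob

console.log({ calldataGas, blobGas });
```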

In summary, Danksharding can improve Ethereum’s data storage capacity, reduce the cost of ETH used as DA, and become a more powerful DA layer.


Celestia

Celestia is a minimal blockchain that only orders and publishes transactions and does not execute them. By decoupling the consensus and application execution layers, Celestia modularizes the blockchain technology stack and unlocks new possibilities for decentralized application builders.

Celestia can slot into the modular stack in several configurations. In one, Celestia provides only the DA layer, while ETH handles consensus and settlement and the application chain handles execution. In another, Celestia provides both the DA layer and the consensus layer, with settlement and execution handled by the application chain; alternatively, settlement can be delegated to Cevmos, with execution remaining the application chain's responsibility.


Celestia combines a 2-dimensional Reed-Solomon encoding scheme with a random sampling scheme to verify that data is available and to recover it, similar to the validation approach Ethereum plans to adopt.
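A little arithmetic shows why sampling works. With the 2D Reed-Solomon extension, a block producer must withhold at least ~25% of the extended shares to prevent reconstruction, so every random sample a light node draws has at most a 75% chance of hitting an available share. An idealized sketch (sampling with replacement):

```typescript
// Probability that s independent samples all miss the withheld portion,
// assuming at least 25% of extended shares must be withheld to block recovery.
function missProbability(samples: number, withheldFraction = 0.25): number {
  return (1 - withheldFraction) ** samples;
}

for (const s of [5, 10, 20, 30]) {
  console.log(`${s} samples -> detection p = ${(1 - missProbability(s)).toFixed(4)}`);
}
// With 30 samples, the chance of catching withheld data already exceeds 99.98%.
```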

Celestia also differs from ETH in several significant ways:

Celestia focuses on the DA and consensus layers, while ETH also serves as a settlement layer for rollups.

Celestia does not have a Turing-complete smart contract virtual machine and therefore does not support smart contracts.

Celestia’s sovereign rollup can fork into multiple chains, while ETH’s Rollup cannot.

Because Celestia doesn't have smart contracts, bridges with sovereign rollups would mainly facilitate the movement of the DA-layer token.


The ecosystem of Celestia is growing fast.


Off-chain DA

Off-chain DA solutions mainly include:

Data Availability Committees (DACs) are trusted parties that provide, or attest to, data availability. DACs are also used by some validiums.

Proof-of-stake Data Availability Committees are considerably more secure than regular DACs because they directly incentivize honest behavior. Here, anyone can become a validator and store data off-chain. However, they must provide a “bond”, which is deposited in a smart contract.
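A bonded-membership flow might look like the sketch below; the contract address, ABI, and `deposit` function are hypothetical, since each DAC design defines its own bonding interface:

```typescript
import { ethers } from "ethers";

// Hypothetical bonding contract for a proof-of-stake DAC.
const DAC_ABI = ["function deposit() payable"];

async function joinDac(signer: ethers.Signer) {
  const dac = new ethers.Contract(
    "0x0000000000000000000000000000000000000000", // placeholder address
    DAC_ABI,
    signer
  );
  // The bond is slashable if the member withholds data or attests falsely.
  await dac.deposit({ value: ethers.parseEther("32") });
}
```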


An overview of data availability products:

ETH: ETH currently serves as the data availability layer for L2 optimistic rollups and zk rollups. The adoption of EIP-4844 (Proto-Danksharding) will bring additional benefits to L2. Although ETH's storage capacity may not be as large as Celestia's, it will become comparable once Danksharding is fully implemented.

Celestia: Celestia is designed to function as a consensus and data availability layer. Its testnet went online in June 2022, and its innovative modular design has made it increasingly popular since then. Celestia needs to establish its own ecosystem and competes directly with Ethereum; many projects are already building on it.

Avail: Avail was originally launched by Polygon in June 2022. Following its founder's departure from Polygon, Avail became an independent modular blockchain project, and a testnet has been released. Like Celestia, Avail is a standalone consensus and DA layer. The Avail mainnet was planned to bridge to Polygon and use MATIC as its base currency; compared to Celestia's token, MATIC is a more mature asset.

EigenDA: EigenDA is an Ethereum-based DA layer that incentivizes validators to maintain the network through ETH re-staking, avoiding the bootstrapping burden that a new network like Celestia faces.

Other off-chain DA: Validiums use off-chain storage for data availability, Ethereum for consensus and settlement, and their own chains for execution. Validiums may be phased out as Celestia and Danksharding gain widespread adoption.


In conclusion, we think,

A data availability layer is a promising and important approach to scaling blockchains.

The current DA products have their own advantages, and they all deserve continuous attention.

Celestia’s technology still needs to be verified by the market, and ETH and Celestia may also converge technically in the future.


4. Decentralized Computation

Although we have observed a few decentralized computing projects, we believe that the development of decentralized computing is still in its nascent stages. One of the major challenges faced in this area is verifying the accuracy of computations.


Further Explanation

Full decentralization is not always necessary. Currently, there are three main types of DApp architecture. Centralized services remain beneficial in situations requiring high performance or arbitrarily complex computation.


Some readers may not fully understand the difference between the consensus layer and the settlement layer. To clarify, let's walk through the four functions of a blockchain using Ethereum's zk rollups as an example.

After transactions occur on Layer 2, they are submitted to the sequencer, which batches and rolls them up before submitting them to the smart contract on the ETH blockchain. As the rollup data is added to the ETH chain, consensus on the order of transactions is confirmed, making ETH the consensus layer of the rollup. And because Layer 2 transactions are stored on the ETH blockchain, ETH also serves as the DA (data availability) layer for Layer 2.

Layer 2 nodes perform transaction execution, alter the global state of Layer 2, and generate zero-knowledge proofs. Layer 2 serves as the execution layer.

Layer 2 submits the ZKP to ETH, where the ETH contract verifies its validity. Once the proof is accepted, the new state of Layer 2 is confirmed. ETH serves as the settlement layer for Layer 2 zk rollup.
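The division of labor can be summarized in code. This is purely illustrative pseudocode in TypeScript, with names of our own invention rather than any specific rollup's API:

```typescript
// Illustrative types for the four roles in a zk rollup.
type L2Tx = { from: string; to: string; data: string };
type Batch = { txs: L2Tx[]; newStateRoot: string; zkProof: string };

// Execution layer (L2): run transactions, update state, prove the transition.
function executeOnL2(txs: L2Tx[]): Batch {
  return {
    txs,
    newStateRoot: "0x<root>",  // placeholder: state after applying txs
    zkProof: "0x<proof>",      // placeholder: validity proof of the transition
  };
}

// On L1, a single submission exercises the other three roles:
function submitToL1(batch: Batch): void {
  // 1. Ordering the batch in an L1 block   -> ETH as consensus layer
  // 2. Storing batch.txs in calldata/blobs -> ETH as DA layer
  // 3. Verifying batch.zkProof on-chain    -> ETH as settlement layer
}

submitToL1(executeOnL2([]));
```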


There are other types of data-related projects, such as:

Projects that focus on indexing on-chain data, such as The Graph and Space and Time, or indexing IPFS data, such as Filecoin Indexer.

Decentralized CDN networks, including Livepeer, Meson Network, Media Network, and others.

Storage node reputation markets like Filgram, Filrep, and CIDgravity, along with user-facing storage services such as Web3.storage and NFT.storage.

About Foresight Ventures

Foresight Ventures is dedicated to backing the disruptive innovation of blockchain for the next few decades. We manage multiple funds: a VC fund, an actively-managed secondary fund, a multi-strategy FOF, and a private market secondary fund, with AUM exceeding $400 million. Foresight Ventures adheres to the belief of “Unique, Independent, Aggressive, Long-Term mindset” and provides extensive support for portfolio companies within a growing ecosystem. Our team is composed of veterans from top financial and technology companies like Sequoia Capital, CICC, Google, Bitmain, and many others.

Website: https://www.foresightventures.com/

Twitter: https://twitter.com/ForesightVen

Medium: https://foresightventures.medium.com

Substack: https://foresightventures.substack.com

Discord: https://discord.com/invite/maEG3hRdE3

Linktree: https://linktr.ee/foresightventures

Disclaimer: All articles by Foresight Ventures are not intended to be investment advice. Individuals should assess their own risk tolerance and make investment decisions prudently.
Why do we invest in LSDxThe emergence of new on-chain assets often spurs shifts in the DeFi paradigm, and Foresight Ventures is dedicated to seeking various opportunities within these transformations. In the PoS ETH-driven LSD narrative, we can see the tremendous potential of LSD assets as both interest-bearing and liquidity assets. The upcoming Ethereum “Shanghai Upgrade” will undoubtedly further expand the scale of LSD assets. Therefore, at this critical juncture, we choose to invest our resources and energy into the LSD space. The primary challenge for new assets lies in liquidity. However, current liquidity solutions for LSD assets have high barriers and fragmented liquidity. High barriers result from established DeFi protocols that have built moats around various DEX liquidity offerings, making it costly for emerging assets to cross them. Fragmented liquidity stems from LSD assets being distributed across different DEX protocols, making it difficult to aggregate liquidity through a unified path. This is where LSDx Finance comes in, aiming to establish a liquidity protocol specifically for LSD assets to address high barriers and fragmentation. In DeFi’s development history, a common business model has been the “vampire attack,” such as Sushiswap’s assault on Uniswap’s token issuance. LSDx Finance targets DEXs like Curve with high barriers in their LSD asset segments (e.g., stETH, FrxETH, and rETH), effectively capturing market share in LSD asset liquidity. To maintain liquidity, LSDx Finance adopts a GLP architecture, similar to GMX. GMX’s GLP pool model is a highly efficient liquidity solution, offering high liquidity and scalability by allowing all assets within the GLP pool to be swapped pairwise. This model significantly addresses the current fragmentation issues faced by LSD assets. As a result, LSDx Finance introduces ETHx, a unified liquidity pool for LSD assets, with all LSD assets composing the ETHx Pool. Like GLP tokens, users receive $ETHx upon providing liquidity for LSD assets, representing ownership in the LSD unified liquidity pool. Building on the foundation of addressing LSD asset liquidity, LSDx Finance takes it a step further by launching the $UM stablecoin, backed by $ETHx as collateral. Users holding $UM can access the ETH staking rewards underlying LSD assets, effectively turning the stablecoin into an interest-bearing asset. In the current climate of heightened inflation, an interest-bearing and fully decentralized stablecoin will undoubtedly serve as a safe and effective means of hedging against external inflation. In summary, LSDx Finance’s advantages include: Timely positioning enables LSDx Finance to potentially execute “vampire attacks”; GLP-like liquidity solutions address LSD asset liquidity issues; ETHx, as a basket of LSD tokens, balances different LSD risks and interests; UM, as a decentralized, interest-bearing stablecoin asset, holds value for the entire crypto ecosystem. We hope this protocol will inject new vitality into the LSD sector.

Why do we invest in LSDx

The emergence of new on-chain assets often spurs shifts in the DeFi paradigm, and Foresight Ventures is dedicated to seeking various opportunities within these transformations. In the PoS ETH-driven LSD narrative, we can see the tremendous potential of LSD assets as both interest-bearing and liquidity assets. The upcoming Ethereum “Shanghai Upgrade” will undoubtedly further expand the scale of LSD assets. Therefore, at this critical juncture, we choose to invest our resources and energy into the LSD space.

The primary challenge for new assets lies in liquidity. However, current liquidity solutions for LSD assets have high barriers and fragmented liquidity. High barriers result from established DeFi protocols that have built moats around various DEX liquidity offerings, making it costly for emerging assets to cross them. Fragmented liquidity stems from LSD assets being distributed across different DEX protocols, making it difficult to aggregate liquidity through a unified path.

This is where LSDx Finance comes in, aiming to establish a liquidity protocol specifically for LSD assets that addresses high barriers and fragmentation. In DeFi's history, a common playbook has been the “vampire attack,” such as SushiSwap's raid on Uniswap's liquidity through token issuance. LSDx Finance targets DEXs like Curve that hold high-barrier LSD segments (e.g., stETH, frxETH, and rETH), aiming to capture market share in LSD asset liquidity.

To maintain liquidity, LSDx Finance adopts a GLP architecture, similar to GMX. GMX’s GLP pool model is a highly efficient liquidity solution, offering high liquidity and scalability by allowing all assets within the GLP pool to be swapped pairwise. This model significantly addresses the current fragmentation issues faced by LSD assets.

As a result, LSDx Finance introduces ETHx, a unified liquidity pool for LSD assets, with all LSD assets composing the ETHx Pool. Like GLP tokens, users receive $ETHx upon providing liquidity for LSD assets, representing ownership in the LSD unified liquidity pool.

Building on the foundation of addressing LSD asset liquidity, LSDx Finance takes it a step further by launching the $UM stablecoin, backed by $ETHx as collateral. Users holding $UM can access the ETH staking rewards underlying LSD assets, effectively turning the stablecoin into an interest-bearing asset. In the current climate of heightened inflation, an interest-bearing and fully decentralized stablecoin will undoubtedly serve as a safe and effective means of hedging against external inflation.

In summary, LSDx Finance’s advantages include:

Timely positioning enables LSDx Finance to potentially execute “vampire attacks”;

GLP-like liquidity solutions address LSD asset liquidity issues;

ETHx, as a basket of LSD tokens, balances different LSD risks and interests;

UM, as a decentralized, interest-bearing stablecoin asset, holds value for the entire crypto ecosystem.

We hope this protocol will inject new vitality into the LSD sector.

Foresight Ventures: Avatar — How to enter and live in a metaverse

Author: Alice@Foresight Ventures

TL;DR

Avatars can create the feeling of being transported to the metaverse, provide an enhanced sense of agency, and satisfy the need to feel connected to others. We believe avatars will be the most valuable assets in the metaverse.

Yet the largest games and virtual worlds are closed economies, where the avatars are not interoperable with one another.

There are three aspects to consider when building an avatar for mass adoption:

Technology — determines the liveness and quality of the avatar

Identity — adds interoperability and social value to the avatar

Community — consensus is the key to driving mass adoption

3D virtual identity is on the rise due to (1) the growth of on-chain data and (2) the proliferation of AIGC. People will counterbalance this with a need for interoperable avatar identities that ensure human-to-human interactions.

Intro: Fantasy for the Metaverse

Humans have never stopped imagining the future of the metaverse, and the TV series Westworld is one of the most classic artworks of all. Westworld is a fictional, technologically advanced Wild-West-themed amusement park populated by virtual human "hosts", catering to high-paying guests who may indulge their wildest fantasies: you can join bounty hunters to catch criminals or follow a group of gangsters to rob a train; you can experience free American country life in Westworld by day, and when night falls be transported to the Golden Age and experience Luxembourg in the 1940s.

The Rise of Avatar Economy

What attracts us most in these fantasy stories are the virtual avatars: everyone gets a virtual reflection of themselves, plus NPC avatars that make the game more vivid and exciting. As a crypto VC, we believe virtual avatars will be among the most valuable assets in the metaverse.

The word avatar originates from Sanskrit, meaning "descent" or "lapse." It is often used in Hinduism to refer to the incarnations of Vishnu, and was popularized in its cyber sense by Neal Stephenson in his 1992 science-fiction novel Snow Crash. Avatars are virtual representations of users in the metaverse, a collective virtual shared space. They allow users to interact with each other and the environment as if they were there in person.

The avatar is a junction point between fiction and reality. It already has many use cases in the real business world, in both web2 and web3. The global avatar market is expected to reach USD 527.58 billion by 2030, registering a revenue CAGR of 46.4% (Emergen Research). Here are the main use cases for avatars:

a. Reduce Workforce

Avatars can replace service roles in live streaming and customer support. The key is generating content at low cost and high efficiency: on the one hand, enterprises save on wages; on the other, an avatar can live-stream 24/7 without worrying about making mistakes.

b. Upgrade Experience

We can use avatars to upgrade the customer experience. From text to voice, images, short video, and holographic VR, customers are always pursuing richer sensory experiences. Imagine that in the future, your favorite avatar could help you with fitness plans, financial management, and psychological counseling. The ultimate form of the avatar would be a general assistant like Siri that not only handles all kinds of daily tasks but can even be your friend or lover.

c. Virtual Idols

According to a Tencent report, 48% of Gen Z say they have favorite avatars, while 17.2% regard avatars as their idols. Using an avatar as a spokesperson also helps brands save marketing costs while avoiding the risk of being implicated in celebrities' future scandals. For brands, the idea of creating their own IP is gaining traction, and this business model has been tested by companies such as L'Oréal and LVMH.

d. Self-Representation

The avatar is the key for users to enter the metaverse, showing personality, belief, and social prestige. According to one report, the gaming-skins market reached $50 billion in 2022 (Benzinga).

Lately, a growing number of articles have discussed the avatar-self, a concept in the philosophy of mind that refers to the self as the aware, conscious, and internally coherent presence within an avatar or image. Some interesting findings include:

Players are more likely to become emotionally attached to a personalized avatar. This emotional attachment can lead to increased game playing time and a willingness to pay for in-game items. (Harvard Business Review)

Avatars shape human interaction and sympathy: people are more likely to feel sympathy for other races after using avatars of those races. (Games for Health Journal, Belinda Gutierrez)

The use of avatars can increase productivity by 10–15% (World Bank); avatars can help organizations achieve a 20–30% reduction in communication costs (McKinsey & Company)

The Metaverse we have and the Problem

Interoperability has long been a key guiding principle of the Metaverse. Yet today, the largest games and virtual worlds are closed economies that are not interoperable with one another. From Fortnite to Minecraft to League of Legends, most games do not allow players to trade or carry their digital assets outside a centrally defined space.

We believe the next generation of games and virtual worlds will be built with interoperability as a core tenet. Rather than hold players inside walled gardens, the virtual worlds that comprise the open Metaverse will empower players to own their identity and take their digital assets wherever they go. These open economies will be larger and more durable than any closed economy game that we’ve seen yet, with players deeply invested in their assets and driving innovation as co-creators / co-owners of the world around them.

How to build a successful avatar in Web3?

Assuming we are building an avatar from scratch, there are three fundamental things to consider:

Technology

Decentralized Identity

Community

Let's break these down one by one.

a. Technology

The production of a 3D avatar consists of four parts: image generation, motion capture, rendering, and interaction.

Image Generation

Modeling is the most important part of image generation.

3D modeling technology currently includes two main categories — static modeling and dynamic modeling.

The mainstream is static modeling, where camera-array scanning and reconstruction have developed rapidly: high-speed photo scanning can now be completed in milliseconds, and high-performance camera arrays can reach sub-millimeter accuracy.

Compared with static reconstruction, dynamic light-field reconstruction can not only reconstruct a person's geometric model but also capture the model's dynamic data in one pass, making it a key development direction for digital-human modeling. Microsoft, Google, Intel, Facebook, Tsinghua University, SenseTime, and Huawei have all actively conducted related research on this topic.

Motion Capture

When modeling is complete, we need a series of bindings and drivers to make the lifeless model move. Bone and muscle binding determines the naturalness and fluency of the model's subsequent body movements and facial expressions.

Currently, there are two mainstream approaches: skeletal binding and hybrid deformation binding.

Driving technology likewise falls into two categories: real-person-driven and AI-driven. Driving techniques include manual keyframe adjustment, prefabricated actions, motion capture, intelligent compositing, and more.

Rendering

The next step is to generate animation by rendering.

Rendering refers to the process of computing the model's visual picture under the influence of viewpoint, light, motion trajectory, and other factors.

There are two main types of rendering: offline and real-time.

Offline rendering is mainly used in film and television; representative software includes Maya, 3ds Max, etc.

Real-time rendering means the computer calculates each frame as it is displayed. It is mainly used in games, and representative engines include Unreal Engine and Unity.
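To make the offline/real-time distinction concrete, here is a minimal real-time render loop in three.js; the scene contents are illustrative placeholders rather than a production avatar pipeline:

```typescript
// Minimal real-time rendering loop with three.js (illustrative only).
import * as THREE from 'three';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(50, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.z = 3;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// A stand-in for an avatar mesh; real projects would load a rigged model instead.
const avatar = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
);
scene.add(avatar);

function animate() {
  requestAnimationFrame(animate); // compute the next frame while the current one is displayed
  avatar.rotation.y += 0.01;      // per-frame update, e.g. driven by motion-capture data
  renderer.render(scene, camera);
}
animate();
```

This is exactly the "calculate while displaying" loop described above: every frame is recomputed in real time, unlike offline rendering, which bakes frames ahead of time.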

With hardware development and algorithm breakthroughs, rendering speed, realism, and resolution have improved greatly, and the real-time rendering of virtual characters can now look convincingly lifelike.

Interaction

If deeper interaction with humans is expected, TTS (text-to-speech), NLP (natural language processing), and synthetic-voice technology can be used. The recently popular ChatGPT is a good demonstration of supervised learning combined with reinforcement learning.
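As a sketch of how such interaction might be wired up, the snippet below sends user input to a chat-completion API and returns the avatar's reply. It follows the shape of the OpenAI Node SDK; the model name and persona are our own illustrative choices, not anything prescribed by the projects discussed here:

```typescript
// Hedged sketch: giving an avatar conversational ability via a chat-completion API.
import OpenAI from 'openai';

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function avatarReply(userMessage: string): Promise<string> {
  const res = await client.chat.completions.create({
    model: 'gpt-3.5-turbo', // illustrative model choice
    messages: [
      { role: 'system', content: 'You are a friendly metaverse avatar assistant.' },
      { role: 'user', content: userMessage },
    ],
  });
  return res.choices[0].message.content ?? '';
}

avatarReply('Plan a 20-minute workout for me').then(console.log);
```

In a full pipeline, the returned text would then be passed to TTS and facial-animation drivers so the rendered avatar speaks the reply.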

In terms of presentation, most avatars appear as pictures, videos, live streams, etc.; in the future, VR devices and holographic projection are expected to provide richer mediums. In addition, the proliferation of the metaverse, AAA games, and online tools (Zoom, Google Meet, VRChat) has also greatly enriched the utility of avatars.

b. Decentralized Identity

An interoperable decentralized identity adds social value to the avatar.

Interoperable 3D Identity

Think of the avatar not only as a gaming asset you can carry across the metaverse, but as a universal passport that shows your personality, achievements, and social reputation; players could easily associate their SBTs with this avatar, building up even more value. Interoperable identity will be the key to mass adoption.

Onboarding more users

Having a universal metaverse avatar may become the first stop for the average user entering web3. Developing Sims-like games that guide newbies through DeFi, NFT, and GameFi without obscure on-chain transactions could help web3 go mainstream.

c. Community & Marketing

From an operational point of view, the success of avatar projects is mostly attributable to marketing and community support. Here are some good examples:

Hatsune Miku

As the world's first virtual singer, Hatsune Miku is a typical community-created idol and a symbol of UGC culture, now worth more than 10 billion yen. The character itself does not produce any content; her personality, songs, and movements are all co-created by community creators.

HALO NFT

HALO NFT, presented as high-definition 3D avatars, is created by the HALO label together with a decentralized 3D artist community. Despite recent market turbulence, its initial sale reached 1,930 ETH in total volume, with a 1.55 ETH floor price and 3% listed.

The decentralized artist community behind the project blends an eclectic, multicultural mix of 3D artists from India, Russia, Jordan, and France. The team has experience working with film production companies such as Disney, as well as newspapers, video game developers, sculptors, digital artists, and other NFT projects.

Liu Yexi

A new virtual idol named Liu Yexi debuted on Douyin, the Chinese version of TikTok. The figure, defined as a "virtual beauty vlogger" by her creators, earned more than 2.8 million followers on her debut video. Liu is considered one of the highest-quality virtual characters online in recent years, as her videos look almost photorealistic.

Her team has precisely positioned her as a Marvel-style cyberpunk heroine in the metaverse. Videos containing both traditional Chinese cultural elements and science-fiction elements have stunned Chinese netizens. The team is also adept at creating interactions with fans, including cosmetic challenges and post-credit scenes, building continuous traction on social media.

In all, strong community and marketing strategies help identify the early supporters who enrich the creative content and build consensus, and subsequently build up the ecosystem associated with the avatar.

Industry Mapping

Based on the above analysis, we believe the avatar industry can be broken down into three layers:

Infrastructure (3D Modeling software, Game engine)

Platform (Motion capture equipment, AI solution provider)

Application (avatar companies)

The first two layers are relatively mature after years of development; the bottleneck that prevents avatars from reaching mass adoption lies in the application layer.

What we lack before building a universal avatar IP is market consensus, innovative gameplay, and real-world use cases, which are exactly the areas where the web3 community excels.

Here's a summary of notable avatar projects in web3:

a. Ready-Player-Me

RPM makes it fast and easy to create great-looking 3D avatars. Players can create a personalized avatar by simply uploading a selfie (try it out on their website) and exporting their avatar immediately to any partner experience. The avatars are fully customizable and players love sharing their unique creations across TikTok, Twitter, and Discord.

This system runs across desktop, web, and mobile, and is available to developers through a robust SDK and API.
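As a rough sketch of what consuming that API can look like: RPM serves each avatar as a GLB model behind a URL, which a three.js client can load directly. The avatar ID below is a made-up placeholder; consult RPM's docs for the exact endpoint format:

```typescript
// Loading a Ready Player Me avatar (served as a GLB file) into a three.js scene.
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

const scene = new THREE.Scene();
const loader = new GLTFLoader();

// RPM returns a per-user model URL after avatar creation (placeholder ID below).
const AVATAR_URL = 'https://models.readyplayer.me/0000000000000000000000.glb';

loader.load(
  AVATAR_URL,
  (gltf) => scene.add(gltf.scene),                   // add the rigged avatar to the scene
  undefined,
  (err) => console.error('Avatar load failed', err)  // network or parse errors
);
```

The same GLB can be dropped into Unity or Unreal, which is what makes the "one avatar, many experiences" promise practical for partner developers.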

Long-term, RPM is building the interoperable identity protocol for the open Metaverse — enabling players and developers alike to take their identity and assets with them to any 3D experience.

Team Intro

Ready Player Me was incubated by Wolf3D, a company that built 3D scanning technology for 8 years for businesses like airports, museums, and conferences. Over the years, it aggregated a proprietary database of 20,000+ face scans captured with the company's own hardware-based 3D scanners.

Tokenomics

There is currently no plan to issue tokens; the business model is expected to be SaaS.

Project Update

Fundraising: Ready Player Me has raised a total of $72.6M over 7 rounds. Its latest funding was a Series B round led by a16z, announced on Aug 23, 2022.

Integration: Over 3,000 apps across both web2 and web3 already integrate RPM, including VRChat, Spatial, Somnium Space, IGG, Pixelynx, RTFKT, and many more.

b. Lifeform

Lifeform’s core technologies include the following:

Hyper-realistic 3D avatar creation tools built with UE5. Users can create more than 10 billion avatar combinations.

SDK: Lifeform also provides developers with a corresponding SDK, offering DApps various functions.

Avatar ID. The core of the Lifeform Avatar is an NFT that can be connected to the wallet, serving as a 3D virtual identity for users to enter different metaverses (a minimal read-flow sketch follows this list).

How the avatar ID works: generate an avatar — live-to-earn in the Lifeform ecosystem — generate on-chain behavioral data — claim POAPs and SBTs

Users can also log in with accounts or emails, and use their 3D virtual identities in Google Meet, Zoom, Discord, TikTok, and other apps.
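Since the Avatar ID is an ordinary NFT, any client can resolve it with standard ERC-721 calls. The sketch below uses ethers.js; the contract address and RPC endpoint are placeholders, not Lifeform's actual deployment:

```typescript
// Sketch: how a wallet-bound avatar NFT can be read with standard ERC-721 calls.
import { ethers } from 'ethers';

const provider = new ethers.JsonRpcProvider('https://bsc-dataseed.binance.org'); // example RPC
const ERC721_ABI = [
  'function ownerOf(uint256 tokenId) view returns (address)',
  'function tokenURI(uint256 tokenId) view returns (string)',
];

// Placeholder address: replace with the real avatar contract.
const avatarNft = new ethers.Contract(
  '0x0000000000000000000000000000000000000000',
  ERC721_ABI,
  provider
);

async function resolveAvatar(tokenId: bigint) {
  const owner = await avatarNft.ownerOf(tokenId); // the wallet this identity is bound to
  const uri = await avatarNft.tokenURI(tokenId);  // points at the 3D model / metadata
  return { owner, uri };
}

resolveAvatar(1n).then(console.log);
```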

Team Intro

Founder Brian Hirsh is a serial entrepreneur who previously created a SaaS company; it is also worth mentioning that the team's 3D modeler has worked on multiple characters in Marvel movies.

Tokenomics

Lifeform has clear plans for tokenomics and ecosystem development. The token has a hard cap of 1 billion, and the incentive system is divided into two parts: points and tokens. Tokens can be used to generate avatars, stake for liquidity, and more; points are mainly used in sub-projects within the ecosystem and can be obtained through token staking.

Project Update

1. Lifeform is backed by Binance Labs and was one of the star projects of Korea Blockchain Week. It recently launched a joint airdrop event with Binance, and avatar NFT owners have reached 130k+.

2. The Lifeform incubation team successfully launched HALO NFT in May to a great response. HALO NFT holders will be able to use their characters in the Lifeform ecosystem.

Ecosystem:

Lifeform will release their first Live-to-earn game

Attract game partners to develop more games, creating an interoperable gaming metaverse.

Integration with metaverses: e.g., the Lifeform SDK has successfully connected to BurgerCities; users can easily bind their Avatar to a wallet address and log in to the Burger metaverse with their Lifeform Avatar.

c. Genies

Digital avatar startup Genies is known for partnerships with celebrities such as Justin Bieber, Migos, and Cardi B.

(Genies 3D celebrity avatars: Cardi B, Rihanna, Justin Bieber)

Not only does the app provide a user interface to create avatars; Genies has also teamed up with different brands to offer digital assets. Users can customize their avatars with accessories and sell them as NFTs.

(Genies platform showcasing a fashion collection designed by creator Ian Charms.)

Tokenomics

Genies is more of a web2.5 project where people can buy and trade digital assets with fiat and credit cards. There is no clear plan to issue tokens in the near future; the team mainly profits from the 5% transaction fee on its NFT marketplace.

Project Update

On April 12, 2022, Genies raised a $150 million Series C round led by Silver Lake

Genies partnered with Universal Music Group and Warner Music Group and launched its own NFT marketplace, The Warehouse, to produce avatar and digital-wearable NFT products for their artists.

Genies will release a new line of virtual DIY tools that users can use to customize their closet and change the color, pattern, and texture of any item. DIY is available both for creations made from scratch in Genies Studio and for other creators' items purchased from The Warehouse.

AIGC & Metaverse

One success factor for the avatar economy is the interesting gameplay associated with the character. To that end, AI-generated content can be broadly applied in the metaverse to enable more colorful gameplay.

We may already be interacting with AIGC in our daily lives, through things like chatbots, automated response systems, and even some forms of content generation for NPCs. The value of AI-generated content will vary depending on the application and how well it is executed. Some use cases include:

a. AI can be used to create images & 3D avatars & NFT

Lensa: users only need to upload 10 to 20 clear face pictures, and dozens of AI avatars in 10+ unique styles are automatically generated based on Stable Diffusion models. The avatars come in rich varieties such as sci-fi, anime, and oil painting. In the future, we may be able to generate 3D avatars or NFTs the same way.

b. AI can be used to create non-player characters (NPCs)

Red Dead Redemption: NPCs who travel between towns on horseback or by carriage, each with a purpose of their own, are driven by AI.

c. AI can be used to create and manage the environment (physics, lighting, and weather)

Microsoft Flight Simulator: Microsoft partnered with Blackshark.ai and trained an AI to recreate the entire planet Earth (all 197 million square miles) in the game.

In general, AI-generated content has the potential to be very valuable as it can save time and money while still providing high-quality content. However, there are also many people who believe that AI-generated content will never be able to replace human artwork because it lacks the creativity and emotional connection that only humans can create.

Most related projects are still at a very early stage (some only at conception), but there are already some interesting teams trying to build AIGC products in web3.

Future Outlook

Living in the Metaverse

With the development of VR/AR/AI infrastructure, we believe that humans will spend more time truly living in interoperable metaverses, and the player avatar is very likely to be the most important asset.

From direct-to-consumer (D2C) to direct-to-avatar (D2A)

Avatars could help brands better connect with digital natives and Generation Z consumers, who are often more comfortable interacting with digital avatars than with real-life people. Shifting focus from traditional marketing to avatar-based marketing may drive better engagement with potential customers and provide a more immersive experience that leads to improved conversion rates. Examples: Gucci, Nike, LVMH.

AI & Humans coexist in the metaverse

The proliferation of virtual beings and the growth of the metaverse will have a profound impact on humanity. As we become more and more comfortable interacting with virtual entities, the line between humans and machines will become increasingly blurred. It may take some time for us to get used to living and interacting with AI on a regular basis, but there are already examples of this happening in the real world.

In the game world of Second Life, for example, there are entire communities made up of players who have created avatars that look and act like real people. There are also a growing number of businesses that are using AI-powered chatbots to talk to customers.

The proliferation of closed networks with known members and personal relationships

Low-level interactions (i.e. interactions that require low levels of social and emotional intelligence) will be outsourced to AI. The number of people who interact with AI systems, and the number of interactions with AI systems, will rise sharply. People will counterbalance this with a need for personal relationships and safe spaces that ensure human-to-human interactions.

3D interoperable DID is the next wave of innovation

The value of 3D visual identities will rise. Although competition in this space is fierce, we believe the only teams that can gain mass adoption are those good at building ecosystems and communities. The value of a 3D avatar should extend far beyond a fancy appearance to real utility and market consensus.

Reference

https://metaverseinsider.tech/2022/07/16/metaverse-avatars/#What_Is_A_Metaverse_Avatar

https://hbr.org/2006/06/avatar-based-marketing

https://mastersofmedia.hum.uva.nl/blog/2021/10/29/real-money-on-virtual-items-a-visual-analysis-of-fortnite-skins/


https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4559151/

https://a16z.com/2022/11/17/the-generative-ai-revolution-in-games/

https://medium.com/nico-s-ideas/an-avatar-and-a-digital-identity-may-be-the-first-step-towards-being-immortal-d2bd8a22cd2

https://virtualworlds.substack.com/p/the-future-is-now-the-ai-gaming-revolution?sd=pf

About Foresight Ventures

Foresight Ventures is dedicated to backing the disruptive innovation of blockchain for the next few decades. We manage multiple funds: a VC fund, an actively-managed secondary fund, a multi-strategy FOF, and a private market secondary fund, with AUM exceeding $400 million. Foresight Ventures adheres to the belief of “Unique, Independent, Aggressive, Long-Term mindset” and provides extensive support for portfolio companies within a growing ecosystem. Our team is composed of veterans from top financial and technology companies like Sequoia Capital, CICC, Google, Bitmain, and many others.

Website: https://www.foresightventures.com/

Twitter: https://twitter.com/ForesightVen

Medium: https://foresightventures.medium.com

Substack: https://foresightventures.substack.com

Discord: https://discord.com/invite/maEG3hRdE3

Linktree: https://linktr.ee/foresightventures
Decentralized identity and wallet system: redefining identity in web3

Author: Alice@Foresight Ventures

TL;DR

The potential value of DID is very large; it may capture value second only to the public chains. Compared with DeFi, GameFi, and other projects, DID's entry threshold and learning cost are low enough, and it has a strong Matthew effect. It is expected to become a super entrance for users to enter web3.

DID can provide the same personalized content as web2 while ensuring that our identity, preferences, and account information are not leaked and monetized by others.

The identity layer on the chain brings new possibilities. It redefines the rights of identity, redistributes the power over identity, and returns the power of data monetization to users.

We divide the DID identity stack into a private-key management layer, an authentication layer, an authorization layer, and a proof (attestation) layer. The proof layer has a unique data advantage and is the product form with the most potential to become a unified DID.

Competition among wallets will be extremely fierce: beyond the existing contest between EOA wallets and AA wallets, cross-track players at the proof layer may also join to provide users with a better identity account system and DID ecosystem.

1. What is DID (decentralized identity)

DID is the abbreviation of Decentralized Identifiers. The focus of DID is decentralization first, then identity. In the world of web3, how we maintain a unified identity in a decentralized environment and how we use that identity is what this article discusses.

a. Traditional centralized identity

On the centralized platforms of web2, our every sentence, every transaction, and every behavior is collected and recorded by the platform to analyze our preferences and habits and form user portraits. For web2 platforms, this is a tool for harvesting traffic: fine-grained user portraits plus big-data algorithms push articles, videos, and even the next product you may like right to your face, keeping users on the platform while the platform controls how that traffic is monetized. This means:

Control of platform content: any creator who provides content for platform users loses the opportunity to share their views once their content is deemed to violate regulations or fall short of platform standards.

Control of user data: all data generated on the platform is essentially content written into the platform's database, owned by the centralized platform. The platform can modify or even delete a user's personal information and account at any time. For example, Gmail once deleted a user's account and emails due to a manual review error, and the user lost everyone's contact information overnight.

b. Classification of decentralized identity

DID provides new possibilities: users have complete control over the generation and use of their identities. In the year since DID was proposed and then hotly discussed in the market, we have seen different solutions based on different understandings of DID-based decentralized identity. Classified by data type, they include:

VC is the abbreviation of Verifiable Credential, a standard proposed by the W3C in March 2022 whose authenticity can be proven by verifiers. In the process of using a VC, there are three roles.
The claimer holds the credential, whose data must match the field, data-type, and ordering requirements of the credential type. The notary (attester) checks the data, confirms it is valid, and signs the VC with a private key; the verifier can then verify the credential's authenticity just by checking the signed data, without contacting the notary.

VC itself is a standard that lets different subjects verify data. Although it achieves a certain degree of decentralization, VC data does not necessarily need to be put on-chain, so the user's data ownership is not guaranteed. Also, what a VC describes is not the user's identity but only a credential meeting a certain standard, such as education or medical insurance. In the long run, VC may become the standard for internet verification, but it is not itself a decentralized identity.

Soulbound NFT: a soulbound NFT does not support on-chain transfer, meaning an SBT belongs only to a specific on-chain address. Based on the soulbound NFT proposed by Vitalik, the data and information carried by the NFT are "soul-bound" to the user's wallet address. The non-transferable attribute removes the illusion that users can buy achievements or experience by purchasing NFTs, binding user behavior and identity together.

The other side of soul binding is that the NFT cannot be unbound from the user's identity. Beyond a user actively displaying POAPs or NFTs, a user's address can be maliciously airdropped NFTs or maliciously labeled; since no transfer is possible, the user can only accept them, and they are publicly displayed on-chain, potentially causing reputation loss or even failed transactions.

On-chain reputation: reputation on the chain reflects a user's identity through on-chain interactive behavior. A successful transaction is written into a block and can be publicly viewed and verified (ZKP provides a way to verify a transaction's result without publishing its contents). Every transaction and interaction on-chain can be recorded and analyzed: Whale Alert tracks the behavior of large holders based on publicly visible on-chain data, and Nansen's Smart Money label vividly embodies what Black Swan author Nassim Taleb said: "Don't tell me what you think, show me your portfolio." The Degen Score goes a step further, analyzing on-chain data to extract a score as proof of a user's degen-ness, providing a new identity label.

However, these labels lack reasonable context. An address marked as smart money may be only one of an institution's wallets, and the selling or buying behavior of a single address may not reveal the institution's full strategy; it may just be one leg of a delta-neutral strategy. The actions of a single address are not enough to form a complete identity. Moreover, on-chain reputation can be deceptive: a user can back-fill interactions after the fact, or buy the appearance of reputation on the secondary market. Such after-the-fact grooming can sometimes be detected through timelines and NFT transaction records, but this kind of on-chain data is itself a lagging proof.
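To illustrate how crude such signals can be, here is a deliberately naive reputation sketch over public chain data with ethers.js; the thresholds are arbitrary assumptions of ours, not anything Nansen or Degen Score actually uses:

```typescript
// A deliberately naive sketch of deriving a reputation signal from public chain data.
import { ethers } from 'ethers';

const provider = new ethers.JsonRpcProvider('https://eth.llamarpc.com'); // example RPC

async function naiveReputation(address: string) {
  const txCount = await provider.getTransactionCount(address); // lifetime sent transactions
  const balance = await provider.getBalance(address);          // current ETH balance in wei

  return {
    activityScore: Math.min(txCount / 100, 1),  // capped activity proxy, threshold arbitrary
    whale: balance > ethers.parseEther('1000'), // crude "whale" flag, threshold arbitrary
    // Caveat from the text above: none of this proves intent or a full identity;
    // one address may be a single leg of a multi-wallet strategy.
  };
}

naiveReputation('0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045').then(console.log);
```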
Decentralized domain names: the decentralized domain name pioneered by ENS can itself be considered a decentralized identity, and ENS's ETH-based naming and its huge wealth effect have made it a symbol of identity.

ENS directly reflects identity through the name of an address by converting a seemingly garbled wallet address into a readable one. Thanks to its first-mover effect and brand influence, ENS has basically become the domain name with the most application scenarios. ENS also lets users edit website content and subdomains, providing opportunities to customize identity. As an on-chain contract, ENS is composable, and different applications can build on it: Philand generates a user-specific metaverse from ENS-related information, and Kola Lab generates music NFTs based on a user's ENS. This visualization of information is representative of DID composability.

However, the tradability of ENS can decouple the user's identity from the domain-name identity, losing identity continuity, and interactions based on a wallet address are still limited to on-chain transaction information. Moreover, ENS is currently based only on the Ethereum mainnet; facing today's Layer2, Layer3, and multi-chain environments, it may not give a comprehensive and accurate description of a user's identity (see the wallet discussion below for details).

Wallet address: as the entrance for users to interact with dapps on the chain, the wallet address effectively is the user from a DAPP's point of view, and on-chain transaction records are the user's portrait. Taking the on-chain wallet as identity, as Nansen's smart-wallet labels do, can be understood as the user's DID in trading scenarios. In practice, though, a user may have multiple wallet accounts, each with different transaction behavior; one of an institution's wallets may run a directional strategy while another only does delta-neutral arbitrage, so a single wallet cannot fully represent a user's actual behavior. In a multi-chain environment each wallet is an independent account, and users may behave differently on different chains, so a single wallet on a single chain cannot fully represent the user's DID.

Meanwhile, Lens, Galxe, and others have derived social graphs and Profile-NFT social portraits from on-chain behavior. Most on-chain interactions today are transactions, and there is a lack of social application scenarios producing on-chain social behavior. We believe that once data infrastructure matures and on-chain costs fall, we will see much more social data, and thus more valuable social relationships.

Stealth address: the stealth address proposed by Vitalik can separate the rights to use and to view a wallet address, isolating user identity from assets. The blockchain is open and transparent, so a user's identity can be deduced from asset information and on-chain behavior. In his latest article, Vitalik proposed a new stealth-address scheme in which the sender temporarily generates, through elliptic-curve cryptography, an ephemeral
public and private key pair; the ephemeral public key is combined with the receiving account's address to generate a stealth address, and the transfer is sent there. The receiving user can use their own private key together with the temporary public key to see the assets at the stealth address, and only the payee has the right to control those assets. This effectively cuts the connection to the user's identity with a one-time fresh address.

But as Vitalik notes, every newly generated address faces the gas-payment problem: although only the payee can control the account's assets, the fresh address holds no gas to pay transfer fees. Also, the privacy of the stealth address is one-sided privacy for the payee; for the payer it is still a transparent on-chain operation. The gas problem can be addressed by abstract-wallet management or relayers, and Vitalik proposes using ZKP to prove ownership of stealth accounts, even imagining different addresses for different DAPPs so users can manage the information and assets each DAPP can see. Of course, the ZKP, cross-chain, and wallet designs are still hard to implement, and we look forward to new projects proposing concrete plans.

Cross-platform Avatar: a three-dimensional character image better matches our imagination of identity, and a cross-platform avatar can ensure identity continuity. Although flat PFP NFTs are also used as identity-representing profile pictures, an NFT is tradable, making it hard to bind identity to it. The 3D properties of an avatar provide a visual image and, as our incarnation in the metaverse, a unique identity. Avatar NFTs like Ready Player Me's and Lifeform's let users edit their appearance and can be used across platforms; Lifeform's Avatar NFT is itself an on-chain NFT, directly compatible with on-chain metaverses and even off-chain meetings in Zoom, Google Meet, etc., ensuring a unified identity. We can even look forward to a personalized metaverse that displays more of one's identity in a customized personal space.

Although an Avatar NFT is a more three-dimensional identity, current avatar NFTs mostly express appearance; they are a single identity image and do not yet include the behaviors or other information that define identity. Later we can expect Avatar NFTs to display more information through metadata or composability.

c. Key attributes of DID

Composability of identities

DID can be more than a single identity. Just as our real-world identity may consist of an ID card, driver's license, education record, etc., a single credential or product form may not describe our complete identity; a combination of products for different scenarios forms our full identity. The difference is that off-chain identity information cannot flow between fragmented, centralized systems, while data put on-chain becomes fluid and supports different arrangements and combinations, presenting the user's identity more three-dimensionally. Imagine, for example, that the achievement
SBTs from various games are displayed on the user's personal Avatar NFT as wearable medals, or that wallet-based on-chain assets and transaction history earn a lower interest rate in a lending protocol. Composability here includes the composability of data sources, of different application scenarios, and even of different DIDs.

Privacy and security of identity data

Because the on-chain environment is publicly visible, we also need to pay attention to:

Privacy: in the dark forest of the chain, it is very unsafe to disclose identities and the associated bound accounts and asset information. Especially since on-chain information is dominated by financial transaction scenarios, exposing wallet addresses and transaction behavior is dangerous, so DID privacy for associated asset information should be a necessity. Maintaining anonymity is also a need of many crypto-native users; we have already seen open-source primitives such as Semaphore and privacy-preserving DIDs such as Firstbatch.

Security: this includes information security and asset security. Information security mainly refers to the safety of personal information, which may include off-chain identity, address, and education data as well as on-chain wallet addresses and transaction behavior; exposing such information can create personal-safety issues, and on-chain authorization is prone to hacks and malicious attacks. Proper data handling and security must therefore be ensured for the collection, storage, and use of personal data. Most projects today store data locally or on AWS and rely on the team's management ability; AWS outages and endless team-management problems make us hope this sensitive personal data can be kept in more decentralized databases that guarantee users' data security.

Censorship risk: mainly the potential risks of centrally stored and managed data. Although this is not a risk we commonly see in DID systems today, we still need to be alert to possible censorship in the data-collection process.

2. Why do we need DID

Everyone's needs are different: different risk preferences are embodied in the choice of different financial products, different trading habits lead to different trading platforms, and different aesthetic standards lead people to join different NFT communities. For project teams, DID can help them better understand user needs and design products. For users, market noise and a dazzling array of products make it harder to find what really fits, and DID may be the shortest path connecting products and users. To a certain extent, web2's major platforms already push "products you may like" and "tweets people nearby are watching" to us at top speed using the user portraits they hold, and that is a large part of why we cannot leave those platforms. DID may be the key to bringing the same experience, and more importantly, to a web3 where users truly own their data sovereignty.

a. DID Value
For DID, we believe the real potential lies in:

The possibility of data payment brought by data ownership

The attribution of data ownership implies the option of charging for data. When data belongs to a centralized platform, the platform chooses how to monetize the behavior and preference data users generate: the classic example is inferring a user's likely pregnancy from purchase behavior and selling that signal to maternity-product sellers for targeted ads. The user becomes the product the platform sells. We get to use the platform's products and services for free precisely because our behavioral data is collected and monetized, subsidizing the platform's costs and even generating its profit.

When users hold data sovereignty, they take back the right to choose data transactions: they can sell the relevant data, or choose not to. Unlike a centralized platform acting as the intermediary between data producers and consumers and profiting as the data middleman, we can imagine a data market connecting producers and consumers directly, such as a P2P trading model, or DAPPs directly subsidizing users for their usage data and consumption habits; a new "DAPP pays users" business model emerges. All of this can be realized through the user data and identities contained in DID, and it also depends on the on-chain composability discussed below.

DeFi-Lego-like composability brought by on-chain data

Once data is on-chain, it gains liquidity: it can be read, used, and verified through on-chain contracts or APIs, and the use or verification of data can be charged through the contract, just as reading a Chainlink price feed requires payment. DID may thus propose a new charging model for user information on the chain: if a DAPP wants to read or verify the personal information or transaction data in a user's DID, it must pay first. For example, Firstbatch's SBT requires the DAPP to pay when the contract verifies a proof of interest.

Moreover, DID can be designed to let users choose whether a DAPP may read relevant information, or to disclose it selectively; for DeFi-type DAPPs, a user might expose only the wallets on a certain chain or the transaction records of one wallet, and still receive the DAPP's services.

More importantly, we may not need one grand, complete identity on a single platform or dapp; we may hold different DIDs in different scenarios. In a game there is a "king" status SBT; on a DeFi dapp, a mining-OG or arbitrage-master on-chain reputation; on Mirror, an NFT marking a prolific research writer. Just as people are multi-faceted, DIDs in different scenarios can show different facets of an identity, and on-chain composability lets one scenario draw on a user's other identities to open new scenarios; or users can choose to
show different DIDs in different scenarios, protecting their identity while still obtaining products and services that meet their needs.

b. DID application scenarios

As identity data, DID mainly comes down to how the data is used or how identity information is verified. A DAPP can use identity information to understand user needs and provide more personalized products, or to verify whether users meet the DAPP's requirements. Currently conceivable scenarios include:

Compliance requirements: off-chain, CEXs require user KYC for compliance and security reasons. On-chain lending protocols are also starting to combine traditional off-chain financial KYC data to manage risk better, mainly driven by regulatory and risk-control KYC requirements.

Proof-of-personhood authentication: the most common case is anti-Sybil protection in whitelists and airdrops, where one person can otherwise create multiple accounts or wallet addresses to claim a whitelist spot or airdrop repeatedly. For project teams this raises customer-acquisition costs, because benefits claimed repeatedly by the same user could have gone to genuinely new users; hence the need to verify a user's proof of personhood.

User portraits: this is closest to how DID is usually discussed. In the familiar web2 world, vast amounts of personal information and behavior are collected, sorted, and distilled by centralized companies to target content and ads to each user, and personalized content increases user stickiness. One of DID's greatest potentials is that it can be used to design, recommend, and adjust content for users, covering both social and trading scenarios.

Social scenarios: although current on-chain data is mainly transactional, we believe that after EIP-4844 greatly reduces on-chain storage costs, and as the many data-storage, retrieval, and analysis projects we have seen go live, data-intensive social projects will appear at the application layer, using DID and other signals to provide targeted, web2-grade user experiences.

There may also be different forms in different tracks:

DAO: a frequently discussed topic is on-chain reputation. For DAO members, on-chain reputation is one option for reflecting member contribution. But since DAO work is not limited to the chain, much communication and output, such as Snapshot voting, community proposals, and offline activities, happens off-chain, and VCs may be used to prove off-chain academic qualifications, work experience, and so on.

Credit lending: one of the most widely discussed DID scenarios, credit lending is very common in traditional finance. Huabei and Jiebei are, in essence, credit loans built on the asset and behavior data Alipay collects about users. The user information provided by DID can likewise support more lending protocols in providing liquidity, and we have already seen some protocols begin to offer under-collateralized credit loans.
3. Web3 Identity Stack

The web3 identity stack can be divided into six layers from bottom to top: storage layer, identification layer, private-key management layer, authentication layer, authorization layer, and proof (attestation) layer. The storage layer and identification layer are the infrastructure that defines the DID standard. Large-scale user interaction starts from the private-key management layer: to obtain a DID, the user needs a private-key-managed account. In web3 this account usually takes the form of a wallet, a natural container for identity information. After authentication (seed phrases, MPC login, biometrics), the user logs in to the wallet; Dapps can then ask for permission to connect to the wallet and, after an authorized signature, read information about the account (e.g. multi-chain addresses, transactions, on-chain data, and NFT assets). Finally, VCs, SBTs, POAPs, etc. are generated and stored in the wallet to prove the user's identity. When interacting with Dapps, the user can independently authorize which data to disclose. Ideally, ZK technology can be used throughout these links to hide the direct connection between the user's wallet address and identity or asset information.

a. Agents & key management

The wallet provides a natural identity container: an entrance connecting Dapps and the credentials a user holds. An ideal "DID wallet" should aggregate the addresses of all chains and integrate the user's fragmented data while providing basic functions such as signing, transfers, and other transactions; it should also display the various SBT/VC/PoP credentials the user owns. When Dapps interact with the wallet, the user can independently authorize which data to disclose, thereby retaining data sovereignty.

1) Smart contract wallet

Currently, most wallets such as MetaMask lack these functions. An important reason is that they are EOA wallets, which only support the most basic operations such as queries and transfers. In the future, smart contract wallets (e.g. UniPass, Argent) are expected to realize more diverse functions, such as:

No private key: users no longer need to keep seed phrases or private keys; multiple verification methods such as biometric and device verification can be adopted.

Account recovery: accounts can be recovered through biometrics, social verification, etc., ensuring social identities are not lost.

Gasless interaction: users can pay gas with the ERC-20 tokens involved in the transaction, or designate a fixed account for payment, without preparing ETH as gas in advance.

On-chain credit lending: the account contains a complete on-chain credit record, so the account itself can get a loan based on its reputation without any collateral, boosting financial liquidity in the market; what needs further consideration is how the lender can properly penalize dishonest behavior.

Sub-account management: users adopt dedicated account systems for different scenarios and get a more comprehensive account-management system.
For example, one account stores only ETH and acts as the gas account, paying for the interactions of all other accounts; one account stores only blue-chip NFTs and is rarely touched; one account serves as a GameFi account. Each can hold different assets and keep its on-chain behavioral data as a separate entity, preserving user privacy to the largest extent.

2) Identity wallet

The main function of a smart contract wallet is to store and manage crypto keys for sending/receiving assets and signing transactions; another type, the identity wallet, is dedicated to storing identities and lets users make claims that can be displayed across applications and platforms.

Polygon ID Wallet: Polygon has launched a wallet for users to manage and store Polygon ID, providing functions such as log-in, ZKP generation, and privacy-preserving verification; there are currently few use cases, and it encourages developers to build their own.

ONTO Wallet: designed to help users create DIDs and fully protect user privacy through encryption algorithms, integrating on-chain and off-chain identities, NFTs, asset management, etc.

Nuggets: mainly focused on e-commerce payment scenarios; it uses biometric technology to scan users' faces from multiple angles and combines ID-card scanning with optical character recognition to complete KYC identity verification, with zero-knowledge storage on the chain.

b. Authentication layer

To attract users to web3 DID at scale, a more user-friendly, low-threshold, and secure authentication method is essential.

Sign-In with Ethereum: Sign-In with Ethereum (SIWE) is an authentication standard pioneered by Spruce, ENS, and the Ethereum Foundation. SIWE standardizes a message format (similar to a JWT) for users to log in to services with blockchain-based accounts. On this basis, Sign-In with X (CAIP-122) treats SIWE as its Ethereum-centered implementation and generalizes the standard to operate across blockchains.

UniPass: UniPass has developed an on-chain email-login solution and deploys ZK to hide the relationship between an email account and a wallet address. By contrast, web3auth's email-login solution is more centralized than an on-chain structure, and its private key sits in the browser's front-end storage, raising account-security concerns.

Hexlink: as an identity oracle, Hexlink encrypts the user's identity data and transmits it to an identity server, which decrypts the data and constructs a Merkle proof for a zero-knowledge proof verifying the user's identity. It is compatible with both web and mobile applications and supports web2 authentication, social accounts, web3 EOA accounts, and AA abstract wallets, providing a zero-threshold web3 account experience.

Notebook: a user KYC + login solution. After registering a Notebook account and completing biometric verification, a ZKP can be generated to prove the user's identity. It has recently been working with Near on Notebook Auth integration.

c. Access control & authorization

Authentication solutions confirm the identity of the user, while access control solutions determine which resources an entity can access and what it is allowed to do with those resources.
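Here is a minimal sketch of the token-gated pattern that the access-control tools below automate: check a visitor's ERC-20 balance and grant a role only above a threshold. The token address and threshold are arbitrary examples, not any project's actual gate:

```typescript
// Minimal token-gated access check with ethers.js (illustrative only).
import { ethers } from 'ethers';

const provider = new ethers.JsonRpcProvider('https://eth.llamarpc.com'); // example RPC
const ERC20_ABI = ['function balanceOf(address owner) view returns (uint256)'];

const GATE_TOKEN = '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48'; // USDC, as an arbitrary example
const MIN_BALANCE = 100n * 10n ** 6n;                            // 100 tokens (6 decimals)

async function hasAccess(visitor: string): Promise<boolean> {
  const token = new ethers.Contract(GATE_TOKEN, ERC20_ABI, provider);
  const balance: bigint = await token.balanceOf(visitor);
  return balance >= MIN_BALANCE; // programmatic grant instead of manual role assignment
}

// Placeholder visitor address for demonstration.
hasAccess('0x0000000000000000000000000000000000000001')
  .then((ok) => console.log(ok ? 'grant role' : 'deny'));
```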
1) Access control management

A community can programmatically grant access rights based on a visitor's token holdings, on-chain activity, or social verification, instead of assigning access manually.

Lit: an MPC key-management and access-control protocol in which a public/private key pair is represented by a PKP (programmable key pair) NFT. When specified conditions are met, the owner of the PKP can trigger the network to aggregate, allowing them to decrypt documents or sign messages. Potential use cases: token-gated meetings and Gather Town spaces, live streaming, and Google Drive access.

Collab.Land: a Discord bot that automates management and grants membership based on token ownership.

2) User authorization management

When interacting with Dapps, users should also be able to independently authorize which data to disclose, and manage each Dapp's permissions through a unified interface.

EverRise wallet: EverRise launched a new product, EverRevoke, to help users manage authorizations. EverRevoke aggregates token and NFT authorization status across 5 blockchains, so users can review and revoke their token permissions at any time.

Authorization management is expected to improve further with AA accounts: developers can define different read/write and asset permissions for each private key, or set minimum transaction thresholds and automatic trigger conditions for transactions.

d. Attestation

Attestation solutions help wallet addresses produce their "proofs", "reputation", and "credentials". Based on verifiable on-chain data, this reputation is not limited to any one platform. We believe the attestation layer is the closest to crypto users and can sustainably generate valuable data, so it has the most potential to become a unified DID layer.

1) Data attestations

The biggest feature of a data-attestation protocol is that it can attract users to generate new data. With the traction it gains from user data, it can naturally evolve into a universal identity standard.

Galxe: Galxe ID labels each address by issuing credentials and integrates user data across different chains, with data sources spanning web2 and web3; there are currently more than 4.2 million Galxe ID users. In 2023 H1, Galxe will release its new product, Galxe Protocol, a permissionless data protocol. Developers will be able to access the data generated by Galxe and build applications on its ecosystem. Galxe is planning to launch a standard system for attestations, including storage, verification, usage, and privacy-management modules.

Lens Protocol — social gardens emerge: Lens Protocol is a decentralized social-graph protocol built on Polygon, launched by the founding team of Aave. On the goal of betting on web3 social media, Aave founder Stani has publicly stated that the ultimate aim is to combine web3 social media and DeFi, turning the social graph into a trust graph and providing low-collateral loans through Aave. Lens has so far run the LFGrow hackathon and two rounds of grants, with more than 100,000 users in total and many innovative projects including Lenster, Phaver, Orb, SteamDAO, etc.

Farcaster — a decentralized, credibly neutral social protocol: an open social protocol that supports decentralized clients. Farcaster adopts a hybrid on-chain + off-chain architecture in which identities are stored on Ethereum, using Ethereum to ensure security, composability, and consistency.
Identity is controlled through Ethereum addresses, and off-chain messages are signed through Ethereum accounts. The user’s data is encrypted and signed by identity and stored on the server controlled by the user (Farcaster Hubs). The reason why the data is not stored on the chain is that the settlement cost on most L1 and L2 networks are too high and the speed needs to be faster. Nostr-an information relay transmission protocol banned by Twitter The full name of Nostr is “Notes and Other Stuff Transmitted by Relays”, a relay transmission protocol for social media information. Nostr was established in November 2020. As a general-purpose Internet infrastructure, it is still in its early stages. Nostr is divided into two components: clients and relayers. The client is used to sign and verify the information and is run by the user; the repeater is responsible for delivering messages and can be run by anyone setting up a server. The client stores the user’s key internally. Each message must be signed and sent to the relay. The relay does not change the data. The verification of the authenticity of these messages is done by the client, so the user does not need to trust the relay, more in line with the spirit of decentralization. Nostr originated from the BTC community and has a close relationship with the web3 community. In the future, token incentives may be released for storage. In terms of marketing, Jack Dorsey carried the banner, but some time ago it was officially banned by Twitter. 2) Domain Name Service Both the domain name and the wallet can be regarded as identity information containers: the wallet can use a domain name as a substitute for the wallet account name and use it as the “name” when interacting with the application side, and the domain name can integrate multiple chain addresses or even multiple wallets account, which is also a natural identity aggregator. ENS With 1.8 million registrations, it is working with SpruceID to advance EIP-4361: Sign In With Ethereum. If the proposal is successfully implemented, this will replace the function of Connect Wallet, allowing the domain name to become the entrance to Web3 above the wallet address. In addition, ENS also hopes to complete its vision of “Web3 name” through the integration of a series of identities in the domain name. Space ID Space ID: Provides registration services for domain names with .bnb suffixes. Space ID also hopes to link .bnb domain names with users’ multiple on-chain addresses, Twitter, and other Web2 accounts, and become a Web3 Universal Domain. 3) 3D Visual Identity People are always pursuing experience upgrading, we can reasonably assume that we will gradually see the evolution from 2D identity to 3D identity. The immersive experience and playability of Metaverse will be much better than web page interaction. Philand Users can visualize the behavior on the chain through Phi, and receive corresponding components based on the behavior on the chain (donation, protocol interaction, etc.), building up their reputation world like Lego blocks. Users that owns ENS domain names can automatically generate land, and combine their web3 worlds according to their onchain behavior, and their social networks. Phi Previously won the hackathon NFT HACK2022 and BuildQuest in in ETH Global. It also received a grant from Uniswap, a grand winner for lens hack. 
Lifeform-3D visualization DID Lifeform is a 3D digital identity solution provider that has received investment from Binance’s seed round, focusing on the development of integrated solutions for visual DID. Core products include a 3D hyper-realistic virtual human editor, visual DID protocol, secure smart contract suite, native decentralized identity SaaS system service, and metaverse underlying engine suitable for Web3 cross-application development. 4) Other Attestations Unirep Unirep is an undeniable reputation protocol built on Ethereum. Users are allowed to anonymously give positive or negative reputations to others, and at the same time cannot refuse to accept the reputation (non-repudiable), and can voluntarily prove that they have a certain reputation through zk technology, but do not disclose the specific amount. Such solution can also be abstracted into smart contracts, using the composability of blockchain to provide encryption schemes for other project parties. Firstbatch Use AI to analyze web2 data and formulate data labels, verify the user’s interest while ensuring user privacy through ZK . Its uses cases include1. Content recommendation 2. Offline activities 3. Permission activities. Aspect AI The Linkedin for web3 builders. It uses some AI algorithms to process, classify, and label Github user data to generate user portraits for each Github user. Developer identities may have more utilities than general-purpose identities, including online job hunting, code contribution management, offline hackathons, developer communities, etc. e. Privacy We cannot discuss identity infrastructure without considering the technical primitives for privacy,because privacy is important at all layers of the identity stack. Over the past decade, blockchain adoption has accelerated the development of cryptographic primitives such as zk-proof. Iden3 The zk-native identity protocol, provides a zk framework and open-source libraries for zk identity primitives, authentication, and proof-of-claims generation. The protocol uses Baby JubjubElliptic curves, which generate key pairs for each identity, are designed to work efficiently with zk-SNARKs, which are used to prove identity ownership and claims in a privacy-preserving manner.PolygonIDis currently leveraging this protocol to build its identity wallet Polyhedra The zk full-stack protocol, and its cross-chain bridge zkbridge can solve the problem of multichain identity management. Polyhedra has launched the zkbridge relay network + two-layer recursive proof, ensuring the security of the cross-chain bridge while reducing time and gas fee. It also developed a DID container for the Gamefi projects, tp package players’ assets under one nft and transfer it with one click.In the future, it is planned to make user interaction simple or even insensitive. Semaphore The zero protocol for developers in Ethereum. Users can prove their group membership and send signals such as votes or support without revealing their original identity. Semaphore is not a user-facing application, but aims to provide powerful and simple tools for Ethereum developers and build DApps. Application-level zkp has also received much attention: Private Airdrop:Stealthdrop Anonymous communication:heyanon Vote anonymously:Melo KYC Zkpass Outdid 4. Identity will be the next entry point for web3 Take a look at the development of Web2 portals. 
It evolves from portals — search engines — PC apps — mobile apps.It’s clear that user are transferring from single-function, passive, complex to comprehensive portal to active, simple onboarding tools.Driven by humanity, users will always tend to choose a one-stop integrated entrances as touchpoint to explore crypto world. We optimistically predict that this traffic portal will have the following characteristics: Universal identity: A unified and interoperable identity layer which portraits all user interactions onchain Intergrated frontend: We believe that one or several super account systems similar to WeChat will emerge, and the most reasonable form of such system is the wallet Decentralized backend: Relying on the composability of the blockchain, the back end of the wallet can integrate various protocols such as social/game/Defi/identity/payment for safe use by users Privacy protection: Multiple privacy middleware should be combined to provide powerful end-to-end communication encryption, identity authorization and personal information protection. Diversed security settings: With the development of zk and AA, better security mechanisms can be invented and put to practice; there should be various key management mechanisms (MPC/multi-signature/no mnemonic) for users to choose from, meeting the needs of different types of customers User-owned data: Personal information/social data/identity credentials are stored in decentralized storage solutions such as Arweave or IPFS, and are only controlled by the user’s own key. Native Commercialization: Support native web3 payment, simplified cross-chain interaction and gas fee payment. Anti-censorship: Users will no longer banned by any centralized institutions like Meta or local governments. About Foresight Ventures Foresight Ventures is dedicated to backing the disruptive innovation of blockchain for the next few decades. We manage multiple funds: a VC fund, an actively-managed secondary fund, a multi-strategy FOF, and a private market secondary fund, with AUM exceeding $400 million. Foresight Ventures adheres to the belief of “Unique, Independent, Aggressive, Long-Term mindset” and provides extensive support for portfolio companies within a growing ecosystem. Our team is composed of veterans from top financial and technology companies like Sequoia Capital, CICC, Google, Bitmain and many others. Website: https://www.foresightventures.com/ Twitter: https://twitter.com/ForesightVen Medium: https://foresightventures.medium.com Substack: https://foresightventures.substack.com Discord: https://discord.com/invite/maEG3hRdE3 Linktree: https://linktr.ee/foresightventures

Decentralized identity and wallet system: redefining identity in web3

Author: Alice@Foresight Ventures

TL;DR

The potential value of DID is enormous; it could capture value second only to public chains themselves.

Compared with DeFi, GameFi, and other sectors, DID has a low enough entry threshold and learning cost, plus a strong Matthew effect. It is expected to become a super-entrance for users entering web3.

DID can deliver the same personalized content as web2 while ensuring that our identities, preferences, and account information are not leaked or monetized by others.

The on-chain identity layer brings new possibilities: it redefines the rights attached to identity, redistributes the power over identity, and returns the power to monetize data to users.

We divide the DID identity stack into a private key management layer, authentication layer, authorization layer, and attestation layer. The attestation layer has a unique data advantage and is the product form most likely to produce a unified DID.

Competition among wallets will be fierce: not only between existing EOA wallets and AA wallets, but cross-track players at the attestation layer may also join in, offering users a better identity account system and DID ecosystem.

1. What is DID decentralized identity

DID is the abbreviation of Decentralized Identifiers, that is, decentralized identity. The emphasis is on decentralization first, identity second. How we maintain a unified identity in the decentralized world of web3, and how we put that identity to use, is what this article discusses.

a. Traditional centralized identity

On centralized web2 platforms, our every sentence, transaction, and behavior is collected and recorded by the platform to analyze our preferences and habits and build user portraits. For the web2 platform, this is a tool for harvesting traffic: fine-grained user portraits plus big-data algorithms push articles, videos, and even the next product you may like straight at you, keeping users on the platform while the platform controls and monetizes that traffic. This means:

Control of platform content: any creator who wants to provide content to platform users loses the chance to share their views the moment their content is deemed in violation of regulations or below platform standards.

Control of user data: all data generated on the platform is, in essence, content written into the platform's database and owned by the centralized platform. The platform can modify or even delete a user's personal information and account at any time. Gmail, for example, once deleted user accounts and emails due to manual review errors, and those users lost all of their contacts overnight.

b. Classification of Decentralized Identity

DID provides a new possibility: users have complete control over how their identities are generated and used. In the year since DID was proposed and became a hot topic in the market, we have seen different solutions based on differing understandings of decentralized identity, including the following.

Classified by data type, these include:

VC is the abbreviation of Verifiable Credential, a W3C standard (v1.1 published in March 2022) whose authenticity can be cryptographically checked by verifiers.

Using a VC involves three roles. The claimer holds the credential, whose data must match the fields, data types, and ordering required by the credential type. The attester (notary) checks the data, confirms it is valid, and signs the VC with a private key. The verifier can then check the credential's authenticity from the signed data alone, without ever contacting the attester.
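
To make the three-role flow concrete, here is a minimal sketch in Python, assuming the third-party `cryptography` package (pip install cryptography) for Ed25519 signatures; the field names and JSON layout are illustrative stand-ins, not the full W3C VC data model.

```python
# Minimal sketch of the VC flow: claimer assembles, attester signs, verifier
# checks offline. Field names are illustrative, not the W3C VC data model.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Attester (notary) holds a signing key; its public key is known to verifiers.
attester_key = Ed25519PrivateKey.generate()
attester_pub = attester_key.public_key()

# 1. Claimer assembles a credential whose fields match the credential type.
claim = {"type": "UniversityDegree", "holder": "did:example:alice", "degree": "BSc"}
payload = json.dumps(claim, sort_keys=True).encode()

# 2. Attester checks the data out-of-band, then signs it.
signature = attester_key.sign(payload)

# 3. Verifier checks the signature without contacting the attester again.
try:
    attester_pub.verify(signature, payload)
    print("credential is authentic")
except InvalidSignature:
    print("credential is forged or tampered")
```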

VC itself is a standard that lets different parties verify data. Although it achieves a degree of decentralization, VC data does not necessarily need to go on-chain, so the user's data ownership is not guaranteed. Moreover, what a VC describes is not the user's identity but merely a certificate meeting some standard, such as a diploma or medical insurance. In the long run, VC may become the standard for verification on the Internet, but it is not itself a decentralized identity.

Soulbound NFT: a Soulbound NFT does not support on-chain transfer, which means an SBT belongs to one specific on-chain address only.

In the Soulbound NFT proposed by Vitalik, the data and information carried by the NFT are "soul-bound" to the user's wallet address. The non-transferable attribute eliminates the illusion of achievement or experience created by simply purchasing NFTs, binding user behavior and identity together.

The flip side of soul-binding is that the NFT cannot be unbound from the user's identity. Beyond the POAPs or NFTs a user actively displays, an address can be maliciously airdropped NFTs or maliciously labeled; since no transfer is possible, the user is forced to accept them, and they are publicly displayed on-chain, potentially damaging the user's reputation or even causing transactions to fail.

On-chain reputation: reputation on the chain reflects a user's identity through their on-chain interactions.

A successful transaction is written into a block on the chain, where it can be publicly viewed and verified. ZKP, of course, offers a way to verify a transaction's result without publishing its content, but every transaction and interaction a user makes on-chain can be recorded and analyzed. Whale Alert tracks whale behavior based purely on publicly visible on-chain data. Nansen's Smart Money label vividly embodies what Black Swan author Nassim Taleb said: "Don't tell me what you think, show me your portfolio." Degen Score goes a step further, analyzing on-chain data to extract a score as proof of a user's degen credentials, providing a new identity label.

However, these labels lack reasonable context and background. An address marked as smart money may be only one of an institution's wallets, and the buying or selling of a single address cannot fully reveal the institution's strategy; it might be just one leg of a delta-neutral position. The actions of a single address are not enough to form a complete identity. Moreover, on-chain reputation can be deceptive, since a user can backfill interactions after the fact or fabricate reputation by purchasing on the secondary market. Timelines and NFT transaction records can help judge whether interactions were backfilled, but such on-chain data is itself a lagging proof.
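
To make the idea concrete, here is a toy sketch of how such a score might be distilled from raw on-chain statistics; the fields and weights are invented for illustration and do not reflect Degen Score's actual model.

```python
# Toy reputation scorer: distills on-chain statistics into a single label.
# Fields and weights are invented for illustration, not any real scoring model.
from dataclasses import dataclass

@dataclass
class OnchainStats:
    wallet_age_days: int
    protocols_used: int        # distinct contracts interacted with
    early_adoptions: int       # interactions before a protocol's token launch
    nft_mints: int

def degen_score(s: OnchainStats) -> int:
    score = (
        min(s.wallet_age_days, 1500) * 0.1   # longevity, capped
        + s.protocols_used * 5               # breadth of activity
        + s.early_adoptions * 25             # earliness is weighted heavily
        + s.nft_mints * 2
    )
    return int(score)

print(degen_score(OnchainStats(900, 40, 6, 30)))  # -> 500
```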

Decentralized domain name: the decentralized domain names pioneered by ENS can themselves be regarded as decentralized identities, and ENS's ETH-based naming plus its enormous wealth effect have made it a status symbol.

ENS turns a seemingly garbled wallet address into a readable one, so the name of the address can directly convey identity. Thanks to its first-mover effect and brand influence, ENS has become the domain name with the most application scenarios. ENS also lets users edit website content and subdomains, providing opportunities to customize identity. As an on-chain contract, ENS is composable with other on-chain applications: Philand generates a user-specific metaverse from ENS information, and Kola Lab generates music NFTs from a user's ENS. This visualization of information makes ENS a showcase for DID composability.

However, ENS's tradability can decouple user identity from domain identity, breaking identity continuity. Wallet-address-based interaction is still limited to on-chain transaction information, and ENS currently lives only on the Ethereum mainnet. Facing today's Layer 2, Layer 3, and multi-chain environments, it may not give a comprehensive or accurate description of a user's full identity; see the discussion of wallets below for details.

Wallet address: as the user's entrance for interacting with dapps on-chain, the wallet address effectively is the user from the perspective of dapps and other addresses, and its on-chain transaction records are the user's portrait.

Treating an on-chain wallet as identity, as Nansen's smart wallet labels do, can be understood as a user's DID in trading scenarios. But in practice a user may hold multiple wallets, each behaving differently: one of an institution's wallets may run a directional strategy while another only runs delta-neutral arbitrage, so a single wallet cannot fully represent a user's actual behavior. Likewise, in a multi-chain environment each wallet is an independent account, and users may trade differently on different chains, so a single wallet on a single chain cannot fully represent the user's DID.

Meanwhile, Lens, Galxe, and others have derived social graphs and Profile NFT social portraits from on-chain behavior. But most on-chain interactions today are transactions, and there is a lack of social applications generating social behavior on-chain. We expect that once data infrastructure matures and on-chain costs fall, abundant social data will appear, yielding far more valuable social relationships.

Stealth address: the stealth address proposed by Vitalik separates the rights to use and to view a wallet address, isolating user identity from assets.

Blockchains are open and transparent, so a user's identity can be deduced from on-chain assets and behavior. In a recent article Vitalik proposed a new stealth-address scheme: the sender temporarily generates an elliptic-curve key pair, combines the ephemeral key with the recipient's account to derive a stealth address, and transfers funds to that address. The recipient can combine their own private key with the ephemeral public key to see the assets at the stealth address, and only the recipient gains control over those assets. The effect is a one-time fresh address that severs the link to the user's identity.

But as Vitalik notes, every newly generated address faces the gas problem: although only the payee controls the account's assets, the fresh address holds no gas to pay transfer fees. Also, the privacy of a stealth address covers only the payee's side; for the payer the operation is still transparent on-chain. Gas can be handled by abstract-wallet management or relayers, and Vitalik proposes using ZKP to prove the link between users and stealth accounts, even imagining distinct addresses per dapp so users can manage what information and assets each dapp sees. Implementing the ZKP, cross-chain, and wallet designs remains difficult; we look forward to new projects proposing concrete schemes.
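
A toy sketch of the shared-secret mechanics, using classic Diffie-Hellman over a small multiplicative group in place of elliptic-curve points; all numbers are illustrative and insecure, chosen only to show why sender and recipient derive the same stealth address while on-chain observers cannot.

```python
# Toy stealth-address derivation via Diffie-Hellman over integers mod p.
# Real schemes use elliptic-curve points; this is for illustration only.
import hashlib

p, g = 2**127 - 1, 3              # toy group parameters (NOT secure)

recipient_sk = 123456789                  # recipient's long-term private key
recipient_pk = pow(g, recipient_sk, p)    # published "meta-address"

# Sender: generate an ephemeral key pair and derive a shared secret.
ephemeral_sk = 987654321
ephemeral_pk = pow(g, ephemeral_sk, p)    # published alongside the payment
shared_sender = pow(recipient_pk, ephemeral_sk, p)

# The stealth address is derived from the shared secret, so observers cannot
# link it to the recipient's public meta-address.
stealth_address = hashlib.sha256(str(shared_sender).encode()).hexdigest()[:40]

# Recipient: recompute the same secret from the ephemeral public key; only
# they can detect (and, in the real scheme, spend from) this address.
shared_recipient = pow(ephemeral_pk, recipient_sk, p)
assert hashlib.sha256(str(shared_recipient).encode()).hexdigest()[:40] == stealth_address
print("recipient detects the stealth address:", stealth_address)
```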

Cross-platform Avatar: a three-dimensional character better matches our imagination of identity, and a cross-platform Avatar can keep that identity unified.

Flat PFP NFTs also serve as avatars representing identity, but an NFT itself is tradable, making it hard to bind identity to the NFT. A 3D Avatar provides a visual image and, as our incarnation in the metaverse, a unique identity. Avatar NFTs like those of Ready Player Me and Lifeform let users edit their appearance and be used across platforms. Lifeform's Avatar NFT is itself an on-chain NFT, directly compatible with on-chain metaverses and even off-chain meetings on Zoom, Google, and the like, keeping identity unified. Later we can even look forward to personalized metaverses that display more of one's identity in a customized personal space.

Although Avatar NFTs are a more three-dimensional identity, current Avatar NFTs mostly display appearance alone: a single identity image that does not include the behaviors and other information that define identity. Later we can expect Avatar NFTs to display richer information through metadata or composability.

c. Key attributes of DID

Composability of identities

In fact, a DID can be more than a single identity. Just as our real identity may consist of an ID card, a driver's license, a diploma, and so on, no single credential or product form can describe our complete identity; a combination of products for different scenarios forms the full picture. Off-chain, identity information cannot flow between fragmented, siloed systems. Once data is on-chain it becomes fluid and supports different arrangements and combinations, presenting the user's identity in a more three-dimensional way. Imagine game achievement SBTs displayed as wearable medals on a user's personal Avatar NFT, or wallet-based on-chain assets and transactions earning lower interest in a lending protocol. This spans the composability of data sources, of application scenarios, and even of different DIDs.

Privacy and Security of Identity Data

At the same time, because on-chain data is publicly visible, we need to pay attention to:

Privacy: in the dark forest of the chain, disclosing identities and the bound accounts and asset information is unsafe, especially when on-chain information centers on financial transactions; exposing wallet addresses and trading behavior is very dangerous. Privacy for DID-associated asset information should therefore be a necessity. Maintaining anonymity is also a core need for many crypto-native users. We have already seen open-source primitives such as Semaphore and privacy-preserving DIDs such as Firstbatch.

Security: this covers both information security and asset security. Information security mainly means personal information, including off-chain identity, address, and education, as well as on-chain wallet addresses and transaction behavior; exposure can create personal safety issues, and on-chain authorizations are a frequent target for hacks and malicious attacks. The collection, storage, and use of personal data must therefore be handled properly and securely. Most projects today store data locally or on AWS, relying on the team's operational ability; AWS outages and endless team-management issues make us hope that such sensitive personal data can be stored in more decentralized databases that guarantee users' data security.

Censorship risk: mainly the latent risk of centrally stored and managed data. Although this is not yet a risk we commonly see in decentralized identity, possible censorship risks in the data-collection process deserve a reminder.

2. Why do we need DID

Everyone's needs differ: different risk preferences lead to different financial products, different trading habits to different trading platforms, and different aesthetics to different NFT communities. For project teams, DID helps them understand user needs and design products; for users, heavy market noise and a dazzling array of products make it harder to find what truly fits.

DID may be the shortest path between products and users. To some degree, the major web2 platforms already push "products you may like" and "tweets people nearby are reading" at maximum speed using the user portraits they hold, which is exactly why users cannot leave them. DID may be the key to delivering the same experience, and, more importantly, in a Web3 where users truly hold data sovereignty.

a. DID Value

For DID, we believe the real potential lies in:

Possibility of data payment brought by data ownership

Data ownership implies the option to charge for data. When data belongs to a centralized platform, the platform decides how to monetize the behavior and preference data users generate: the classic example is inferring a user's pregnancy from purchase behavior and selling that signal to maternity-product sellers for targeted ads. Users become the product the platform sells. We use the platform's products and services for free precisely because our behavioral data is collected and monetized, subsidizing the platform's costs and even producing its profit.

When users hold data sovereignty, they regain the choice over data transactions: sell the relevant data, or don't. Unlike a centralized platform that sits between data producers and consumers and trades on data value as a middleman, we can imagine a data marketplace connecting producers and consumers directly, say via P2P trading, or dapps directly subsidizing users for their usage data and consumption habits, creating a new dapps-pay-users business model. All of this can be realized through the user data and identity contained in DID, and it also depends on the DeFi-Lego-like composability that on-chain DID provides, discussed below.

Composability brought by data on-chain

Once data is on-chain it gains liquidity: it can be read, used, and verified through on-chain contracts or APIs, and each read or verification can be charged through the contract, just as reading a Chainlink price feed requires payment. DID may thus propose a new charging model for on-chain user information: if a dapp wants to read or verify the personal information or transaction data contained in a user's DID, it must pay first. For example, Firstbatch's SBT requires dapp payment when the contract verifies a proof of interest.

Moreover, DID can be designed to let users choose whether a dapp may read their information, or to disclose personal information selectively; for DeFi-type dapps, for instance, showing only the wallets on one chain or the transaction records of one wallet, and choosing the dapp's services on that basis.

More importantly, we may not need one large, complete identity on a single platform or dapp, but different DIDs in different scenarios: a king-rank SBT in games, an on-chain reputation as a mining OG or arbitrage master on DeFi dapps, and a reputation as a novel research master publishing NFTs on Mirror. Just as people are multi-faceted, DIDs in different scenarios can show different facets of one identity; on-chain composability lets one scenario draw on the user's other identities to open new scenarios, while users can choose which DID to show where, protecting their identity yet still obtaining the products and services they need.

b.DID Application Scenario

As identity data, the use of DID mainly comes down to how data is used or identity is verified. Dapps can use identity information to understand user needs and offer more personalized products, or to verify that users meet the dapp's requirements. Currently conceivable scenarios include:

Compliance requirements: off-chain CEXs require user KYC for compliance and security; on-chain lending protocols have begun combining traditional off-chain financial KYC data to manage risk better. These are mainly compliance-driven KYC needs such as regulation and risk control.

Proof-of-personhood authentication: the most common case is resisting Sybil attacks on whitelists and airdrops, where one person creates multiple accounts or wallet addresses to claim a whitelist spot or airdrop repeatedly. For project teams this inflates customer-acquisition costs, since the rewards captured by one user could have onboarded other new users; hence the need to verify a user's proof of personhood.

User portrait: this is closest to how DID is discussed day to day. In the web2 world, vast amounts of personal information and behavior are collected, sorted, and distilled by centralized companies to target content and ads at each user, and personalized content raises user stickiness. We think one of DID's greatest potentials is that it can be used to design, recommend, and adjust content for users, in both social and trading scenarios.

Social scenarios: although on-chain data today is mostly transactional, we believe that once EIP-4844 greatly reduces on-chain storage costs, and the many data storage, retrieval, and analysis projects we have seen go live, data-intensive social projects will appear at the application layer, delivering web2-grade experiences with targeted services driven by DID and other signals. Different tracks may take different forms:

DAO: on-chain reputation is a frequently discussed option for reflecting member contribution. But DAO work is not limited to the chain; much communication and output, such as Snapshot voting, community proposals, and offline events, happens off-chain, where VCs may be used to prove off-chain qualifications, work experience, and so on.

Credit lending: one of DID's most discussed scenarios, and a staple of traditional finance. Huabei and Jiebei (Alipay's consumer-credit products) are, in essence, credit loans backed by the asset and behavior data Alipay collects, i.e. by identity information of the kind DID provides. User information supplied by DID can likewise support more lending protocols in providing liquidity, and we have already seen some protocols begin to offer under-collateralized credit loans.

3. Web3 Identity Stack

The Web3 identity stack can be divided into six layers from bottom to top: storage layer, identification layer, private key management layer, authentication layer, authorization layer, and attestation layer. The storage and identification layers are the infrastructure that defines the DID standard; large-scale user interaction starts from the private key management layer:

To obtain a DID, the user first needs a private key management account, usually a wallet, which is a natural container for identity information.

After authentication (seed phrase/MPC login/biometrics), the user logs in to the wallet.

Different Dapps can now ask for permission to connect to wallets and read information about the account (ex: multi-chain addresses, transactions, on-chain data, and NFT assets) after obtaining the authorized signature.

Finally, VCs, SBTs, POAPs, and the like are generated and stored in the wallet to prove the user's identity.

When interacting with Dapps, the user can independently authorize which data to disclose to the project.

Ideally, ZK technology can be used throughout these steps to hide the direct connection between the user's wallet address and their identity and asset information.

a. Agents & Key management

The wallet provides a natural identity container, which is an entrance connecting Dapps and credentials held by users.

What should an ideal "DID wallet" look like? It aggregates addresses across all chains and integrates the user's fragmented data while providing basic functions such as signing, transfers, and other transactions; it can also display the various SBT/VC/PoP credentials the user owns. When dapps interact with the wallet, the user can independently authorize which data to disclose, helping users retain data sovereignty.

1) Smart Contract Wallet

Currently, most wallets such as MetaMask lack these functions, largely because they are EOA wallets that only support the most basic operations such as queries and transfers. Smart contract wallets (e.g. Unipass, Argent) are expected to realize more diverse functions, such as:

No private key: users no longer need to keep seed phrases or private keys; multiple verification methods such as biometric verification and device verification can be adopted.

Account recovery: Account recovery can be performed through biometrics, social verification, etc. to ensure that social identities will not be lost

Gasless interaction: users can use ERC-20 tokens involved in the transaction for gas payment, or directly specify a fixed account for payment without preparing ETH as gas in advance

On-chain credit lending: the account carries a complete on-chain credit record, so it can borrow against its reputation without any collateral, boosting financial liquidity in the market; what needs further consideration is how lenders can properly punish dishonest behavior.

Sub-account management: users use dedicated accounts for different scenarios under a more comprehensive account management system. For example, one account stores only ETH and acts as the gas account, paying for every other account's interactions; one account stores only blue-chip NFTs and is rarely touched; one account serves as the GameFi account. Each holds different assets and keeps its on-chain behavior data as a separate entity, preserving user privacy to the largest extent.

2) Identity Wallet

The main function of the smart contract wallet is to store and manage Crypto keys for sending/receiving assets and signing transactions; another type of identity wallet is dedicated to storing identities and allows users to make claims, which could be displayed across applications/platforms.

Polygon ID Wallet: Polygon has launched a wallet for users to manage and store Polygon ID, providing functions such as login, ZKP generation, and privacy verification; use cases are currently few, and developers are encouraged to build custom ones.

ONTO wallet: designed to help users create DIDs and fully protect user privacy through encryption, integrating on-chain and off-chain identities, NFTs, asset management, and more.

Nugget: focuses mainly on e-commerce payment scenarios; it uses biometrics to scan users' faces from multiple angles and combines ID-card scanning with optical character recognition to complete KYC identity verification, with zero-knowledge storage on-chain.

b. Authentication layer

To attract users to use web3 DID on a large scale, a more user-friendly, low-threshold, and secure authentication method would be essential.

Sign-In with Ethereum

Sign-In with Ethereum (SIWE) is an authentication standard pioneered by Spruce, ENS, and the Ethereum Foundation. SIWE standardized a message format (similar to a JWT) for users to log in to services with blockchain-based accounts. On this basis, Sign-In with X (CAIP-122) generalizes the standard to operate across blockchains, making SIWE its Ethereum-centered implementation.
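
To make the message format concrete, here is a minimal composer for the EIP-4361 message layout; the domain, URI, and nonce below are placeholders, and production apps should use a vetted SIWE library rather than hand-rolling this.

```python
# Minimal composer for the EIP-4361 (Sign-In with Ethereum) message format.
# The wallet signs this plain-text message; the server verifies the signature.
from datetime import datetime, timezone

def siwe_message(domain, address, statement, uri, chain_id, nonce):
    issued_at = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return (
        f"{domain} wants you to sign in with your Ethereum account:\n"
        f"{address}\n\n"
        f"{statement}\n\n"
        f"URI: {uri}\n"
        f"Version: 1\n"
        f"Chain ID: {chain_id}\n"
        f"Nonce: {nonce}\n"
        f"Issued At: {issued_at}"
    )

print(siwe_message("app.example.org", "0x" + "ab" * 20,
                   "Sign in to Example", "https://app.example.org", 1, "32891756"))
```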

Unipass

Unipass has successfully developed an on-chain solution for email login and deployed ZK to hide the relationship between an email account and a wallet address. By contrast, web3auth's email login is more centralized than an on-chain structure, and its private key is stored in the browser's front-end storage, which raises account-security concerns.

Hexlink

As an identity oracle, Hexlink encrypts the user's identity data and transmits it to an identity server; the server decrypts the data and constructs a Merkle proof with a zero-knowledge proof to verify the user's identity. It is compatible with both web and mobile applications, and supports web2 authentication, social accounts, web3 EOA accounts, and AA abstract wallets, providing a zero-threshold web3 account experience.
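
As a sketch of the kind of Merkle-proof machinery such an identity oracle relies on, the following self-contained Python builds a root over a set of users and verifies one member against it; the hashing details are simplified relative to any production implementation.

```python
# Build a Merkle root over a committed set and verify one member's inclusion.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(l) for l in leaves]
    while len(level) > 1:
        if len(level) % 2:                   # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    level, proof = [h(l) for l in leaves], []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1                  # sibling shares the same parent
        proof.append((level[sibling], sibling < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sibling, is_left in proof:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root

users = [b"alice", b"bob", b"carol", b"dave"]
root = merkle_root(users)
proof = merkle_proof(users, 1)
print(verify(b"bob", proof, root))           # True: bob is in the committed set
```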

Notebook

A user-identity KYC + login solution: after registering a Notebook account and completing biometric verification, a ZKP can be generated to prove the user's identity. It has recently been working with Near on Notebook Auth integration.

c. Access control & Authorization

Authentication solutions help confirm the identity of the user, while access control solutions determine which resources an entity should be able to access and what they are allowed to do with those resources.

1) Access Control Management

Communities can programmatically grant access rights based on a visitor's token holdings, on-chain activity, or social verification, instead of assigning access rights manually.
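
A minimal sketch of such rule-based gating, with `fetch_holdings` as a hypothetical stub standing in for a real RPC/indexer query; the thresholds are invented for illustration.

```python
# Programmatic, rule-based access control: access is derived from verifiable
# on-chain facts rather than assigned by hand. fetch_holdings is a stub.
from dataclasses import dataclass

@dataclass
class Visitor:
    address: str
    token_balance: float      # governance-token balance
    nft_count: int            # NFTs from the community collection
    tx_count: int             # historical on-chain activity

def fetch_holdings(address: str) -> Visitor:
    # Stand-in for a real chain lookup via an RPC node or indexer.
    return Visitor(address, token_balance=120.0, nft_count=1, tx_count=340)

def can_enter(v: Visitor) -> bool:
    # Example policy: hold >= 100 tokens, or 1 NFT plus a real activity history.
    return v.token_balance >= 100 or (v.nft_count >= 1 and v.tx_count >= 50)

visitor = fetch_holdings("0xabc...")
print("access granted" if can_enter(visitor) else "access denied")
```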

Lit

An MPC-based key management and access control protocol, in which a public/private key pair is represented by a PKP (programmable key pair) NFT. When the specified conditions are met, the PKP's owner can trigger the network to aggregate key shares, allowing them to decrypt documents or sign messages. Potential use cases: token-gated meetings and Gather Town spaces, live streaming, and Google Drive access.

Collab.land

A Discord bot that automates membership management based on token ownership.

2) User authorization management

When interacting with Dapps, users should also be able to independently authorize which data to disclose, and manage permissions of each Dapp through a unified interface.

EverRise wallet

EverRise launched a new product, EverRevoke, to help users manage authorizations. EverRevoke aggregates token and NFT authorization status across five blockchains, and users can review and revoke their token approvals at any time.

Authorization management is expected to improve further with AA accounts: developers can define distinct read/write and asset permissions for each private key, or set minimum transaction thresholds and automatic trigger conditions for transactions.
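
As a sketch of what per-key permissioning could look like, consider a policy object checked before each action; the structure and field names are hypothetical, not any particular wallet's implementation.

```python
# Hypothetical per-key permission policy for an AA-style account: each session
# key gets its own scopes, spend limit, and allowed targets.
from dataclasses import dataclass, field

@dataclass
class KeyPolicy:
    can_read: bool = True
    can_transfer: bool = False
    max_value_eth: float = 0.0
    allowed_contracts: set = field(default_factory=set)

def authorize(policy: KeyPolicy, action: str, value_eth: float, target: str) -> bool:
    if action == "read":
        return policy.can_read
    if action == "transfer":
        return (policy.can_transfer
                and value_eth <= policy.max_value_eth
                and target in policy.allowed_contracts)
    return False

game_key = KeyPolicy(can_transfer=True, max_value_eth=0.05,
                     allowed_contracts={"0xGameItem"})
print(authorize(game_key, "transfer", 0.01, "0xGameItem"))  # True
print(authorize(game_key, "transfer", 1.00, "0xGameItem"))  # False: over limit
```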

d. Attestation

Attestation solutions help wallet addresses produce their "proofs", "reputation", and "credentials". Built on verifiable on-chain data, this reputation is not confined to any one platform. We believe the attestation layer sits closest to crypto users and thus sustainably generates valuable data, giving it the most potential to become a unified DID layer.

1) Data attestations

The defining feature of a data attestation protocol is that it attracts users to generate new data. With the traction gained from user data, it can naturally evolve into a universal identity standard.

Galxe

Galxe ID labels each address by issuing credentials and integrates user data across chains. Data sources span Web2 and Web3; there are currently more than 4.2 million Galxe ID users.

In H1 2023, Galxe will release its new product, Galxe Protocol, a permissionless data protocol. Developers will be able to access the data Galxe generates and build applications on its ecosystem.

Galxe is planning to launch a standard system for attestations, including storage modules, verification modules, usage modules, and privacy management.

Lens Protocol — Social Gardens Emerge

Lens Protocol is a decentralized social graph protocol on Polygon launched by the founding team of Aave. On the bet on Web3 social, Aave founder Stani has publicly stated that the ultimate goal is to combine Web3 social with DeFi, turning the social graph into a trust graph and providing low-collateral loans through Aave.

Lens has so far run the LFGrow hackathon and two rounds of grants, reaching more than 100,000 users in total, with many innovative projects including Lenster, Phaver, Orb, SteamDAO, etc.

Farcaster — Decentralized Trusted Neutral Social Protocol

An open social protocol supporting decentralized clients. Farcaster adopts a hybrid on-chain + off-chain architecture in which identities are stored on Ethereum, using Ethereum to ensure security, composability, and consistency.

Identity is controlled through Ethereum addresses, and off-chain messages are signed by Ethereum accounts. User data is signed by the identity, encrypted, and stored on servers controlled by users (Farcaster Hubs). The data is kept off-chain because settlement costs on most L1 and L2 networks are too high and confirmation is too slow.

Nostr - an information relay protocol banned by Twitter

Nostr stands for "Notes and Other Stuff Transmitted by Relays", a relay protocol for social media information. Launched in November 2020 as general-purpose Internet infrastructure, it is still in its early stages.

Nostr has two components: clients and relays. The client signs and verifies messages and is run by the user; the relay delivers messages and can be run by anyone who sets up a server. The client stores the user's key locally, and every message must be signed before being sent to a relay. Relays do not alter data, and message authenticity is verified by clients, so users do not need to trust relays, which better fits the spirit of decentralization.
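
A minimal sketch of this trust model: the client signs, the relay merely forwards, and any receiving client verifies. Real Nostr uses Schnorr signatures over secp256k1; Ed25519 from the third-party `cryptography` package stands in here for brevity.

```python
# Client/relay trust model in the Nostr spirit: sign at the client, forward at
# the relay, verify at the receiving client. (pip install cryptography)
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

author_key = Ed25519PrivateKey.generate()

event = json.dumps({"kind": 1, "content": "hello nostr"}).encode()
signature = author_key.sign(event)

# Relay: stores/forwards (event, signature) without modifying or vouching.
received_event, received_sig = event, signature

# Receiving client: verifies against the author's public key, so the relay
# never needs to be trusted.
try:
    author_key.public_key().verify(received_sig, received_event)
    print("event is authentic")
except InvalidSignature:
    print("event was tampered with in transit")
```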

Nostr originated in the BTC community and has close ties to the web3 community; token incentives for storage may be released in the future. On the marketing side, Jack Dorsey has carried the banner, though Nostr links were for a time officially banned by Twitter.

2) Domain Name Service

Both domain names and wallets can be regarded as identity containers: a wallet can use a domain name in place of the account name as its "name" when interacting with applications, while a domain name can integrate multiple chain addresses or even multiple wallet accounts, making it a natural identity aggregator.

ENS

With 1.8 million registrations, it is working with SpruceID to advance EIP-4361: Sign In With Ethereum.

If the proposal lands, this will replace the Connect Wallet function, making the domain name the entrance to Web3 above the wallet address. ENS also hopes to fulfill its "Web3 name" vision by integrating a series of identities into the domain name.

Space ID

Space ID provides registration services for .bnb domain names. It also aims to link .bnb domains with users' multiple on-chain addresses, Twitter, and other Web2 accounts, becoming a Web3 universal domain.

3) 3D Visual Identity

People always pursue upgraded experiences, so we can reasonably expect a gradual evolution from 2D identity to 3D identity; the immersive, playable metaverse will beat web-page interaction by a wide margin.

Philand

Through Phi, users can visualize their on-chain behavior, receiving components for on-chain actions (donations, protocol interactions, etc.) and building up their reputation world like Lego blocks. Users who own ENS domains automatically generate land and can compose their web3 worlds from their on-chain behavior and social networks.

Phi previously won the NFT HACK 2022 and BuildQuest hackathons at ETH Global, received a grant from Uniswap, and was a grand winner of the Lens hackathon.

Lifeform - 3D visual DID

Lifeform is a 3D digital identity solution provider backed by Binance's seed-round investment, focused on integrated solutions for visual DID. Core products include a hyper-realistic 3D virtual human editor, a visual DID protocol, a secure smart contract suite, a native decentralized identity SaaS service, and a metaverse engine suited to Web3 cross-application development.

4) Other Attestations

Unirep

Unirep is a non-repudiable reputation protocol built on Ethereum. Users can anonymously give others positive or negative reputation, which recipients cannot refuse (non-repudiation), and can voluntarily prove they hold a certain amount of reputation via ZK without disclosing the exact figure. The solution can also be abstracted into smart contracts, using blockchain composability to provide the cryptographic scheme to other projects.

Firstbatch

Uses AI to analyze web2 data and formulate data labels, verifying a user's interests while protecting privacy through ZK. Use cases include: 1. content recommendation; 2. offline events; 3. permissioned activities.

Aspect AI

The LinkedIn for web3 builders. It uses AI algorithms to process, classify, and label GitHub user data, generating a portrait for each GitHub user. Developer identities may have more utility than general-purpose identities: online job hunting, code contribution management, offline hackathons, developer communities, and so on.

e. Privacy

We cannot discuss identity infrastructure without the technical primitives for privacy, because privacy matters at every layer of the identity stack. Over the past decade, blockchain adoption has accelerated the development of cryptographic primitives such as ZK proofs.

Iden3

A zk-native identity protocol providing a ZK framework and open-source libraries for ZK identity primitives, authentication, and proof-of-claims generation. The protocol uses the Baby Jubjub elliptic curve to generate a key pair for each identity; the curve is designed to work efficiently with zk-SNARKs, which prove identity ownership and claims in a privacy-preserving way. Polygon ID is currently leveraging this protocol to build its identity wallet.

Polyhedra

A full-stack ZK protocol whose cross-chain bridge, zkBridge, can solve multichain identity management. Polyhedra has launched the zkBridge relay network with two-layer recursive proofs, securing the bridge while cutting time and gas fees. It has also built a DID container for GameFi projects to package a player's assets under one NFT and transfer them in one click. In the future it plans to make user interaction simple or even invisible.

Semaphore

A zero-knowledge protocol for Ethereum developers: users can prove group membership and send signals such as votes or endorsements without revealing their identity. Semaphore is not a user-facing application; it aims to give Ethereum developers powerful, simple tools for building dapps.

Application-level zkp has also received much attention:

Private airdrops: Stealthdrop

Anonymous communication: heyanon

Anonymous voting: Melo

KYC: Zkpass, Outdid

4. Identity will be the next entry point for web3

Take a look at the development of Web2 portals. It evolves from portals — search engines — PC apps — mobile apps.It’s clear that user are transferring from single-function, passive, complex to comprehensive portal to active, simple onboarding tools.Driven by humanity, users will always tend to choose a one-stop integrated entrances as touchpoint to explore crypto world.

We optimistically predict that this traffic portal will have the following characteristics:

Universal identity: a unified, interoperable identity layer that portrays all of a user's on-chain interactions

Integrated frontend: we believe one or several WeChat-like super account systems will emerge, and the most reasonable form for such a system is the wallet

Decentralized backend: relying on blockchain composability, the wallet's backend can integrate social, game, DeFi, identity, and payment protocols for users' safe use

Privacy protection: Multiple privacy middleware should be combined to provide powerful end-to-end communication encryption, identity authorization and personal information protection.

Diverse security settings: as ZK and AA develop, better security mechanisms can be invented and put into practice; various key management schemes (MPC/multi-sig/no mnemonic) should be available so different kinds of customers can choose what suits them

User-owned data: Personal information/social data/identity credentials are stored in decentralized storage solutions such as Arweave or IPFS, and are only controlled by the user’s own key.

Native Commercialization: Support native web3 payment, simplified cross-chain interaction and gas fee payment.

Anti-censorship: users will no longer be banned by centralized institutions such as Meta or local governments.

About Foresight Ventures

Foresight Ventures is dedicated to backing the disruptive innovation of blockchain for the next few decades. We manage multiple funds: a VC fund, an actively-managed secondary fund, a multi-strategy FOF, and a private market secondary fund, with AUM exceeding $400 million. Foresight Ventures adheres to the belief of “Unique, Independent, Aggressive, Long-Term mindset” and provides extensive support for portfolio companies within a growing ecosystem. Our team is composed of veterans from top financial and technology companies like Sequoia Capital, CICC, Google, Bitmain and many others.

Website: https://www.foresightventures.com/

Twitter: https://twitter.com/ForesightVen

Medium: https://foresightventures.medium.com

Substack: https://foresightventures.substack.com

Discord: https://discord.com/invite/maEG3hRdE3

Linktree: https://linktr.ee/foresightventures
EIP-4844: The first step towards Ethereum full sharding

You may have noticed that almost every developer is participating in and retweeting the KZG Ceremony. So what is the KZG Ceremony? In simple terms, the KZG Ceremony is the trusted setup for EIP-4844's KZG commitment, and EIP-4844 is the prerelease of Ethereum full sharding.

1. Sharding: A long-term solution for Ethereum scaling

While rollups scale Ethereum at the execution layer, sharding improves Ethereum's scalability and capacity from the perspective of data availability. The average block size has fluctuated around 90 kB despite Ethereum's rapid iteration in recent years. Although rollups relieve network congestion notably, overall performance is still restricted by Layer 1's data-storage capacity.

In consideration of security and implementation complexity, sharding is divided into multiple phases, including proto-danksharding and danksharding; the whole process could take several years. Under the current storage schema, only machines with high-performance hardware can participate as nodes. After sharding is implemented, nodes will no longer be required to store the full historical data, which strengthens Ethereum's security by lowering the threshold for running a node (lower data-storage cost, higher decentralization).

2. EIP-4844: Remarkable short-term return, a prerelease of Ethereum full sharding

EIP-4844 = Proto-Danksharding. Since the complete implementation of sharding is still too complex and could take years, proto-danksharding is the best intermediate plan to reduce Ethereum congestion in the short term.

2.1 Proto-danksharding summary

Proto-danksharding introduces a new transaction type called the blob-carrying transaction. With this update, rollups can use blobs to transfer data to L1 and store it provisionally at a relatively low cost. A blob is much larger than today's calldata.

About blobs:

Each transaction can carry at most 2 blobs.

Each block normally carries 8 blobs, about 1 MB in total; a block can carry at most 16 blobs, i.e. a 2 MB block size.

A blob is not stored permanently in the history log the way calldata is.

In the design of proto-danksharding, nodes still need to download the full data content and verify data availability.

2.2 Blob-carrying transactions in depth

Functionality: a data blob works like calldata, letting a rollup transfer transaction data and proofs to L1.

Cost: the original intention of the blob is to support high TPS on rollups. Unlike calldata, which uses on-chain storage, blob data is only downloaded and stored for a period of time, so the gas rollups spend to ensure data availability will be predictably lower.

Capacity: each blob is about 125 kB.

2.3 The value and challenge of blob-carrying transactions

Value: blobs turn transaction data into a kind of cache, which further lowers nodes' storage hardware requirements and reduces gas fees by giving Ethereum extra data storage.
Challenge: let's calculate the hardware requirement. The current block size is around 90 kB, but a single blob can reach 125 kB. According to the design of EIP-4844, each block normally adds about 1 MB, so the total data growth can be calculated as follows:

1 MB/block * 5 blocks/min * 43,200 min/month * 12 months/year ≈ 2.47 TB per year

The annual data increment alone is far more than today's total Ethereum data, so this naive storage plan is not efficient. What can be optimized? In the short term, each node still stores the full historical data, but the consensus layer deletes blob data after a certain period (30 days or 1 year, TBD). For the long term, EIP-4444 needs to be implemented: nodes will no longer be required to store full data, and will instead store only parts of the data for a certain time under a so-called history-expiry scheme.

2.4 KZG Commitment

The KZG commitment is the polynomial commitment scheme adopted by EIP-4844 proto-danksharding. The KZG Ceremony is the trusted-setup process for the KZG commitment, and it has attracted more than 30,000 participants.

2.4.1 What is a KZG commitment

KZG abbreviates Aniket Kate, Gregory M. Zaverucha, and Ian Goldberg, who published the polynomial-commitment paper "Constant-Size Commitments to Polynomials and Their Applications" in 2010. The KZG commitment is widely applied in PLONK-style zk-SNARK protocols.

Referring to the diagram in Dankrad's presentation: the KZG root is similar to a Merkle root, except that the KZG root commits to a polynomial, with every position lying on that polynomial. In the proto-danksharding scenario, the KZG root commits to a data set, where every single data point can be verified as part of the entire set.

A quick view of how the KZG commitment works internally:

Prover: responsible for calculating the commitment. For security, a prover cannot modify the given polynomial, and the commitment is valid only for the current polynomial.

Verifier: responsible for verifying the commitment sent by the prover.
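
The algebra underneath KZG evaluation proofs can be checked numerically: if f(z) = y, then f(x) - y is divisible by (x - z), and the quotient q(x) is what the prover commits to. Below is a toy check of this identity over a small prime field, with no pairings or trusted setup, purely for illustration.

```python
# Toy check of the polynomial identity behind KZG evaluation proofs:
# f(x) - y = q(x) * (x - z) whenever f(z) = y. Coefficients live in a small
# prime field; real KZG hides f(s) and q(s) inside elliptic-curve points.
P = 101  # toy field modulus (illustrative only)

def poly_eval(coeffs, x):
    # coeffs[i] is the coefficient of x**i
    return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P

def poly_divide_by_linear(coeffs, z):
    # Synthetic division of f(x) - f(z) by (x - z); the remainder is f(z).
    q = [0] * (len(coeffs) - 1)
    carry = 0
    for i in range(len(coeffs) - 1, 0, -1):
        carry = (coeffs[i] + carry * z) % P
        q[i - 1] = carry
    return q

f = [5, 0, 3, 7]          # f(x) = 7x^3 + 3x^2 + 5
z = 4
y = poly_eval(f, z)
q = poly_divide_by_linear(f, z)

# Verifier-style check at a random point r: f(r) - y == q(r) * (r - z)
r = 23
assert (poly_eval(f, r) - y) % P == (poly_eval(q, r) * (r - z)) % P
print("quotient identity holds: evaluation proof is consistent")
```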
No extra proof is needed, which leads to a lower cost and remits the requirement of bandwidth. Even lower cost taking advantage of the point evaluation precompile. Security If the failure occurs, only the blob corresponding to the current commitment is infected, and there is no further chain effect. Compatibility The KZG commitment is more friendly to DAS, which avoids redundancy in development. 2.5 The benefit of EIP-4844 Rollup As shown in the picture below, rollup needs to submit the state delta and the versioned hash of KZG commitment through calldata(zk-rollup still needs to upload the zkp). After the implementation of EIP-4844, the expensive calldata only carries some small data such as state delta and commitments, while the large data like transaction batch is put into the blob. reduce the cost; reduce the block storage space usage. Improvement of security Data availability: Blob is stored in the beacon chain, which shares the same security as Ethereum L1. Historical data: nodes only store blobs for a certain amount of time, and layer 2 rollup is responsible for the permanent data storage, which indicates that the security of historical data relies on the rollup. Cost The low-cost feature of the blob-carrying transaction can optimize the overall cost by x10 to x50. Meanwhile, EIP-4844 introduces a blob fee Gas and blob will have separate adjustable gas prices and limits; The price unit of a blob is gas, the gas amount will float according to the network traffic, which aims to maintain the number that each block carries(8 on average). Implementation of precompile EVM execution can only view the commitment of a blob generated by the prover, and cannot access blob data directly. Therefore, rollup needs to use precompile scheme to verify the validity of commitment. There are two precompile algorithms mentioned in EIP-4844 Point evaluation precompile Prove that multiple commitments are committed to the same set of data. Point evaluation precompile is mainly adopted by zk-rollup, rollup needs to provide two commitments, KZG commitment and zk-rollup commitment As for optimistic rollup, most of them have adopted multi-round fraud-proof, and the final round fraud-proof holds a smaller data size, which means they can also use point evaluation precompile for a lower cost. Blob verification precompile Prove that the versioned hash is valid for the corresponding blob Optimistic rollup needs access to full data when submitting fraud-proof, so it is rational to verify the validity of the versioned hash and then the fraud-proof verification. 3. Danksharding: A crucial step towards full sharding Scaling Thanks to the new transaction type design of proto-danksharding, which introduces a data blob, each block now has an extra 1MB cache. This number will grow 16 to 32 times after the implementation of danksharding. Data availability: High-performance data storage and verification Compared to proto-danksharding, where nodes are required to store the full content of historical data, danksharding allows nodes to only store data after sampling. DAS Taking advantage of erasure coding technology, danksharding proposal makes it easier (each node only needs to download parts of data) for nodes to discover the loss of data. Security: Almost the same Since nodes are no longer required to store the full content of historical data, the security is not backed by only a single node but depends on multiple nodes which store parts of data and can be further composed and recover the full data. 
Although a single-point dependence scheme is more secure than a multi-point dependency, the number of nodes in the Ethereum network is far more than enough, which is qualified to achieve the goal of ensuring data availability. New challenge: the higher requirement for block builders While validators only download and store parts of full data, the block builder still needs to upload the full content of data which is the blob that contains all transaction data. According to the diagram from Dankrad’s slides, we can see how PBS(proposer/builder separation), which is originally designed for anti-MEV, helps with reducing the bandwidth requirement during block building. 4. Another sharding scheme: dynamic state sharding from Shardeum Shardeum is an EVM-compatible L1 blockchain, which uses dynamic state sharding to improve scalability and security. Meanwhile, the shardeum network is able to ensure a higher level of decentralization. Dynamic state sharding Advantages The most intuitive benefits of dynamic state sharding are linear scaling. Each node holds a different address range, and there is a significant overlap between the addresses covered by nodes. The sharding algorithm groups nodes dynamically, which means newly added nodes in the Shardeum network work immediately to increase the TPS Implementation The complexity of implementing dynamic state sharding is more difficult than static sharding. Shardeum’s technical team has deeply researched sharding technologies. The previous R&D achievements made by the Shardeum team(previously Shardus technology) also make significant contributions, which is able to showcase the linear scaling of dynamic state sharding in an early development stage. Summary Product Referring to the idea of divide and conquer, the Shardeum dynamic state sharding split the workload of calculation and storage, which allows a higher level of parallelization. Therefore, the network is able to accommodate more nodes, which further improves the throughput and level of decentralization. Team The Shardeum team has strong marketing experience and narrative ability. They also have a deep understanding of the tech detail, especially dynamic state sharding. Technology The tech team is able to design an appropriate sharding scheme and an efficient consensus algorithm(Proof of Stake+ Proof of Quorum)based on their understanding of the scenario, which puts the scaling and throughput as the first consideration and ensures the security and level of decentralization as far as possible. Progress Betanet launched on 2023–02–02. 5. The outlook Sharding is a long-term scaling solution for Ethereum, it has a huge value and profound significance for the whole network. It is worse to give close attention, as the implementation of sharding is a process of iteration. All the current proposals, including proto-danksharding and danksharding, can be upgraded/altered. While understanding the general method of implementing sharding is important, the technical proposals, such as PBS, DAS, multidimensional fee market, etc, that emerge during the process are also worth paying attention to. There could be many outstanding projects accompanying those schemes. It is important to know that sharding is a general term that describes a set of scaling technologies, and there are different application schemes depending on specific scenarios. 
For example, the design of danksharding might only fit Ethereum, and could likely lead to a negative effect if applied in other L1s, as the security needs to be guaranteed by a huge amount of nodes in the network. A rational combination of sharding and other scaling solutions can achieve a multiplying effect. The current danksharding proposal will not work alone. Instead, rollups and danksharding supplement each other to better improve Ethereum’s scalability and capacity. Reference https://notes.ethereum.org/@dankrad/kzg_commitments_in_proofs https://notes.ethereum.org/@dankrad/new_sharding https://vitalik.ca/general/2022/03/14/trustedsetup.html https://notes.ethereum.org/@vbuterin/proto_danksharding_faq#Why-use-the-hash-of-the-KZG-instead-of-the-KZG-directly https://ethresear.ch/t/easy-proof-of-equivalence-between-multiple-polynomial-commitment-schemes-to-the-same-data/8188 https://dankradfeist.de/ethereum/2020/06/16/kate-polynomial-commitments.html https://eips.ethereum.org/EIPS/eip-4844 https://www.eip4844.com/ https://biquanlibai.notion.site/Data-Availability-caa896aae59d489b98f2448f17b01640 https://ethresear.ch/t/a-design-of-decentralized-zk-rollups-based-on-eip-4844/12434 About Foresight Ventures Foresight Ventures is dedicated to backing the disruptive innovation of blockchain for the next few decades. We manage multiple funds: a VC fund, an actively-managed secondary fund, a multi-strategy FOF, and a private market secondary fund, with AUM exceeding $400 million. Foresight Ventures adheres to the belief of “Unique, Independent, Aggressive, Long-Term mindset” and provides extensive support for portfolio companies within a growing ecosystem. Our team is composed of veterans from top financial and technology companies like Sequoia Capital, CICC, Google, Bitmain, and many others. Website: https://www.foresightventures.com/ Twitter: https://twitter.com/ForesightVen Medium: https://foresightventures.medium.com Substack: https://foresightventures.substack.com Discord: https://discord.com/invite/maEG3hRdE3 Linktree: https://linktr.ee/foresightventures

EIP-4844: The first step towards Ethereum full sharding

You may have noticed that almost every developer is participating in and retweeting the KZG Ceremony, so what is the KZG Ceremony?

In simple terms, the KZG Ceremony is the trusted setup for the KZG commitment scheme used in EIP-4844, and EIP-4844 is a precursor to Ethereum full sharding.

1. Sharding: A long-term solution for Ethereum scaling

While rollups scale Ethereum from the execution layer, sharding improves Ethereum’s scalability and capacity from the perspective of data availability.

Block-size data shows that the average block size has fluctuated around 90 kB despite Ethereum's rapid iteration over recent years. Although rollups relieve network congestion noticeably, overall performance is still restricted by Layer 1's data-storage capacity.

To balance security against implementation complexity, sharding is divided into multiple phases, including proto-danksharding and danksharding. The whole process could take several years.

Under the current storage scheme, only machines with high-performance hardware can participate as nodes. Once sharding is implemented, nodes will no longer be required to store the full content of historical data, which strengthens Ethereum's security by lowering the threshold for running a node (lower data-storage costs and a higher degree of decentralization).

2. EIP-4844: Remarkable short-term gains, a precursor to Ethereum full sharding

EIP-4844 = Proto-Danksharding

Since the complete implementation of sharding remains too complex and could take years, proto-danksharding is the most practical intermediate plan for reducing Ethereum congestion in the short term.

2.1 Proto-danksharding Summary

Proto-danksharding introduces a new transaction type called the blob-carrying transaction. Thanks to this update, rollups can use blobs to transfer data to L1 and store it there temporarily at a relatively low cost. A blob can carry far more data than typical calldata does today.

About blob

Each transaction can carry at most 2 blobs.

Each block targets 8 blobs, roughly 1 MB of blob data.

A block can carry at most 16 blobs, which leads to roughly 2 MB of blob data.

Unlike calldata, a blob is not stored permanently as part of the history log.

In the design of proto-danksharding, nodes still need to download the full blob content in order to verify data availability.

2.2 Blob-carrying transaction in depth

Functionality

The functionality of a data blob is similar to calldata: it lets a rollup transfer transaction data and proofs to L1.

Cost

The original intention of the blob is to support high TPS in rollups. Unlike calldata, which is stored on-chain permanently, blob data is only downloaded and stored for a limited period. The gas rollups spend to ensure data availability is therefore predictably lower.

Capacity

The size of each blob is about 125 kB.

2.3 The value and challenge of blob-carrying transaction

Value

The introduction of blobs effectively turns transaction data into a kind of cache, which lowers the storage-hardware requirement for nodes and reduces gas fees by giving Ethereum extra data-storage capacity.

Challenge: Let’s calculate the hardware requirement

Consider that the current average block size is around 90 kB, while a single blob can reach 125 kB.

According to the design of EIP-4844, each slot normally carries 1 MB of blob data, so the total can be estimated as follows:

1 MB/block * 5 blocks/min * 43,200 min/month * 12 months/year ≈ 2.47 TB per year

This annual increment alone is far larger than the entire Ethereum chain today, which shows that this naive storage plan is not sustainable.
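As a quick sanity check on that arithmetic (a sketch: the 5 blocks/min rate follows from Ethereum's 12-second slots, and the figure lands on ≈2.47 only if the units are read as binary MiB/TiB):

```python
# Rough yearly blob-storage growth if every block carries the 1 MiB target
# (8 blobs x ~125 KiB) and nothing is ever pruned.
MiB = 2**20
blocks_per_min = 60 // 12                          # one block per 12-second slot
mib_per_year = 1 * blocks_per_min * 43_200 * 12    # MiB/block * min/month * months
print(mib_per_year * MiB / 2**40)                  # ~2.47 TiB per year
```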

What can be optimized?

In the short term, each node still stores the full content of historical data, but the consensus layer implements a scheme under which blob data is deleted after a certain period of time (30 days or 1 year, TBD).

For the long term, EIP-4444 needs to be implemented: nodes would no longer be required to store the full history. Instead, under a so-called history-expiry scheme, nodes keep only parts of the data, and only for a certain time (see the sketch below).
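As a minimal sketch of what consensus-layer blob pruning could look like (hypothetical node-side code; the retention constant and store layout are illustrative assumptions, since the real window was still TBD at the time of writing):

```python
import time

BLOB_RETENTION_SECONDS = 30 * 24 * 3600   # assume the 30-day option; TBD in the EIP

def prune_expired_blobs(blob_store: dict) -> None:
    """Drop blobs older than the retention window; the history itself stays
    available elsewhere (rollups, archives) under the history-expiry scheme."""
    now = time.time()
    for versioned_hash, (received_at, _blob) in list(blob_store.items()):
        if now - received_at > BLOB_RETENTION_SECONDS:
            del blob_store[versioned_hash]
```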

2.4 KZG Commitment

KZG commitment is the polynomial commitment scheme adopted by EIP-4844 proto-danksharding.

The KZG Ceremony is the trusted-setup process for the KZG commitment, and it has attracted more than 30,000 participants.

2.4.1 What is KZG commitment

KZG abbreviates Aniket Kate, Gregory M. Zaverucha, and Ian Goldberg, who published the polynomial-commitment paper “Constant-Size Commitments to Polynomials and Their Applications” in 2010. KZG commitments are widely applied in PLONK-style zk-SNARK protocols.

Referring to the diagram from Dankrad's presentation, the KZG root is similar to a Merkle root, except that the KZG root commits to a polynomial: every position lies on that polynomial. In the proto-danksharding scenario, the KZG root commits to a data set in which every single data point can be verified as part of the whole.

A quick view of how KZG commitment works internally

Prover: responsible for computing the commitment. For security, the prover cannot modify the given polynomial, and the commitment is valid only for that exact polynomial;

Verifier: responsible for checking the commitment sent by the prover.
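To make the prover's role concrete, here is a toy numeric sketch of the commitment step. It uses a small multiplicative group instead of the real BLS12-381 pairing curve, so it illustrates only the algebra (committing to f by evaluating it "in the exponent" at the setup secret s), not actual security or the opening-proof verification:

```python
# Toy KZG-style commitment over the multiplicative group mod a small prime.
p, g = 101, 2            # toy modulus and generator (NOT secure parameters)
s = 17                   # trusted-setup secret; in reality nobody knows this

# Public SRS from the ceremony: g^(s^0), g^(s^1), g^(s^2), ...
srs = [pow(g, pow(s, i, p - 1), p) for i in range(4)]

def commit(coeffs):
    """Commitment C = g^f(s), computed from the public SRS alone (no s needed)."""
    c = 1
    for coeff, base in zip(coeffs, srs):
        c = c * pow(base, coeff, p) % p
    return c

f = [3, 1, 4, 1]                                   # f(x) = 3 + x + 4x^2 + x^3
C = commit(f)                                      # constant-size commitment

# Sanity check: C really equals g^f(s) (possible only because this toy knows s)
f_at_s = sum(c * pow(s, i, p - 1) for i, c in enumerate(f)) % (p - 1)
assert C == pow(g, f_at_s, p)
```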

2.4.2 KZG Ceremony (trusted setup)

The process of the KZG Ceremony

Anyone can join the KZG Ceremony as a participant and contribute a secret. Each newly added secret is mixed with the previous output to form a new result, ultimately generating an SRS (structured reference string) for the KZG commitment trusted setup. (See the diagram provided by Vitalik for a better understanding.)

Trust setup

The KZG Ceremony is a widely used form of multi-participant trusted setup called powers-of-tau;

This setup follows the 1-of-N trust model: no matter how many participants contribute to generating the final setup, as long as one participant keeps their secret private (and then discards it), the validity of the setup is guaranteed.
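Continuing the toy group from the sketch above, one ceremony contribution looks roughly like this (illustrative only): each participant re-randomizes the published SRS with their own secret t, which composes multiplicatively into the final secret, so the setup stays sound as long as any one t is destroyed:

```python
p, g = 101, 2
srs = [pow(g, pow(17, i, p - 1), p) for i in range(4)]   # initial toy SRS

def contribute(srs, t):
    """Mix secret t into the SRS: g^(s^i) -> (g^(s^i))^(t^i) = g^((s*t)^i).
    Each participant must discard t afterwards."""
    return [pow(point, pow(t, i, p - 1), p) for i, point in enumerate(srs)]

# Three participants: the effective secret becomes 17*5*7*11 (mod p-1), and it
# stays unknowable unless ALL participants leak their t -- the 1-of-N model.
for t in (5, 7, 11):
    srs = contribute(srs, t)
```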

Significance of the KZG Ceremony

The value of the KZG commitment's trusted setup can be summarized as follows: it generates a parameter that is required for every single execution of the cryptographic protocol.

When the prover calculates a commitment, C = f(s)·G1, where f is the committed polynomial, s is the secret point fixed by the trusted setup, and G1 is the group generator. The final secret underlying the current KZG Ceremony is therefore crucial to the subsequent implementation of sharding.

2.4.3 Advantage of the KZG Commitment

Cost

KZG commitments have low complexity and can be verified efficiently.

No extra proof data is needed, which lowers costs and eases bandwidth requirements.

Costs drop even further by taking advantage of the point evaluation precompile.

Security

If a failure occurs, only the blob corresponding to the affected commitment is impacted; there is no knock-on effect on other blobs.

Compatibility

The KZG commitment is friendly to DAS (data availability sampling), which avoids redundant development work later.

2.5 The benefit of EIP-4844

Rollup

As shown in the picture below, under EIP-4844 a rollup submits the state delta and the versioned hash of the KZG commitment through calldata (a zk-rollup additionally uploads its validity proof).

After the implementation of EIP-4844, expensive calldata carries only small items such as the state delta and commitments, while large data such as the transaction batch goes into the blob. This:

reduces cost;

reduces block storage usage.
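Schematically, the split looks like this (a simplified view of the new transaction type; field names are abridged from the EIP, and this is a sketch rather than the exact spec encoding):

```python
from dataclasses import dataclass

@dataclass
class BlobCarryingTx:
    to: bytes                           # the rollup's inbox contract on L1
    data: bytes                         # calldata: state delta, commitments, (zkp)
    max_fee_per_blob_gas: int           # bid in the separate blob fee market
    blob_versioned_hashes: list[bytes]  # one versioned hash per attached blob
    # The blobs themselves travel in a consensus-layer sidecar and are pruned
    # later; only the versioned hashes above are visible to the EVM.
```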

Improvement of security

Data availability: blobs are stored via the beacon chain, so they share the same security as Ethereum L1.

Historical data: nodes store blobs only for a limited time, and the layer 2 rollup is responsible for permanent data storage, so the security of historical data relies on the rollup.

Cost

The low cost of blob-carrying transactions can reduce overall rollup costs by a factor of roughly 10x to 50x.

Meanwhile, EIP-4844 introduces a separate blob fee:

Regular gas and blob gas have separately adjustable prices and limits;

Blob space is priced in its own gas unit, and the price floats with network traffic, aiming to keep the number of blobs each block carries near the target (8 on average).
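The adjustment mechanism is EIP-1559-style but exponential: the protocol tracks how far past blocks have exceeded the blob target and prices blob gas accordingly. The sketch below follows the `fake_exponential` helper given in the EIP (the constants are draft values and may differ in the final spec):

```python
MIN_BLOB_GASPRICE = 1
BLOB_GASPRICE_UPDATE_FRACTION = 3338477   # draft constant from the EIP

def fake_exponential(factor: int, numerator: int, denominator: int) -> int:
    """Integer-only approximation of factor * e^(numerator/denominator),
    as specified in EIP-4844 (Taylor-series accumulation)."""
    i, output = 1, 0
    numerator_accum = factor * denominator
    while numerator_accum > 0:
        output += numerator_accum
        numerator_accum = numerator_accum * numerator // (denominator * i)
        i += 1
    return output // denominator

def get_blob_gasprice(excess_blob_gas: int) -> int:
    """Price rises exponentially while blocks stay above the 8-blob target
    and decays back toward the minimum when usage falls below it."""
    return fake_exponential(MIN_BLOB_GASPRICE, excess_blob_gas,
                            BLOB_GASPRICE_UPDATE_FRACTION)
```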

Implementation of precompile

EVM execution can see only the commitment to a blob generated by the prover; it cannot access the blob data directly. Rollups therefore need precompiles to verify the validity of commitments.

Two precompiles are specified in EIP-4844:

Point evaluation precompile

Proves that a claimed evaluation of the committed data is correct; among other things, this can show that multiple commitments commit to the same set of data.

The point evaluation precompile is mainly used by zk-rollups: the rollup provides two commitments, the KZG commitment and its own zk-rollup commitment, and proves they commit to the same data.

As for optimistic rollups, most have adopted multi-round fraud proofs, and the final round involves only a small amount of data, so they can also use the point evaluation precompile for a lower cost.

Blob verification precompile

Proves that the versioned hash is valid for the corresponding blob.

Optimistic rollups need access to the full data when submitting a fraud proof, so it is rational to first verify the validity of the versioned hash and then run the fraud-proof verification.
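In outline, the point evaluation precompile's validation flow looks like the sketch below (input layout per the EIP draft: 192 bytes covering the versioned hash, evaluation point z, claimed value y, commitment, and proof; `verify_kzg_proof` stands in for the pairing check, which is elided here):

```python
import hashlib

VERSIONED_HASH_VERSION_KZG = b"\x01"

def kzg_to_versioned_hash(commitment: bytes) -> bytes:
    """versioned hash = version byte || sha256(commitment)[1:]"""
    return VERSIONED_HASH_VERSION_KZG + hashlib.sha256(commitment).digest()[1:]

def point_evaluation_precompile(data: bytes) -> None:
    """Check that `commitment` hashes to the supplied versioned hash, then that
    the committed polynomial evaluates to y at z (pairing check not shown).
    The real precompile returns FIELD_ELEMENTS_PER_BLOB and the BLS modulus
    on success."""
    assert len(data) == 192
    versioned_hash, z, y = data[:32], data[32:64], data[64:96]
    commitment, proof = data[96:144], data[144:192]
    assert kzg_to_versioned_hash(commitment) == versioned_hash
    assert verify_kzg_proof(commitment, z, y, proof)  # pairing check, elided
```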

3. Danksharding: A crucial step towards full sharding

Scaling

Thanks to proto-danksharding's new transaction type, which introduces the data blob, each block gains roughly 1 MB of extra, temporary capacity. This figure grows 16 to 32 times after the implementation of danksharding.

Data availability: High-performance data storage and verification

Compared to proto-danksharding, where nodes must store the full blob content, danksharding allows nodes to store only sampled portions of the data.

DAS

By taking advantage of erasure coding, the danksharding proposal makes it easier for nodes to detect missing data while each node downloads only part of it.
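A minimal demonstration of the erasure-coding idea behind DAS (a toy over a tiny prime field; real danksharding works over the BLS scalar field with KZG commitments and a 2D extension): extend k data chunks to 2k points on the same low-degree polynomial, and any k surviving points reconstruct everything:

```python
P = 257  # tiny prime field (toy only)

def interpolate_eval(points, x):
    """Evaluate at x the unique degree<len(points) polynomial through `points`
    (Lagrange interpolation mod P; modular inverse via Fermat's little theorem)."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num = den = 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

data = [10, 20, 30, 40]                                  # k = 4 original chunks
pts = list(enumerate(data))                              # points (0,10), (1,20), ...
extended = [interpolate_eval(pts, x) for x in range(8)]  # 2k = 8 coded chunks

# Drop any half of the chunks; the survivors still recover the original data.
survivors = [(x, extended[x]) for x in (1, 3, 4, 6)]
assert [interpolate_eval(survivors, x) for x in range(4)] == data
```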

Security: Almost the same

Since nodes are no longer required to store the full history, security is no longer backed by any single node; instead it depends on many nodes, each storing part of the data, which together can reconstruct the full data set.

Although full replication at every node is, taken in isolation, a stronger guarantee than distributed partial storage, the number of nodes in the Ethereum network is more than large enough to ensure data availability.

New challenge: the higher requirement for block builders

While validators only download and store parts of the data, the block builder still needs to upload the full content, i.e., the blobs containing all transaction data.

The diagram from Dankrad's slides shows how PBS (proposer/builder separation), originally designed for MEV mitigation, also helps reduce the bandwidth requirement during block building.
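A conceptual sketch of that bandwidth split (illustrative roles only, not consensus-spec code): the builder assembles and broadcasts the heavy payload, while the proposer only compares and signs small headers:

```python
import hashlib
from dataclasses import dataclass

@dataclass
class HeaderBid:
    payload_root: bytes   # commitment to the full block body + blobs
    bid_wei: int          # what the builder pays for the slot

def build(txs: bytes, blobs: bytes, bid_wei: int) -> tuple[HeaderBid, bytes]:
    """Builder side: handles the full multi-MB payload."""
    body = txs + blobs
    return HeaderBid(hashlib.sha256(body).digest(), bid_wei), body

def choose(bids: list[HeaderBid]) -> HeaderBid:
    """Proposer side: sees only tiny headers, never the blob data, so ordinary
    validators don't need builder-grade bandwidth."""
    return max(bids, key=lambda b: b.bid_wei)
```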

4. Another sharding scheme: dynamic state sharding from Shardeum

Shardeum is an EVM-compatible L1 blockchain that uses dynamic state sharding to improve scalability and security. Meanwhile, the Shardeum network is designed to maintain a high level of decentralization.

Dynamic state sharding

Advantages

The most intuitive benefit of dynamic state sharding is linear scaling. Each node holds a different address range, with significant overlap between the ranges covered by different nodes. The sharding algorithm groups nodes dynamically, so newly added nodes in the Shardeum network start contributing to TPS immediately (see the sketch below).
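A toy illustration of the linear-scaling intuition (our reading of the design, not Shardeum code; the overlap fraction and address-space size are made-up parameters): each node covers a contiguous address range, ranges overlap for redundancy, and every added node shrinks the per-node range:

```python
ADDRESS_SPACE = 2**16    # toy address space
OVERLAP = 0.5            # fraction of each range shared with neighbours (assumed)

def node_ranges(n_nodes: int) -> list[tuple[int, int]]:
    """Each node covers [start, end); ranges overlap so several nodes hold any
    given address. More nodes -> narrower ranges -> roughly linear scaling."""
    step = ADDRESS_SPACE // n_nodes
    width = int(step * (1 + OVERLAP))
    return [(i * step, min(i * step + width, ADDRESS_SPACE)) for i in range(n_nodes)]

print(node_ranges(4)[:2])   # 4 nodes: each covers ~24k addresses
print(node_ranges(8)[:2])   # 8 nodes: each covers ~12k addresses
```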

Implementation

Implementing dynamic state sharding is more difficult than static sharding. Shardeum's technical team has researched sharding technologies in depth, and the team's earlier R&D work (previously as Shardus) also contributes significantly, having demonstrated the linear scaling of dynamic state sharding at an early development stage.

Summary

Product

Following the idea of divide and conquer, Shardeum's dynamic state sharding splits the computation and storage workload, allowing a higher degree of parallelization. The network can therefore accommodate more nodes, which further improves throughput and decentralization.

Team

The Shardeum team has strong marketing experience and narrative ability, along with a deep understanding of the technical details, especially dynamic state sharding.

Technology

The technical team designed a sharding scheme and an efficient consensus algorithm (Proof of Stake + Proof of Quorum) suited to the scenario, putting scaling and throughput first while preserving security and decentralization as far as possible.

Progress

Betanet launched on 2023-02-02.

5. The outlook

Sharding is a long-term scaling solution for Ethereum, with huge value and profound significance for the whole network. It is worth paying close attention, as the implementation of sharding is an iterative process: all current proposals, including proto-danksharding and danksharding, can still be upgraded or altered.

While understanding the general approach to implementing sharding is important, the technical proposals that emerge along the way, such as PBS, DAS, and the multidimensional fee market, are also worth paying attention to. Many outstanding projects could grow up around those schemes.

It is important to remember that sharding is a general term describing a family of scaling technologies, with different application schemes for different scenarios. For example, the design of danksharding may only fit Ethereum and could have negative effects if applied to other L1s, because its security must be guaranteed by a very large number of nodes.

A rational combination of sharding and other scaling solutions can have a multiplying effect. The current danksharding proposal will not work alone; instead, rollups and danksharding complement each other to improve Ethereum's scalability and capacity.

Reference

https://notes.ethereum.org/@dankrad/kzg_commitments_in_proofs

https://notes.ethereum.org/@dankrad/new_sharding

https://vitalik.ca/general/2022/03/14/trustedsetup.html

https://notes.ethereum.org/@vbuterin/proto_danksharding_faq#Why-use-the-hash-of-the-KZG-instead-of-the-KZG-directly

https://ethresear.ch/t/easy-proof-of-equivalence-between-multiple-polynomial-commitment-schemes-to-the-same-data/8188

https://dankradfeist.de/ethereum/2020/06/16/kate-polynomial-commitments.html

https://eips.ethereum.org/EIPS/eip-4844

https://www.eip4844.com/

https://biquanlibai.notion.site/Data-Availability-caa896aae59d489b98f2448f17b01640

https://ethresear.ch/t/a-design-of-decentralized-zk-rollups-based-on-eip-4844/12434

About Foresight Ventures

Foresight Ventures is dedicated to backing the disruptive innovation of blockchain for the next few decades. We manage multiple funds: a VC fund, an actively-managed secondary fund, a multi-strategy FOF, and a private market secondary fund, with AUM exceeding $400 million. Foresight Ventures adheres to the belief of “Unique, Independent, Aggressive, Long-Term mindset” and provides extensive support for portfolio companies within a growing ecosystem. Our team is composed of veterans from top financial and technology companies like Sequoia Capital, CICC, Google, Bitmain, and many others.

Website: https://www.foresightventures.com/

Twitter: https://twitter.com/ForesightVen

Medium: https://foresightventures.medium.com

Substack: https://foresightventures.substack.com

Discord: https://discord.com/invite/maEG3hRdE3

Linktree: https://linktr.ee/foresightventures