
The Appchain Galaxy: Where Everyone Gets a Blockchain

Technologies and Implications of Appchains

Web3.com Ventures Original Research Analysis

0xFishylosopher

Introduction

Ethereum is great — for most people at least. But one of the key problems of Ethereum and other public-facing blockchains is that there’s not that much customizability. Sure, you can write and deploy smart contracts, but as a single user or company, you can’t really do much about the consensus mechanisms, execution engines, and other underlying architecture. After all, you aren’t the only one using Ethereum, so you don’t really get to dictate the rules.

Suppose that you are building a cross-chain decentralized exchange that needs to aggregate data from many different chains [1], or you want an on-chain real-time-strategy game [2], or you’re a Fortune 500 company tracking your supply chain. Ready-made public blockchains can’t really handle the throughput, data aggregation, or privacy requirements that these applications need. This is where the concept of an “appchain” comes in. Appchains are decentralized, application-specific blockchain networks that are designed to support specific use cases and meet specific requirements. Within this essay, I will explore three exemplary approaches to the emerging appchain landscape: the Cosmos ecosystem, Hyperledger Fabric, and Rollups-as-a-Service (RaaS), before discussing how appchains’ rise affects the Web3 infrastructure landscape at large.

Cosmos Network

The Cosmos network is perhaps one of the most important appchain solutions within Web3. Stretching all the way back to 2016, the Cosmos network aims to create an “internet of blockchains,” or the infrastructure of a multi-chain world. Essentially, Cosmos provides a software development kit (SDK) that gives developers the tools and interfaces they need to easily build their own blockchain applications. While the SDK ships with some default architectures out of the box, such as the Tendermint PoS consensus protocol, developers are given wide leeway to override these defaults and design a blockchain complete with its own custom consensus mechanism, token standards, and smart contract platforms.
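To make the “swap out the defaults” idea concrete, here is a minimal, illustrative Python sketch of a chain composed from a pluggable consensus engine. This is not actual Cosmos SDK code (which is written in Go); the class names and the toy two-thirds voting rule are invented purely for illustration:

```python
from abc import ABC, abstractmethod

class ConsensusEngine(ABC):
    """Pluggable consensus interface: the appchain builder can swap
    this out, analogous to replacing the SDK's default engine."""
    @abstractmethod
    def finalize(self, block: list[str]) -> bool: ...

class SimpleBFT(ConsensusEngine):
    """Toy stand-in for a BFT engine: a block is final when more than
    2/3 of validators vote yes (all are assumed honest here)."""
    def __init__(self, validators: int):
        self.validators = validators

    def finalize(self, block: list[str]) -> bool:
        yes_votes = self.validators  # every validator votes yes in this toy
        return yes_votes * 3 > self.validators * 2

class AppChain:
    """An app-specific chain assembled from interchangeable parts."""
    def __init__(self, consensus: ConsensusEngine):
        self.consensus = consensus
        self.blocks: list[list[str]] = []

    def propose(self, txs: list[str]) -> bool:
        if self.consensus.finalize(txs):
            self.blocks.append(txs)
            return True
        return False

chain = AppChain(SimpleBFT(validators=4))
assert chain.propose(["alice->bob:10"])
```

The point of the sketch is the shape of the design: the application logic (`AppChain`) is written once, while the consensus layer is an interface any implementation can fill.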

Within the Cosmos Network, there are two distinct types of blockchains: “hubs” and “zones” [3]. “Zones” are fairly straightforward to understand — these are the application-specific chains that run the application itself (such as a decentralized exchange or on-chain game). “Hubs,” on the other hand, connect these “zones” and other “hubs” together to enable inter-chain communication through the Inter-Blockchain Communication Protocol (IBC).
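The hub-and-zone relationship can be sketched with a toy model. This is illustrative Python only; the real IBC protocol involves light clients, commitment proofs, and timeouts, all of which are omitted here:

```python
class Zone:
    """An application-specific chain; it only knows its own inbox."""
    def __init__(self, name: str):
        self.name = name
        self.inbox: list[dict] = []

    def receive(self, packet: dict) -> None:
        self.inbox.append(packet)

class Hub:
    """Toy hub: relays packets between registered zones, loosely
    mirroring IBC's hub-and-zone routing."""
    def __init__(self):
        self.zones: dict[str, Zone] = {}

    def register(self, zone: Zone) -> None:
        self.zones[zone.name] = zone

    def relay(self, src: str, dst: str, payload: dict) -> dict:
        packet = {"src": src, "dst": dst, "payload": payload}
        self.zones[dst].receive(packet)
        return packet

hub = Hub()
dex, game = Zone("dex"), Zone("game")
hub.register(dex)
hub.register(game)
hub.relay("dex", "game", {"price": "ATOM/USD=12.3"})
```

Note that neither zone needs to know about the other directly; each only maintains a connection to the hub, which is exactly what makes the topology scale.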

At the center of the Cosmos network is the “Cosmos Hub,” which is the first “Hub” chain to be built on the Cosmos network, and essentially acts as a “Grand Central Station” for all other hubs and zones. Nonetheless, as shown in the diagram above, “Hubs” don’t necessarily need to connect to the Cosmos Hub. A private company, for example, could choose to launch its own private hub that connects its own private zones.

Indeed, the Cosmos SDK has a focus on customizability and interoperability, and developers are given a wide latitude to let their imagination run free. Within the Cosmos ecosystem are some major general-purpose L1 chains, such as the EVM-Compatible BNB Chain (originally Binance Smart Chain) [4]. For an appchain builder, the interoperability of the Cosmos network is incredibly enabling, as it allows the appchain to access data from a large variety of chains through the IBC. This is one of the reasons why many appchains built using the Cosmos network are oracles, multi-chain DEXs, and other services that require the aggregation of data from a wide variety of different chains.

Hyperledger Fabric

Another way to implement appchains is through Hyperledger Fabric, an open-source blockchain platform for building enterprise-grade applications and solutions. It is one of the projects under the Hyperledger umbrella, which is hosted by the Linux Foundation [5].

Like the Cosmos SDK, Hyperledger Fabric is designed to support a modular blockchain network architecture and provide flexibility in building and deploying blockchain-based applications. It provides a plug-and-play architecture that allows components such as consensus, membership services, and smart contract execution (chaincode) to be easily added or replaced. But the most important difference between the Cosmos SDK and Hyperledger Fabric is one of audience: the Cosmos SDK is oriented towards Web3-native users and developers, whereas Hyperledger Fabric targets enterprise-grade institutional clients. Thus, while the Cosmos SDK focuses on building a decentralized, interoperable system, Hyperledger Fabric focuses on building private and permissioned networks, where participants must be explicitly authorized to join and participate in the network.

Indeed, enterprise-grade appchains operate in a vastly different way from Web3-native appchains. Whereas Web3-native appchains often focus on tokenomics and use the blockchain structure to financialize their products, enterprise-grade appchains focus on using the blockchain as an efficient way to record and maintain data. Fundamentally, these enterprise-grade appchains leverage the fact that blockchains are tamper-proof, verifiable, append-only data structures whose data-writers can be spread out across time and geographic locations. Essentially, a Hyperledger Fabric-based appchain simply seeks to replace the dusty, inefficient databases enterprises have maintained for decades.
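The tamper-evident, append-only property these enterprise chains rely on can be illustrated with a minimal hash-chained log. This is a deliberate simplification: Fabric itself layers endorsement policies, channels, and an ordering service on top of this basic idea.

```python
import hashlib
import json

def _hash(record: dict) -> str:
    """Deterministic SHA-256 of a record (stable key order)."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class AppendOnlyLedger:
    """Tamper-evident log: each entry commits to the hash of the
    previous one, so editing any past entry breaks the chain."""
    def __init__(self):
        self.entries: list[dict] = []

    def append(self, data: dict) -> None:
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {"data": data, "prev": prev}
        entry["hash"] = _hash({"data": data, "prev": prev})
        self.entries.append(entry)

    def verify(self) -> bool:
        prev = "0" * 64
        for e in self.entries:
            if e["prev"] != prev or e["hash"] != _hash({"data": e["data"], "prev": e["prev"]}):
                return False
            prev = e["hash"]
        return True

ledger = AppendOnlyLedger()
ledger.append({"item": "mango-42", "event": "arrived at warehouse"})
ledger.append({"item": "mango-42", "event": "shipped to store"})
assert ledger.verify()

ledger.entries[0]["data"]["event"] = "rewritten history"  # tampering...
assert not ledger.verify()                                # ...is detected
```

Any writer, anywhere, can append; no one can silently rewrite, which is precisely the guarantee a shared supply-chain record needs.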

As shown in the diagram above, essentially all of the “permissioned nodes” on the Hyperledger Fabric blockchain are employees in various parts of the supply chain, using the blockchain as a way of synchronously recording different data states — such as whether a piece of produce has entered a factory or a store. There is far less of a focus on interconnectivity between different blockchains, and unlike Cosmos, Hyperledger Fabric has no native support for inter-blockchain connectivity. Instead, there are data APIs and interfaces for integrating with other blockchain networks or systems. This means that it is possible to build inter-blockchain communication solutions using Hyperledger Fabric, but it requires custom development and integration, unlike the built-in hub-and-zone design of the Cosmos ecosystem. Intuitively, this makes sense: as enterprise-grade appchains, blockchains built using Hyperledger Fabric are usually siloed away from the outside world — after all, you don’t want the public messing with your supply chain data.

One case study of Hyperledger Fabric in action is in managing Walmart’s supply chain. Walmart’s appchain, known as the “Walmart Food Traceability Platform,” is a blockchain-based food traceability system designed to improve food safety and transparency in the supply chain [6]. The Walmart appchain allows for the tracking and tracing of food products from farm to store shelves, helping to quickly identify and isolate any potential food safety issues. At every stage in the process, suppliers upload labels and certificates of authenticity through a web interface onto the blockchain, where they are permanently accessible to all of the relevant stakeholders. With the blockchain serving as a single “ground truth” of information, this greatly reduces the time to record and track food quality issues, such as food-borne diseases:

The Hyperledger Fabric blockchain-based food traceability system built for the two products worked. For pork in China, it allowed uploading certificates of authenticity to the blockchain, bringing more trust to a system where that used to be a serious issue. And for mangoes in the US, the time needed to trace their provenance went from 7 days to 2.2 seconds! — Hyperledger Foundation [6]

Thus, Hyperledger showcases the use of appchains in an entirely different setting from Cosmos — in large, enterprise-grade use cases where the blockchain serves as an alternative to traditional data-recording and tracing methods. Sure, one can argue that this use case of appchains is “not really Web3,” simply because it doesn’t leverage tokenomics, decentralization, and crypto-native principles in the same way. But the enterprise adoption of appchains cannot be ignored — the underlying technology is the same, and it is a vital step toward gaining public trust in blockchain technologies and enabling mass adoption. And in the long run, if companies like Walmart have an appchain setup, it is far easier to integrate them into the “Web3-native” world.

Rollups-as-a-Service

A more recent approach to building appchains is through the use of rollups, especially through “rollups-as-a-service” (RaaS). Rollups are a blockchain scaling solution that allows for off-chain computation and storage while maintaining the security and trust of the underlying blockchain. They work by batching a large number of transactions into a single, compressed transaction that is then recorded on an underlying L1 blockchain, most commonly Ethereum; RaaS providers package this machinery so that a team can deploy its own rollup as an appchain.
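The batch-and-commit mechanic can be sketched in a few lines. This is a toy model: real rollups post calldata or data blobs to the L1 together with fraud proofs (optimistic rollups) or validity proofs (ZK rollups), none of which appear here.

```python
import hashlib
import json
import zlib

def build_rollup_batch(txs: list[dict]) -> dict:
    """Compress many L2 transactions into one blob plus a hash
    commitment -- the shape of what a sequencer posts to the L1."""
    blob = zlib.compress(json.dumps(txs).encode())
    commitment = hashlib.sha256(blob).hexdigest()
    return {"blob": blob, "commitment": commitment, "count": len(txs)}

def verify_batch(batch: dict) -> list[dict]:
    """Anyone holding the blob can check it against the on-chain
    commitment and recover every individual transaction."""
    assert hashlib.sha256(batch["blob"]).hexdigest() == batch["commitment"]
    return json.loads(zlib.decompress(batch["blob"]))

# 1000 L2 transactions collapse into a single L1-sized artifact.
txs = [{"from": f"user{i}", "to": "dex", "amount": i} for i in range(1000)]
batch = build_rollup_batch(txs)
assert verify_batch(batch) == txs
assert len(batch["blob"]) < len(json.dumps(txs))  # compression saves L1 space
```

The economics follow directly from the sketch: a thousand users share the cost of one L1 record instead of paying for a thousand.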

Rollups can be used for appchains because they provide a way to scale the processing and storage of transactions for dApps built on these chains, without sacrificing the security and trust of the underlying blockchain. While the developer may sacrifice some customizability in the underlying consensus layer compared with a Cosmos-based solution, there are several advantages of using a rollup-as-a-service for an application-specific blockchain instead of a full chain.

Perhaps the most important advantage is that there is no need for developers to bootstrap a validator set. Because the consensus and settlement are outsourced to an underlying L1, this results in not only a cheaper appchain solution, but also one that is more robust, as it is secured through the underlying L1. Moreover, because rollups are vertical solutions that are modular in structure, they are not constrained to a single blockchain framework, and can leverage data and functionality on multiple blockchains.

Essentially, in the rollup appchain model, L1 chains become the equivalent of “Hubs” that hold a wealth of data and security. A gaming appchain, for example, could use a rollup-based solution where Ethereum’s security is leveraged for consensus and settlement, and Solana’s high throughput is leveraged for execution speed [7].

Rollups-as-a-service companies provide the infrastructure and services necessary for appchain developers to implement their custom appchain rollup solutions. Different RaaS providers abstract this toolkit to different degrees: some projects, such as the OP Stack [8], are SDK-based, allowing the developer to customize the rollup fairly extensively, while other projects, such as Constellation, focus on a complete no-code “white-glove” deployment solution, allowing appchain developers to focus on the “app” part of development rather than the “chain” part [9].

The Implications of Appchains

The emergence and rise of these appchain solutions have several profound implications for the Web3 infrastructure landscape at large. Appchains usher in a paradigm shift in the relationship between “chains” and “apps”: from a supply-driven model to a demand-driven model.

Before, when infrastructure solutions were still maturing, and many modern infrastructure concepts such as Proof of Stake, sharding, and rollups were still in the proof of concept stage, infrastructure projects were often implemented for the sake of exploring the feasibility of these technical solutions. This is most typically the case in “third-generation blockchains” such as Avalanche, Cardano, NEAR, and Solana. The supply of these chains (their throughput and technical implementations) drove the demand for these chains. In other words, the design of the “chains” determined the design of the “apps.”

However, the rise of appchains marks a distinct fourth generation of blockchains, where demand for blockchains drives their supply. All of the technical routes presented above (the Cosmos network, Hyperledger Fabric, and Rollups-as-a-Service) offer plug-and-play blockchain deployment solutions that are simple, customizable, and cost-effective. It has never been easier to design, customize, and deploy your own blockchain. Because of this, the relationship is now the other way round: the design of the “apps” determines the design of the “chains.”

One likely result is a more concentrated L1 scene: a shift from monopolistic competition (where different L1s are slightly differentiated from one another) to an oligopoly dominated by a few key chains with enormous ecosystems and liquidity. This is because, with the advent of appchains, there is less reason for top-level dApps to deploy on smaller L1s rather than run their own chain (with infinitely customizable features) or deploy an appchain rollup on a large, established L1. Smaller L1 chains simply cannot compete with the combination of large L1s and appchains in terms of flexibility, customization, security, and support.

Furthermore, appchains are also a key mechanism for onboarding institutions to Web3. As mentioned before, though enterprise chains such as Hyperledger Fabric are usually excluded from our typical idea of Web3, they leverage the same underlying blockchain technologies to achieve greater transparency in logistics and supply management. So why can’t this also be extended into payments and beyond? Thus, I believe it is only a matter of time before these enterprise-grade blockchains become more integrated with native Web3 DeFi, payment, and NFT projects. Already, there is some exploration into the feasibility of expanding the Cosmos IBC beyond just Cosmos appchains, connecting platforms such as Hyperledger Fabric and targeting other enterprise-grade blockchain solutions [10].

Conclusion

Nowadays, it is clear that there never was, and never will be, a one-size-fits-all blockchain. Whether through the Cosmos SDK, Hyperledger Fabric, or Rollups-as-a-Service, the maturity and ease of deployment of custom appchains will transform the Web3 infrastructure space into an app-first, user-first experience — a sign of growing maturity in Web3. The future of the Web3 space is a multichain galaxy, and given their customizability, effectiveness, and ease of deployment, appchains will play an increasingly important role in this universe, shining like stars in a galaxy of decentralization.

🐩 @0xfishylosopher

📅 6 February 2023

References

[1] Osmosis, a cross-chain DEX: https://osmosis.zone/

[2] Curio.gg, on-chain gaming company: https://www.curio.gg/

[3] https://newsletter.banklesshq.com/p/a-beginners-guide-to-cosmos

[4] See BNB Chain Whitepaper: https://github.com/bnb-chain/whitepaper/blob/master/WHITEPAPER.md

[5] https://www.ibm.com/products/blockchain-platform-hyperledger-fabric

[6] Walmart case study: https://www.hyperledger.org/learn/publications/walmart-case-study

[7] Example from https://messari.vercel.app/article/the-rollups-as-a-service-ecosystem

[8] See https://www.optimism.io/op-stack

[9] See https://constellation-labs.gitbook.io/constellation-labs-documentation/getting-started/overview

[10] https://medium.com/the-interchain-foundation/expanding-the-interchain-taking-ibc-beyond-cosmos-appchains-b0f9bebb4405

Disclaimer: the information presented above is purely educational, does not constitute financial advice, and represents the views of the author alone.

Many thanks to my friends at Constellation Labs for their insightful conversations on Rollups-as-a-Service.

ChatGPT for Web3

Catalyzing Education, Inspiration, and Iteration

Web3.com Ventures Original Research Analysis

0xFishylosopher

Introduction

Like many others, I was blown away by ChatGPT’s capabilities. Though I had played with GPT-3’s playground API and written about newer image models such as DALL-E 2, I was not prepared for the breadth, depth, and fluidity of ChatGPT’s responses. From sonnets to Solidity, philosophy to physics, the quality of the content was simply stunning.

Though still in its infancy, ChatGPT is a potentially disruptive force for every industry, including Web3. In this essay, I will outline some of the ways I believe ChatGPT will accelerate the industry’s evolution, catalyzing the processes of education, inspiration, and iteration in Web3.

Education

First, AI systems like ChatGPT are incredibly useful for educating users and helping them navigate the vast world of Web3. As an industry that melds cutting-edge technology with opaque financial models, Web3 has a particularly pronounced education problem. To make a single transaction on a decentralized exchange (DEX) such as Uniswap v3, a user has to know about wallets, seed phrases, concentrated liquidity, liquidity pools, and more. All this makes onboarding hellish for a new user, and it is currently a major bottleneck for Web3’s mass adoption [1].

Of course, one can argue that there are countless resources online — blogs, videos, courses, games — that aim at onboarding users into the industry. But there is one massive bootstrapping problem to all this: where do you start? Googling “What is Web3” gives you a bunch of ads, followed by a Wikipedia page containing jargon [2]:

Web3 (also known as Web 3.0) is an idea for a new iteration of the World Wide Web which incorporates concepts such as decentralization, blockchain technologies, and token-based economics. — Wikipedia

What are “blockchain technologies” and “token-based economics,” and why do we need a “new iteration of the World Wide Web”? For a complete beginner, none of this makes any sense. You need to make a bunch of additional queries and click through a bunch of different links before you find something you understand. Fundamentally, this is because a search engine like Google is an aggregator of information and an indexer of content, rather than a producer of content. Simply put, Google is a digital librarian. It doesn’t know exactly how to answer your question, and can only point you to resources it thinks will help. If the content that Google curates doesn’t match your needs, then tough luck. Even all the personalized information Google gathers about you doesn’t help with this: all it does is point you to better resources that it thinks may help you more [3].

ChatGPT, on the other hand, is paradigmatically different: it is an original producer of content, rather than a simple indexer of content. Rather than act as a librarian that points you to resources, ChatGPT is more like a mentor or a teacher that actually tries to teach you what everything means, presented in the way that you tell it to: through giving you analogies, through directing you to appropriate blogs or resources, or through writing a sonnet about it [4]. As an original content creator, ChatGPT does not rely on a fixed corpus of already-made responses, but rather makes a new one for your specific circumstances. This is why ChatGPT can act as an effective one-stop shop for absolute beginners.

Of course, education goes beyond absolute beginners; ChatGPT is a useful tool for anyone wanting to understand a novel corner of Web3, be it zk-SNARKs or inter-blockchain communication protocols. As Web3 is both a diverse and technically challenging industry, there are always new facets, ideas, and concepts to wrap your head around. And that is why a sophisticated educational tool such as ChatGPT is sorely needed in Web3.

Inspiration

ChatGPT not only can teach you the principles of the technology; it can also give you ideas on how to apply the technology in practice. Essentially, ChatGPT can also act as a “fresh set of eyes” that can snowball innovation. Here’s an example:

Q: What are some examples of dApps that ChatGPT can enable or enhance

A: Some examples of dApps that ChatGPT could enable or enhance include:

Social networking dApps — ChatGPT could be used to generate natural language responses to user posts, creating a more engaging and lifelike experience for users.

Gaming dApps — ChatGPT could be used to generate human-like responses from in-game characters, creating a more immersive gaming experience.

Supply chain management dApps — ChatGPT could be used to automatically generate smart contracts for tracking and managing the flow of goods and services within a supply chain.

Voting dApps — ChatGPT could be used to automatically generate natural language summaries of voting results and other data from the blockchain, making it easier for users to understand and track the results of elections and other voting processes.

Predictive analytics dApps — ChatGPT could be used to generate natural language predictions and insights based on data from the blockchain, providing users with more accessible and intuitive ways to analyze and understand complex data. — OpenAI ChatGPT

Some of the dApp ideas presented here are ones that I have thought about before, namely (1) social media chatbots and (3) smart-contract generation. Though (2) obviously seems to make sense in hindsight, I actually did not think of this right off the top of my head. (4) and (5) are further applications of ChatGPT in the realm of analytics that I completely missed.

The more important point is that ChatGPT has the ability to inspire: it gives you ideas on what type of projects you can create using a specific technology in a specific field. Because it is trained on essentially the entire Internet, it can draw parallels and inferences between disparate fields to create a combinatoric explosion in the space of possible ideas. Just imagine the power that ChatGPT can bring to a hackathon team ready to get their hands dirty after learning a specific tech stack, or to the founder who has only the vaguest idea of what she wants to solve.

While ChatGPT’s power of inspiration is game-changing in any field or industry, it is particularly powerful for potential builders in Web3. Fundamentally, Web3 is still a nascent field, one that requires a torrent of new ideas, new models, and new concepts in order to push it toward maturity and mass adoption. At this early stage, because there is no set formula for success, every experiment and idea counts in this industry. Because ChatGPT is able to give you original content, it is also a far better source of inspiration than “a list of 10 project ideas that you can do to improve your Python” online, which are often cookie-cutter and mundane. Web3 is also unique in that the field offers enormous financial incentives, such as grants and hackathon rewards, for such experimental innovation [5]. Thus, ChatGPT can potentially be an enormous catalyst in allowing people to discover and explore new project directions, providing a guiding source of inspiration that kickstarts the flywheel of disruptive innovation.

Iteration

Nonetheless, arguably one of ChatGPT’s most impressive capabilities is its ability to be involved in the iterative process of creation itself. Basically, ChatGPT not only educates and inspires, but also allows you to concretely instantiate your idea in code in two important ways: as an original creator of code, and as a sanity check for security vulnerabilities.

Here’s an example of the first case:

Q: Write a smart contract that creates a ERC-20 token called GPToken with symbol GPT using the OpenZeppelin with a disable_mint function that is triggered when 1 million tokens are minted

A:

While this smart contract leaves room for style and optimization improvements, such as removing the use of decimals (since the number of tokens will always be an integer) and adding a check that disables minting once the number of tokens reaches 1 million, it gets the basic gist of the smart contract right and acts as a great boilerplate starting point. Given that a large portion of development time is dedicated to building such boilerplate templates, ChatGPT can drastically increase the pace of development. Its capabilities are especially impressive when we consider that its output can be customized to specific use cases (such as the 1-million-token mint cap here) that traditional online how-tos and coding guides cannot cover. In the act of creation, ChatGPT therefore greatly reduces the developer’s equivalent of “writer’s block,” where you have a vague idea of how the code might look but don’t know where to start.
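ChatGPT’s actual Solidity output is not reproduced above, but the behavior the prompt asks for can be sketched independently. Below is a minimal, hypothetical Python model of the cap-and-disable minting logic (the class and method names are my own illustration, not ChatGPT’s output or OpenZeppelin’s API):

```python
# Illustrative sketch only: the "mint until 1 million, then disable"
# behavior described above, modeled in plain Python rather than Solidity.
class GPToken:
    MAX_SUPPLY = 1_000_000  # minting is disabled once this cap is reached

    def __init__(self):
        self.total_supply = 0
        self.balances = {}
        self.mint_disabled = False

    def mint(self, to: str, amount: int) -> None:
        if self.mint_disabled:
            raise RuntimeError("minting has been disabled")
        if self.total_supply + amount > self.MAX_SUPPLY:
            raise ValueError("mint would exceed the 1,000,000-token cap")
        self.balances[to] = self.balances.get(to, 0) + amount
        self.total_supply += amount
        # Equivalent of the disable_mint trigger: once exactly 1 million
        # tokens exist, no further minting is possible.
        if self.total_supply == self.MAX_SUPPLY:
            self.mint_disabled = True
```

In a real Solidity contract the same effect would come from a supply check inside the mint function; the point here is only how little logic the “customization” actually requires.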

The second use case for ChatGPT is detecting vulnerabilities in smart contract code. Even though ChatGPT claims that it does not specifically audit smart contracts, when you feed it a vulnerable contract, it is able to provide decent feedback on the contract’s potential vulnerabilities.

As with ChatGPT’s creation of code, the system’s raw output may not be perfect, and it may misidentify some errors in the process. Therefore, while it is not reliable as a complete replacement for a human auditor, it represents a passable first pass that catches obvious vulnerabilities [6].
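To make the idea of a “first pass” concrete, here is a deliberately naive sketch (my own illustration, not how ChatGPT works internally): a Python scan for a few well-known Solidity red flags. Anything it flags still needs a human auditor, which mirrors the division of labor described above.

```python
# Toy first-checker: flags a few well-known Solidity red flags with a
# naive substring scan. Purely illustrative of the "obvious checks first,
# human audit after" idea; nothing like a real static analyzer.
KNOWN_RED_FLAGS = {
    "tx.origin": "tx.origin used for authorization (phishable)",
    "delegatecall": "delegatecall can execute untrusted code in this context",
    "block.timestamp": "timestamp dependence can be miner-influenced",
}

def first_check(solidity_source: str) -> list:
    """Return human-readable warnings for obvious red flags."""
    return [
        warning
        for pattern, warning in KNOWN_RED_FLAGS.items()
        if pattern in solidity_source
    ]
```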

Conclusion

To what extent can ChatGPT truly automate the development process for Web3? I believe that Vitalik summarizes this fairly well:

At this point, AI is quite far from being a substitute for human programmers. [...] That said, it did introduce me to some coding patterns I had not seen before, and it wrote the base converter faster than I would have on my own. [...] That said, AI is improving quickly and I expect it to keep improving further and ironing out bugs like this over time. [7]

The bottom line for an advanced AI like ChatGPT is that it accelerates the learning, thinking, and coding phases of Web3 innovation by reducing the “dead time” spent figuring out how to go from zero to one and repeatedly writing boilerplate code. While in the future, especially with domain-specific fine-tuning, we may see much more sophisticated and accurate code automators, there is still no way to take humans out of the process of creation.

🐩 @0xfishylosopher

📅 28 December 2022

References

[1] For more information on the onboarding problem, see: https://moralis.io/how-to-boost-web3-user-onboarding-success-rates/

[2] https://en.wikipedia.org/wiki/Web3

[3] https://www.nytimes.com/2022/12/21/technology/ai-chatgpt-google-search.html

[4] https://www.cnbc.com/2022/12/15/google-vs-chatgpt-what-happened-when-i-swapped-services-for-a-day.html

[5] https://www.decentralised.co/p/grants-and-web3

[6] https://beosin.com/resources/can-chatgpt-the-"most-powerful-ai"-detect-vulnerabilities

[7] https://vitalik.ca/general/2022/12/06/gpt3.html

zk-ASM: Powering Rollups and More

How zk-ASM may deliver a secure and trustless internet

Web3.com Ventures Original Research Analysis

0xFishylosopher

Note: This article is a fairly technically-dense piece, and assumes basic conceptual familiarity with zk-Proofs and/or zk-Rollups. A more general introduction to these principles can be found here.

Introduction

Zero-Knowledge Proofs, and in particular zk-SNARKs (Succinct Non-interactive Arguments of Knowledge), are perhaps among the most important technologies at the frontier of Web 3. While most of the media and investment attention in this sub-field has gone towards zk-Rollups, scaling solutions that provide orders of magnitude more throughput to L1 blockchains such as Ethereum, this is by no means the only application of zk-SNARKs. In this essay, I will analyze in depth the concept of Zero-Knowledge Assembly code (zk-ASM), evaluating its use cases in zk-Rollups and beyond, and exploring its theoretical potential to reinvent the Internet as we know it.

Technical Principles

zk-ASM, as its name suggests, contains two main technical parts: ZK and ASM. The ZK part refers to zk-SNARKs, or Succinct Non-Interactive Arguments of Knowledge, while the ASM part refers to Assembly code. To understand the potential of zk-ASM, we must first understand the theoretical foundations of both these seemingly arcane concepts.

zk-SNARKs

zk-SNARKs are the crown jewels of zk-Proofs: a succinct proof that a certain statement is true, where the proof reveals nothing about the data being proved. For example, consider someone asserting the statement "I know an m such that C(m) = 0", where m is a gigabyte-long message and C is a function. A zk-SNARK would be a very short proof (far smaller than 1 GB) that can be quickly verified and that reveals nothing about m beyond publicly available information [1].

So what is this "C(m)"? How is it useful? This function is actually an arithmetic circuit, or a Directed Acyclic Graph (DAG) representation of a specific function that we want to carry out, as the diagram shows [2]. The "m" is essentially the input data to the circuit, and individual "nodes" in the circuit are logic gates or arithmetic operations. For example, a "+" node may take "2" and "3" as inputs and pass a "5" on to the next operator. In this way, an arbitrary arithmetic or logical operation can be encoded in an "arithmetic circuit."
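As an illustration (a toy model, not any particular SNARK library's API), such a circuit can be represented as a DAG of operation nodes over a finite field:

```python
# Toy arithmetic circuit: a DAG whose internal nodes are field operations
# and whose leaves are named input variables. Purely illustrative; real
# SNARK circuits work over a large, SNARK-friendly prime field.

P = 2**31 - 1  # small prime modulus standing in for the field

class Node:
    def __init__(self, op, inputs):
        self.op, self.inputs = op, inputs

    def eval(self, assignment):
        # Leaves are variable names looked up in the assignment dict.
        vals = [n.eval(assignment) if isinstance(n, Node) else assignment[n]
                for n in self.inputs]
        if self.op == "add":
            return sum(vals) % P
        if self.op == "mul":
            out = 1
            for v in vals:
                out = (out * v) % P
            return out
        raise ValueError(f"unknown op: {self.op}")

# Circuit for C(x, y) = (x + y) * x
circuit = Node("mul", [Node("add", ["x", "y"]), "x"])
print(circuit.eval({"x": 2, "y": 3}))  # (2 + 3) * 2 = 10
```

A prover would then argue knowledge of an assignment that makes such a circuit evaluate to a target value, without revealing the assignment itself.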

Once we have this arithmetic circuit as a representation of the code we want to run a zk-SNARK on, we can begin building the zk-SNARK. Fundamentally, a zk-SNARK is possible because of the "fundamental theorem of algebra," which implies that a nonzero polynomial of degree "d" has at most "d" roots [3]. The mathematical trick has two steps: (1) somehow convert the function "C(m)" that we want to prove into a polynomial, and commit to that polynomial, and (2) use this root-counting fact to interact with the polynomial and produce a succinct proof. In technical jargon, the first part is called a "Polynomial Commitment Scheme" (PCS), and the second part a "Polynomial Interactive Oracle Proof" (PIOP) [4].

While the specific implementations of a PCS and PIOP are beyond the scope of this article, thus far we have derived a rough sketch for the core steps of a zk-SNARK:

Take a function of choice (a code function, math equation, etc.) that you want to run a zk-SNARK on

Encode this function as an arithmetic circuit C(m)

Run a PCS to get a polynomial representation of this arithmetic circuit

Run a PIOP to get a succinct proof, logarithmic in the size of the original "m"

And voilà, we have a custom-built zk-SNARK that can prove that someone knows a given message without revealing what that message is.
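The soundness of the final step rests on the root-counting fact above: two distinct polynomials of degree at most d can agree on at most d points, so checking equality at a few random points is overwhelmingly convincing. A toy Python illustration of this random-evaluation check (a sketch of the idea, not a real PIOP):

```python
import random

P = 2**61 - 1  # prime modulus (illustrative choice)

def poly_eval(coeffs, x):
    """Evaluate a polynomial given low-to-high coefficients, mod P (Horner)."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % P
    return acc

def probably_equal(f, g, trials=10):
    """If f != g, a random point distinguishes them with probability
    >= 1 - d/P per trial, so a few trials make false acceptance
    astronomically unlikely."""
    for _ in range(trials):
        r = random.randrange(P)
        if poly_eval(f, r) != poly_eval(g, r):
            return False
    return True

f = [1, 2, 3]     # 1 + 2x + 3x^2
g = [1, 2, 3, 0]  # same polynomial with a zero leading coefficient
h = [1, 2, 4]     # a different polynomial
print(probably_equal(f, g))  # True
print(probably_equal(f, h))  # False (with overwhelming probability)
```

Real SNARKs replace the direct evaluations with committed polynomials, so the verifier never sees the polynomial itself.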

Assembly Code

The second piece of the zk-ASM puzzle is Assembly code. Assembly languages are a class of very low-level languages whose instructions are easy for a machine to read but fairly difficult for a human to decipher. Unlike higher-level languages, such as Python, Java, or even C, Assembly languages contain only primitive operations, such as move, compare, add, and jump, acting on a set of processor registers and hard-coded memory locations. For example, the Python code to print the numbers 1 to 9 on the screen is just a short loop.
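A minimal version of that loop:

```python
# Print the numbers 1 through 9, one per line.
for i in range(1, 10):
    print(i)
```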

Pretty easy to understand, right? The equivalent x86 Assembly program [5], by contrast, runs to dozens of lines of register moves, interrupts, and jumps.

A lot nastier, particularly for such a simple operation. So why even use Assembly language? As stated above, while these instructions may not be easy for a human to read, they are very easy to "assemble" into the raw byte-code that a machine reads and executes (the program that does this is called an assembler) [6]. Comparatively speaking, higher-level languages such as Python and Java are far more human-friendly to read, but programs written in them cannot be directly executed by a processor. Instead, we rely on a "compiler" that chews on the Python or Java code we write and spits out a dump of assembly code like the one above, to be assembled and executed by the machine. We can expect the same piece of Python or Java to run smoothly across different processors and operating systems because the compiler does the heavy lifting, translating our source code into the Assembly language specific to that processor or operating system.
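This compile-to-low-level pipeline is visible in Python itself: CPython compiles source into bytecode, and the standard-library `dis` module disassembles it into an opcode listing much like an assembly dump:

```python
import dis

def count_to_nine():
    for i in range(1, 10):
        print(i)

# Print the low-level bytecode instructions CPython actually executes:
# a loop of FOR_ITER / CALL-style opcodes, analogous to an assembly listing.
dis.dis(count_to_nine)
```

Running this shows that even a two-line loop expands into many primitive instructions, which is exactly the expansion a zk-ASM must arithmetize.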

Because all languages compile down to assembly code (which is itself assembled into executable binary), assembly is essentially the "mother of all languages." Now suppose that we are able to turn all of the operands in an Assembly language (such as x86 or RISC-V) into an arithmetic circuit representation, such that we can provide zk-SNARK proofs of all the operands in that Assembly language. This means that we are theoretically capable of providing a zk-SNARK of any program written in an arbitrary high-level language (such as Python or Java) that compiles down to this Assembly language. And that is why zk-ASM is worth thinking about.
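To make "Assembly operands as arithmetic circuits" concrete, here is a toy Python sketch (not any real proving system) in which each primitive instruction becomes an algebraic constraint that an execution trace must satisfy:

```python
# Toy constraint system: each assembly-level operation is modeled as a
# polynomial constraint over a field. Purely illustrative.

P = 2**31 - 1  # illustrative prime field

def add_constraint(a, b, c):
    """Constraint for an ADD instruction: a + b - c == 0 (mod P)."""
    return (a + b - c) % P == 0

def mul_constraint(a, b, c):
    """Constraint for a MUL instruction: a * b - c == 0 (mod P)."""
    return (a * b - c) % P == 0

# An execution trace is a list of (constraint, operands) rows.
trace = [
    (add_constraint, (2, 3, 5)),   # claims 2 + 3 = 5
    (mul_constraint, (5, 4, 20)),  # claims 5 * 4 = 20
]

# A zk-SNARK would prove all rows hold without revealing the operands;
# here we simply check them directly.
print(all(check(*row) for check, row in trace))  # True
```

A zk-ASM is, in essence, such a constraint set covering every operand of the target Assembly language.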

Practical Applications

zk-EVM Rollups: Polygon zk-ASM

One of the most important applications of zk-ASM is in creating Ethereum Virtual Machine-compatible zk-Rollups, or zk-EVMs. A zk-EVM is incredibly important to blockchain scalability because it allows programmers to deploy on a zk-Rollup-based L2 chain without modifying much (if any) of their code [7]. In this field, Polygon's zk-EVM is an exemplary case study of how zk-ASM may be used to achieve this goal.

When programmers develop on the Ethereum L1 blockchain, they usually code in Solidity, a high-level language akin to C. This Solidity code is compiled into a series of EVM opcodes, such as ADD, SLOAD, and EQ, before being executed on the L1 blockchain [8]. By default, this process does not create any sort of zk-Proof. Polygon's trick is to interpret each of the EVM opcodes into its custom-written zk-ASM, which is very zk-SNARK-friendly. Its L2 zk-EVM then executes the zk-ASM while also building a zk-SNARK circuit of the execution in order to create a proof [9]. The ADD opcode, for example, is translated into a short sequence of Polygon zk-ASM instructions [10].
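Schematically, the translation step looks like the sketch below. The micro-op names here are invented for illustration and are not Polygon's real zk-ASM mnemonics (those are documented in [9][10]):

```python
# Hypothetical opcode -> micro-op mapping; all names on the right are
# invented for illustration, NOT Polygon's actual zk-ASM syntax.
EVM_TO_MICRO = {
    "ADD": ["POP_TO_A", "POP_TO_B", "FIELD_ADD_A_B", "PUSH_RESULT"],
    "EQ":  ["POP_TO_A", "POP_TO_B", "FIELD_EQ_A_B", "PUSH_RESULT"],
}

def translate(evm_opcodes):
    """Expand a list of EVM opcodes into prover-friendly micro-ops."""
    return [m for op in evm_opcodes for m in EVM_TO_MICRO[op]]

print(translate(["ADD", "EQ"]))
```

The point is that the prover constrains each micro-op (as in the constraint sketch earlier), so a valid proof of the micro-op trace is a valid proof of the original EVM execution.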

Because Polygon zk-EVM's sleight of hand happens at the Assembly level, it is two levels removed from the code that the average Ethereum programmer touches, the Solidity level. This is why most developers can port EVM code built for the Ethereum mainnet directly over to the Polygon zk-EVM. Furthermore, because Polygon zk-EVM keeps Ethereum's tech stack intact down to the opcode level, all of the debugging infrastructure that relies on analyzing compiled opcodes remains usable. This is unlike some other zk-EVM designs, such as zkSync, which do not provide zk-Proofs at the opcode level. Thus, even though Polygon invents and proves its own Assembly language, Vitalik writes that "it can still verify EVM code, it just uses some different internal logic to do it" [11].

Beyond Rollups: zk-WASM

zk-EVMs are by no means the only application of zk-ASM. Recall our earlier assertion that Assembly languages are essentially "the mother of all languages," and that creating a zk-ASM unlocks zk-Proofs for generic programs written in any language that compiles to that Assembly language. WebAssembly, or WASM, is one of the most important emerging assembly languages. First published in 2018, WASM aims to be an Assembly language that increases the execution speed of Web apps and provides an execution complement to JavaScript, the primary programming language of the Web [12].

Essentially, as the Web has developed over the years, the growing size and complexity of Web apps has meant that browsers are often painfully slow to compile everything written in JavaScript, relying on complex compile-optimize-reload cycles [12]. WebAssembly removes much of this reliance on complex browser execution engines by providing a portable, modular, and easily executable assembly language. Furthermore, as an Assembly target, WASM allows programmers to write code snippets in C, C++, Rust, Java, or Ruby that run natively in a browser. WASM has therefore become a technology of choice for "providing distributed serverless functions" [13].

So why and how do zk-SNARKs come into the picture? WASM is unique in that it is a client-side technology, able to directly interact with user inputs and data. Because oftentimes this includes sensitive data such as passwords and personal information, we need a technology that (1) ensures that the program executes correctly, and that (2) our sensitive information is not leaked. As described above, a zk-SNARK is a perfect solution to solve both these issues, and is thus an important puzzle piece in securing WASM [14].

While work on developing zk-WASM is still in its early stages, some projects have recently released prototype zk-SNARK circuits for WebAssembly. For example, Delphinus Lab's "ZAWA" zk-SNARK emulator presents a method to encode the operands and semantics of a WASM virtual machine into an arithmetic circuit, which allows it to conduct zk-SNARK proofs [13]. Over time, zk-WASM circuits will undoubtedly be further optimized, allowing programs written in generic languages (such as C, C++, Rust, and Ruby) to adopt the zk-Proof paradigm.

Conclusion

Throughout this essay, we have explored the theoretical underpinnings of zk-ASM and examined two paradigmatic case studies: Polygon's use of zk-ASM to create an opcode-level zk-EVM, and the application of zk-SNARKs to WebAssembly to create zk-WASM. Ultimately, the promise of zk-ASM is to combine the interoperability and scale of Web 2 with the trustlessness and security of Web 3.

On the one hand, blockchains increasingly look to scale beyond their current throughput bottlenecks and to support the execution of general-purpose programs; on the other, Web 2 methods have come increasingly under attack for inadequately protecting user data and privacy. As programmers become able to employ Web 3 design paradigms in their Web 2 code and to bring Web 2 languages onto the blockchain, generic zk-ASMs may represent a merging point between the worlds of Web 2 and Web 3 [15]. It is in this sense that zk-ASM may allow us to reimagine a more secure and trustless Internet.

🐩 @0xfishylosopher

📅 17 December 2022

Disclaimer: the information presented above is purely educational, does not constitute financial advice, and represents the views of the author alone. Delphinus Lab is a portfolio company of Web3.com Ventures.

References

[1] https://z.cash/technology/zksnarks/

[2] https://cs251.stanford.edu/lectures/lecture14.pdf

[3] https://www.britannica.com/science/fundamental-theorem-of-algebra

[4] Building efficient SNARKs: https://cs251.stanford.edu/lectures/lecture15.pdf

[5] Example from: https://www.tutorialspoint.com/assembly_programming/assembly_loops.htm

[6] https://en.wikipedia.org/wiki/Assembly_language

[7] https://www.alchemy.com/overviews/zkevm

[8] For list of opcodes: https://ethereum.org/en/developers/docs/evm/opcodes/

[9] https://wiki.polygon.technology/docs/zkEVM/zkASM/introduction

[10] https://wiki.polygon.technology/docs/zkEVM/zkASM/some-examples

[11] https://vitalik.ca/general/2022/08/04/zkevm.html

[12] https://blog.developer.adobe.com/understanding-webassembly-wasm-d5b592208ecc

[13] https://jhc.sjtu.edu.cn/~hongfeifu/manuscriptb.pdf

[14] https://hyperoracle.medium.com/zkwasm-the-next-chapter-of-zk-and-zkvm-471038b1fba6

[15] https://delphinuslab.com/zk-wasm/
Crypto Insurance: A Blue OceanDeFi Security In October, the security of on-chain assets regained much attention since well-known protocols such as BNB Cross Chain Bridge, OmniX NFT Platform, and Team Finance were getting hacked. It seems that the endless hacking attacks in crypto have long been commonplace. Early this year, Chainalysis published a research report about on-chain crimes, showing that crypto protocols have become a new playground for digital thieves. In the past year, $3.2B worth of cryptocurrencies were stolen from the crypto market, and the victims are disproportionately in DeFi. About 97% of all cryptocurrencies stolen in Q1 2022 were taken from DeFi protocols. Code exploits are becoming the most common vector of attacks on DeFi protocols, while the Flash Loan attacks gradually disappear. Hackers are now focusing more on getting access to victims’ private keys or manipulating the token prices by taking advantage of faulty codes in the protocols. In the current DeFi market, on-chain security businesses mainly focus on providing auditing and monitoring services to protocols. Institutions like Certik, Hacken, and Slowmist have established the industry standard and user confidence to a certain extent. However, even though the vast majority of smart contracts were audited before they go live on the main net, nearly 30% of code exploits still occurred on platforms with the audited smart contracts last year. The auditing service can only prevent most attacks caused by faulty code, but not all. A more robust approach is needed to protect the security of the user fund. The crypto market currently lacks a standardized risk control process in terms of asset protection after the protocol got attacked. Many attacked protocols rely on help from centralized exchanges and stable coin issuers to track & recover lost, and the different operational capabilities between projects make the security of on-chain assets even worse. 
An industry standard for after-attack services needs to be established to provide a standardized recovery process that can greatly decrease the operational risk of the on-chain assets and form a comprehensive protection for protocols. In traditional finance, the protection for the company’s property comes from the commercial insurance purchased, but the commercial insurance for on-chain properties is still underdeveloped. Recent research has shown that less than 3% of crypto assets are insured and the current TVL of the on-chain insurance protocol is only about $275.55M. Centralized Crypto Insurance Goldman Sachs published their eleventh annual Insurance Survey this June, which included the views of 328 Chief Investment Officers and Chief Financial Officers from the global insurance industry about their asset allocation decisions. There have 6% of insurers ranked Cryptocurrencies as their first choice when considering portfolio construction. Although most traditional insurance companies claim that they do not have cryptocurrencies, it is undeniable that they have shown a strong interest in the development of the crypto market Evertas, a pioneer in the cryptocurrency insurance industry founded in 2017, was licensed by Lloyd’s of London early this year. This represents further cooperation of the traditional insurance company with the crypto market and the efforts by the crypto insurance industry on their road to compliance. In addition to supporting common commercial insurance like mining property protection, for the on-chain protocols, Evertas launched their smart contract insurance for DeFi. This insurance will provide coverage for malfunctioning smart contracts and associated loss of assets. For security breach issues, it also provides coverage for private key loss and theft cases. Evertas can now provide crypto insurance for users in the UK, Australia, and HK through the continuous expansion of its supporting area. 
However, due to the lack of large insurance institutions, the centralized crypto insurance industry is still in a very early stage and can cover only a very small part of the roughly $4B cryptocurrency loss in 2021. Decentralized Crypto Insurance The decentralized crypto insurance protocols are combining traditional insurance products with blockchain technology and DAO ideas to improve the transparency and efficiency of their services. Nexus Mutual Nexus Mutual is the best-performed DeFi insurance protocol in the current crypto market, which accounts for over 67% of the entire on-chain insurance industry. Unlike commercial insurance in traditional finance, its mechanism is closer to the elder mutual insurance model that appeared in 1752 when Benjamin Franklin established the Philadelphia Contributionship for the Insurance of Houses from Loss by Fire. The mutual insurance company is owned by its policyholders, and any profits earned are either retained within the company or rebated to policyholders in the form of dividends or reduced future premiums Nexus Mutual currently provides three types of cover Yield Token Cover: Protects against yield-bearing token de-pegging Protocol Cover: Protect against a hack on a specific protocol Custody Cover: protects against halted withdrawals and haircuts on your funds stored on centralized exchanges. The insurance premium of Nexus Mutual comes from the fund pool contributed by investors. Investors deposit $NXM into pools that they think it is safe or they are willing to bear the insurance cost and obtain the corresponding profit generated. If any claim was initiated and verified by the community, funds in the corresponding pool will be delivered to the claimer in the form of $ETH/$DAI while the $NXM will be destroyed. 
The mechanism design of Nexus Mutual also draws on the model of the popular liquidity pool, that is, investors sell $ETH to the protocol for $NXM, and $NXM is like the LP tokens in DEXs, which represents the liquidity share provided by the investors and is also its income warrant. Its tokenomics, or the token value capture, $NXM adopts the following formula. Firstly, due to its $ETH-denominated nature, $NXM holders naturally have exposure to the price volatility of $ETH. In the formula above, we can find that the price of $NXM is mainly affected by the minimum capital required (MCR), and its ratio to the total amount of funds in the pool. MCR represents the popularity of the underlying insurance, that is, demand, which supports the token price in the long run. On the other hand, the changes in its ratio can greatly influence the price performance of $NXM in the short term. For example, after a claim is paid, the MCR will remain unchanged, and the rapid growth of MCR% will make the price of $NXM quickly rise. From the mechanisms introduced above, we can see that decentralized insurance protocols use blockchain technology to improve fairness and reduce friction costs compared to traditional commercial insurance. Nexus Mutual’s competitor InsurAce, which has a very similar system to Nexus Mutual, has successfully paid out a total of $11.7M to 155 victims of the previous Terra crash in less than a month. With a more democratized and transparent insurance process, decentralized insurance protocols can hold stronger confidence from crypto-native investors. Current Dilemma and Prospects Decentralized crypto insurance protocols and centralized crypto insurance companies have taken different paths in their way of protecting on-chain assets. 
on-chain crypto insurance protocols are focusing more on expanding their coverage to deal with the booming of DeFi market, and to meet the various needs of customers, while the centralized crypto insurance companies are focusing more on collaboration with traditional insurance institutions and policy compliances around the world. It’s still hard to tell what the future of crypto insurance will look like, but the current decentralized crypto insurance business model has encountered some challenges. Firstly, although the mutual insurance model was widely adopted in on-chain crypto insurance protocols, the biggest attraction for user funds is still its tasty rate of yield. Since there are many speculators in the crypto market, its source of insurance premiums is unstable, and the lack of stable insurance premiums and interest rates generator can be a huge problem that hinders the development of the decentralized crypto insurance businesses. In addition to that, the stricter KYC requirement is also a problem. For now, complex KYCs are breaking up the decentralization of on-chain insurance protocols and limiting the number of investors. It is worth noting that compliance is beneficial to the adoption rate of the crypto, but it also brings investors concerns about the security of their KYC data. On the plus side, the cooperation between centralized crypto insurance companies and traditional insurance institutions will help educate employees and their customers, which can bring more investors to crypto. At present, the crypto insurance industry only provides coverage for on-chain security risks. There are still no corresponding countermeasures for emerging risks like governance attacks. But with such a blue ocean, I believe that there will be projects to explore more possibilities of crypto insurance in the near future. 🐩 @Soxpt50 📅 17 November 2022 Disclaimer: This research is for information purposes only. 
It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision. Links: [1]https://nexusmutual.io/assets/docs/nmx_white_paperv2_3.pdf [2]https://www.gsam.com/content/gsam/us/en/institutions/market-insights/gsam-insights/2022/Insurance_Survey_2022.html [3]https://go.chainalysis.com/2022-Crypto-Crime-Report.html

Crypto Insurance: A Blue Ocean

DeFi Security

In October, the security of on-chain assets regained attention as well-known protocols such as the BNB cross-chain bridge, the OmniX NFT platform, and Team Finance were hacked. Endless hacking attacks seem to have long been commonplace in crypto.

Early this year, Chainalysis published a research report about on-chain crimes, showing that crypto protocols have become a new playground for digital thieves. In the past year, $3.2B worth of cryptocurrencies were stolen from the crypto market, and the victims are disproportionately in DeFi. About 97% of all cryptocurrencies stolen in Q1 2022 were taken from DeFi protocols.

Code exploits are becoming the most common attack vector against DeFi protocols, while flash loan attacks are gradually disappearing. Hackers now focus more on gaining access to victims' private keys or manipulating token prices by taking advantage of faulty code in the protocols.

In the current DeFi market, on-chain security businesses mainly focus on providing auditing and monitoring services to protocols. Institutions like Certik, Hacken, and SlowMist have, to a certain extent, established the industry standard and user confidence. However, even though the vast majority of smart contracts were audited before going live on mainnet, nearly 30% of code exploits last year still occurred on platforms with audited smart contracts. Auditing can prevent most attacks caused by faulty code, but not all. A more robust approach is needed to protect user funds.

The crypto market currently lacks a standardized risk-control process for protecting assets after a protocol is attacked. Many attacked protocols rely on help from centralized exchanges and stablecoin issuers to track and recover lost funds, and the differing operational capabilities between projects make the security of on-chain assets even worse.

An industry standard for after-attack services needs to be established: a standardized recovery process would greatly decrease the operational risk of on-chain assets and form comprehensive protection for protocols.

In traditional finance, protection for a company's property comes from the commercial insurance it purchases, but commercial insurance for on-chain property is still underdeveloped. Recent research has shown that less than 3% of crypto assets are insured, and the current TVL of on-chain insurance protocols is only about $275.55M.

Centralized Crypto Insurance

Goldman Sachs published its eleventh annual Insurance Survey this June, covering the views of 328 Chief Investment Officers and Chief Financial Officers from the global insurance industry on their asset-allocation decisions. 6% of insurers ranked cryptocurrencies as their first choice when considering portfolio construction. Although most traditional insurance companies claim that they do not hold cryptocurrencies, it is undeniable that they have shown a strong interest in the development of the crypto market.

Evertas, a pioneer in the cryptocurrency insurance industry founded in 2017, was licensed by Lloyd's of London early this year. This represents further cooperation between traditional insurance companies and the crypto market, and the crypto insurance industry's efforts on its road to compliance.

In addition to supporting common commercial insurance such as mining-property protection, Evertas launched smart contract insurance for DeFi protocols. This insurance covers malfunctioning smart contracts and the associated loss of assets. For security breaches, it also covers private key loss and theft. Through the continuous expansion of its service area, Evertas can now provide crypto insurance to users in the UK, Australia, and Hong Kong.

However, due to the lack of large insurance institutions, the centralized crypto insurance industry is still at a very early stage and can cover only a very small part of the roughly $4B in cryptocurrency losses in 2021.

Decentralized Crypto Insurance

Decentralized crypto insurance protocols combine traditional insurance products with blockchain technology and DAO ideas to improve the transparency and efficiency of their services.

Nexus Mutual

Nexus Mutual is the best-performing DeFi insurance protocol in the current crypto market, accounting for over 67% of the entire on-chain insurance industry. Unlike commercial insurance in traditional finance, its mechanism is closer to the older mutual insurance model that appeared in 1752, when Benjamin Franklin established the Philadelphia Contributionship for the Insurance of Houses from Loss by Fire. A mutual insurance company is owned by its policyholders, and any profits earned are either retained within the company or rebated to policyholders in the form of dividends or reduced future premiums.

Nexus Mutual currently provides three types of cover:

Yield Token Cover: Protects against yield-bearing tokens de-pegging

Protocol Cover: Protects against a hack of a specific protocol

Custody Cover: Protects against halted withdrawals and haircuts on funds stored on centralized exchanges

The insurance premiums of Nexus Mutual come from fund pools contributed by investors. Investors deposit $NXM into pools that they think are safe, or whose insurance risk they are willing to bear, and obtain the corresponding profit generated. If a claim is initiated and verified by the community, funds in the corresponding pool are delivered to the claimant in the form of $ETH/$DAI, while the $NXM is destroyed.
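As a toy model of this stake-and-claim flow (all numbers and names invented for illustration; this is not Nexus Mutual's actual contract logic):

```python
class ToyMutualPool:
    """Toy mutual cover pool: stakers back cover with ETH and hold share
    tokens; an approved claim pays out of the pool and burns shares."""

    def __init__(self):
        self.eth_pool = 0.0    # ETH backing the cover
        self.nxm_staked = 0.0  # staked share tokens

    def stake(self, eth_in, nxm_minted):
        self.eth_pool += eth_in
        self.nxm_staked += nxm_minted

    def pay_claim(self, eth_out):
        # Burn staked tokens pro rata to the payout (simplified).
        burned = self.nxm_staked * (eth_out / self.eth_pool)
        self.eth_pool -= eth_out
        self.nxm_staked -= burned
        return burned

pool = ToyMutualPool()
pool.stake(eth_in=100.0, nxm_minted=100.0)
burned = pool.pay_claim(10.0)
print(round(burned, 6), round(pool.eth_pool, 6))  # 10.0 90.0
```

The simplification to note is that real payouts are governed by claim assessment votes, not an automatic pro-rata burn.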

The mechanism design of Nexus Mutual also draws on the model of the popular liquidity pool: investors sell $ETH to the protocol for $NXM, and $NXM acts like the LP tokens in DEXs, representing the liquidity share provided by the investor and serving as the income warrant.

For its tokenomics, or token value capture, $NXM adopts a bonding-curve pricing formula.
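As a rough sketch of the bonding curve described in the Nexus Mutual whitepaper [1] (the exact shape and constants here are my recollection, not authoritative):

```latex
P_{\text{NXM}} = A + \frac{\mathrm{MCR}}{C} \times \mathrm{MCR\%}^{4}
```

where A and C are calibration constants, MCR is the minimum capital requirement (in ETH), and MCR% is the ratio of the capital pool to the MCR.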

Firstly, due to its $ETH-denominated nature, $NXM holders naturally have exposure to the price volatility of $ETH. From the formula, we can see that the price of $NXM is mainly affected by the minimum capital requirement (MCR) and by the ratio of the total funds in the pool to the MCR (MCR%). MCR represents the popularity of, that is, the demand for, the underlying insurance, which supports the token price in the long run. On the other hand, changes in MCR% can greatly influence the price performance of $NXM in the short term. For example, after a claim is paid, the MCR will remain unchanged, and rapid growth of MCR% will make the price of $NXM rise quickly.
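To illustrate the short-term sensitivity numerically (using an assumed Nexus-style curve; the constants and the MCR figure below are illustrative values, not authoritative):

```python
def nxm_price(mcr_eth, mcr_pct, A=0.01028, C=5_800_000):
    """Assumed bonding curve: price = A + (MCR / C) * MCR%^4.
    A, C, and the MCR input are illustrative, not authoritative."""
    return A + (mcr_eth / C) * mcr_pct**4

# With MCR fixed, raising MCR% from 1.0 to 1.2 multiplies the
# fourth-power term by 1.2**4 ≈ 2.07, so the price moves sharply.
p1 = nxm_price(162_000, 1.00)
p2 = nxm_price(162_000, 1.20)
print(p2 > p1)  # True
```

The fourth-power term is what makes the short-term MCR% swings dominate the price, while the MCR level sets the long-run floor.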

From the mechanisms introduced above, we can see that decentralized insurance protocols use blockchain technology to improve fairness and reduce friction costs compared to traditional commercial insurance. Nexus Mutual's competitor InsurAce, which has a very similar system, successfully paid out a total of $11.7M to 155 victims of the Terra crash in less than a month. With a more democratized and transparent claims process, decentralized insurance protocols can command stronger confidence from crypto-native investors.

Current Dilemma and Prospects

Decentralized crypto insurance protocols and centralized crypto insurance companies have taken different paths in their way of protecting on-chain assets.

On-chain crypto insurance protocols are focusing more on expanding their coverage to deal with the booming DeFi market and to meet the various needs of customers, while centralized crypto insurance companies are focusing more on collaboration with traditional insurance institutions and on policy compliance around the world.

It is still hard to tell what the future of crypto insurance will look like, but the current decentralized crypto insurance business model has encountered some challenges. Firstly, although the mutual insurance model is widely adopted among on-chain crypto insurance protocols, the biggest attraction for user funds is still the attractive yield. Since there are many speculators in the crypto market, the source of insurance premiums is unstable, and the lack of a stable generator of premiums and interest income is a major problem hindering the development of decentralized crypto insurance businesses.

In addition, stricter KYC requirements are also a problem. For now, complex KYC is eroding the decentralization of on-chain insurance protocols and limiting the number of investors. It is worth noting that compliance benefits crypto adoption, but it also raises investor concerns about the security of their KYC data. On the plus side, cooperation between centralized crypto insurance companies and traditional insurance institutions will help educate employees and customers, which can bring more investors into crypto.

At present, the crypto insurance industry only covers on-chain security risks; there are still no corresponding countermeasures for emerging risks such as governance attacks. But in such a blue ocean, I believe projects will explore many more possibilities for crypto insurance in the near future.

🐩 @Soxpt50

📅 17 November 2022

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

Links:

[1] https://nexusmutual.io/assets/docs/nmx_white_paperv2_3.pdf

[2] https://www.gsam.com/content/gsam/us/en/institutions/market-insights/gsam-insights/2022/Insurance_Survey_2022.html

[3] https://go.chainalysis.com/2022-Crypto-Crime-Report.html

Sui-nnovation

Web3.com Ventures Original Research Analysis

0xFishylosopher

Introduction

The Sui blockchain is one of the major new L1 projects utilizing the Move language, and its technological developments have attracted significant media and capital attention [1]. Compared with many other new L1 blockchains, particularly Aptos, Sui boasts a whole set of experimental designs, technological innovations, and unique tokenomics, aiming to provide a blockchain that “scales horizontally” with virtually unlimited upside [2]. In this article, I will walk through Sui’s primary technological breakthroughs and its unique tokenomics model, before providing a critical assessment of Sui’s potential as a scalability-focused L1 chain.

Sui Technical Principles

Sui makes a wide variety of technical innovations, all of which aim to optimize transaction speed and throughput, particularly for simple peer-to-peer transactions. In this section, I will focus on three of the most prominent: the modification of the Move language to create Sui Move, the handling of “simple transactions” without consensus, and the unique Narwhal-Tusk consensus engine.

Sui Move

First, unlike Aptos and other Move-language projects that take Meta’s original Move language (developed for the Libra/Diem project) “out-of-the-box,” Sui uses its own custom version of the language, Sui Move, to write its smart contracts. Move, as a resource-based smart-contract language implemented using ability-enabled structs, is naturally optimized for parallel execution. But while the original Diem-Move language was developed with a “permissioned system” in mind, where access control was strictly enforced, Sui’s vision of mass adoption necessitates a much more open system in order to accelerate onboarding. To this end, Sui Move adds new primitive functions, such as the transfer function, that allow for “frictionless mass asset creation” [3].
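
To make the ownership model concrete, here is a minimal sketch in Python (not actual Move code) of the single-owner object model that a primitive like transfer operates on. All names and types here are illustrative assumptions, not Sui’s real API.

```python
# Illustrative sketch (Python, not Move) of the owned-object model behind
# Sui Move's `transfer` primitive: every object has exactly one owner, and
# transferring an object simply reassigns that owner. Names are hypothetical.

from dataclasses import dataclass

@dataclass
class Coin:
    id: int
    value: int
    owner: str   # single-writer: only one owner may mutate or send this object

def transfer(obj: Coin, recipient: str, signer: str) -> None:
    """Reassign ownership; only the current owner may do so."""
    if obj.owner != signer:
        raise PermissionError("only the owner can transfer this object")
    obj.owner = recipient

coin = Coin(id=1, value=100, owner="alice")
transfer(coin, recipient="bob", signer="alice")
assert coin.owner == "bob"
```

Because each object has a single owner, two transfers of different objects can never conflict, which is what makes the parallel, consensus-free handling discussed below possible.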

Sui’s other modifications to the Move language include a clean-up of Move’s typing system to allow for a much smoother programming experience. By allowing native-typed asset transfers and ownership, developers do not need to use “wrapper patterns” as in the traditional Move language. Sui Move’s optimization of resource “typing” also allows different-typed assets to be bundled together into a larger container of assets, thus extending the functionality of custom-defined types and allowing for a cleaner polymorphism structure [3].

Simple “Single-Writer” Transactions

Sui’s handling of simple “single-writer” transactions is arguably the project's most important innovation, and the key enabler of the project’s scalability. Single-writer transactions, as the name suggests, include common blockchain transactions such as peer-to-peer payments and NFT transfers where there is only one sender involved. Sui states that it “optimizes for single-writer objects, allowing a design that forgoes consensus for simple transactions” [4]. This seems contradictory to the principles of blockchain design — after all, consensus is the crucial step in making sure the data sent over the blockchain is secure. So how does Sui handle data security without consensus?

The key here is that Sui uses an algorithm known as “Byzantine Consistent Broadcast,” a simpler algorithm that eliminates the overhead of traditional consensus approaches while still guaranteeing safety and liveness under Byzantine adversarial conditions (basically the standard security requirement for consensus). This algorithm is actually fairly similar to the Transport Layer Security (TLS) protocol traditionally used by webpages to establish a secure connection [5].

In a traditional consensus mechanism, there is a “fire-and-forget” approach for the sender, where after submitting a transaction to the validators, the sender does nothing more with the transaction, and the validators do all the work [6]. This is because validators need to check that the transaction does not conflict with the other transactions sent in by other clients. But in the case of a simple “single-writer” operation, we already know that there will not be other conflicting writes from other senders. Therefore, we can offload the computational burden of tallying votes to the sender, rather than use up valuable validator computation.

More specifically, simple transactions undergo the following four steps [6]:

The sender sends transaction data to the validators, and keeps a record that it has sent this transaction

The validators send their (proof-of-stake weighted) votes back to the sender

The sender tallies up all of the votes received from validators, and once the tally crosses the validation threshold, creates a “verification certificate” that is sent back to the validators

The validators receive this certificate, check validity and instantly finalize this transaction.
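
The four steps above can be sketched as a toy simulation of the sender-side tally. This is a hedged illustration only, not Sui’s actual implementation: the 2/3-of-stake quorum rule and all names are assumptions made for the sketch.

```python
# Toy model of Sui's single-writer flow (Byzantine Consistent Broadcast):
# the SENDER, not the validators, tallies stake-weighted votes and
# assembles a certificate once a quorum of stake is reached.
# All names and the 2/3 threshold here are illustrative assumptions.

from dataclasses import dataclass

@dataclass(frozen=True)
class Vote:
    validator: str
    stake: int        # proof-of-stake weight of this validator
    signature: str    # stand-in for a real cryptographic signature

def collect_certificate(tx: str, votes: list, total_stake: int):
    """Sender-side tally: return a certificate once votes cross 2/3 of stake."""
    gathered, voted = [], 0
    for v in votes:                      # step 2: validators vote, sender receives
        gathered.append(v)
        voted += v.stake
        if 3 * voted > 2 * total_stake:  # step 3: quorum reached -> certificate
            return {"tx": tx, "votes": list(gathered)}
    return None                          # not enough stake voted: no certificate

votes = [Vote("v1", 30, "sig1"), Vote("v2", 30, "sig2"), Vote("v3", 15, "sig3")]
cert = collect_certificate("send 5 SUI from A to B", votes, total_stake=100)
# With 75/100 of stake voting, the 2/3 threshold is crossed at the third vote.
assert cert is not None and len(cert["votes"]) == 3
```

In step 4, validators would then check the certificate’s signatures and finalize; the point of the sketch is that the expensive tallying loop runs on the sender, not on any validator.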

Because the validators do not need to communicate with each other (as in traditional consensus), this ingenious algorithm means that simple transactions impose very little computational cost on validators in the Sui network, while the certificate mechanism also provides these transactions with a fair level of robustness on the security side.

More importantly, this algorithm is highly scalable: the Sui network theoretically does not have an upper bound on the number of simple transactions it can process. Since the main computation of tallying up validator votes is done on the client side, as more and more people use the Sui network, the validators do not face the same computational pressure as on Ethereum, Solana, or even Aptos (the users essentially do the computation themselves). This is why the creators of Sui advertise the blockchain as qualitatively different from other blockchains in that it uses a “horizontal scaling” strategy [7].

Narwhal-Tusk Consensus Engine

Of course, while Sui may optimize for simple single-writer transactions, it also has a more “traditional” global consensus mechanism used for more complex transactions and to periodically checkpoint the global state of the blockchain. This consensus engine, leveraging graph (more specifically DAG) data structures, is also a cutting-edge consensus mechanism in and of itself. There are two primary components to Sui’s consensus engine [8]:

Narwhal, Sui’s mempool, or basically a butler keeping pending consensus transactions in check

Tusk (and its successor Bullshark), protocols that ensure the ordering of the consensus transactions

Narwhal essentially takes a bunch of pending transactions and “graph-ifies” them while they wait to be processed [9]. Instead of rolling transactions up into confirmation blocks as in traditional blockchain consensus, Narwhal stamps transactions with a version number, as well as with pointers to the version numbers of previous transactions, thereby creating a directed acyclic graph (DAG) that is a kind of file tree. This DAG structure is quite similar to Arweave’s notion of a “blockweave,” which turns the 1-dimensional, linked-list-like structure of a “blockchain” into a file-tree-like graph in order to allow for more efficient storage of data [10].
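
A toy sketch of the idea: transaction batches are stamped with version numbers and parent pointers, forming a DAG whose causal history can be walked back from any batch. This is an illustrative model only, not Narwhal’s real interface; every name below is invented.

```python
# Toy Narwhal-style mempool: each batch gets a version number plus pointers
# to earlier batches, forming a DAG rather than a single chain of blocks.
# Purely illustrative; not Narwhal's actual data structures or API.

class DagMempool:
    def __init__(self):
        self.batches = {}       # version -> {"txs": [...], "parents": [...]}
        self.next_version = 0

    def add_batch(self, txs, parents):
        # Parents must already exist, so they have smaller version numbers;
        # edges therefore always point backwards and the graph stays acyclic.
        assert all(p in self.batches for p in parents)
        v = self.next_version
        self.batches[v] = {"txs": txs, "parents": list(parents)}
        self.next_version += 1
        return v

    def ancestors(self, version):
        """Walk parent pointers: the causal history a consensus round orders."""
        seen, stack = set(), [version]
        while stack:
            v = stack.pop()
            for p in self.batches[v]["parents"]:
                if p not in seen:
                    seen.add(p)
                    stack.append(p)
        return seen

pool = DagMempool()
a = pool.add_batch(["tx1"], parents=[])
b = pool.add_batch(["tx2"], parents=[a])
c = pool.add_batch(["tx3"], parents=[a, b])
assert pool.ancestors(c) == {a, b}
```

A consensus layer like Tusk then only needs to pick an ordering over this already-built graph, which is why the DAG separation reduces communication between validators.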

Tusk is essentially a modified version of the HotStuff consensus protocol, optimized for the DAG structure that Narwhal provides, and focused on reducing the communication overhead between validators on the network. When Narwhal and Tusk are used in conjunction, they can achieve up to “160,000 tx/sec with about 3 seconds latency” under traditional Byzantine Fault Tolerance (BFT) conditions [9]. Thus, Narwhal-Tusk transaction efficiency is already cutting-edge, even without considering the additional optimization that Sui implements for single-writer transactions using the Byzantine Consistent Broadcast algorithm mentioned above.

Sui Tokenomics

In addition to a host of technical innovations aimed at optimizing the composability and scalability of the network, Sui also implements a unique tokenomics model that takes into account the cost of storage on the network [11]. Traditionally, blockchains such as Ethereum do not need to account for the cost of storing data on the network, because the “gas” price to write data onto the blockchain is so high that it naturally deters people from writing large amounts of data. In the case of Sui, however, the network’s high scalability means that gas fees become extremely low, such that people may indeed want to store large amounts of data on-chain.

As shown in the diagram, Sui’s tokenomics model is essentially the traditional tokenomics model of a proof-of-stake chain, with the added twist of the “storage fund.” So what does the “storage fund” solve? A blockchain is fundamentally an append-only data structure, and must guarantee that once data is written to the blockchain, it will always stay there. Suppose we have two users, A and B. User A uses Sui at its inception. Because there isn’t much “legacy data” that validators must maintain, User A enjoys very low gas fees. User B, on the other hand, only uses Sui once the network has matured and there is a significant amount of data on-chain. Because validators need to maintain all of this legacy data, User B would need to pay much higher fees than User A. The “storage fund” is basically a way to even out this problem, making User A pay more in fees so that the two users can use the network at a comparable cost [12].
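
The fee-evening mechanism can be illustrated with a toy model: every writer pays an upfront storage fee into a common fund at write time, and the fund (rather than later users) then pays validators for ongoing upkeep. The fee rates and method names below are invented for illustration; they are not Sui’s actual parameters.

```python
# Toy model of the storage-fund idea: writers pay an upfront storage fee
# into a common fund when they write data; the fund, not future users,
# then pays validators each epoch for the ongoing upkeep of that data.
# All rates, units, and names here are invented for illustration.

FEE_PER_BYTE = 2  # upfront storage fee per byte (invented unit)

class StorageFund:
    def __init__(self):
        self.balance = 0
        self.stored_bytes = 0

    def write(self, user: str, nbytes: int) -> int:
        """A user pays up front for the future upkeep of their own data."""
        fee = nbytes * FEE_PER_BYTE
        self.balance += fee
        self.stored_bytes += nbytes
        return fee

    def epoch_payout(self, rate_per_byte: float) -> float:
        """Each epoch, validators are paid from the fund for data upkeep."""
        payout = min(self.balance, self.stored_bytes * rate_per_byte)
        self.balance -= payout
        return payout

fund = StorageFund()
early_fee = fund.write("user_a", nbytes=1000)  # early user pays for own data
late_fee = fund.write("user_b", nbytes=1000)   # late user pays the same rate
assert early_fee == late_fee                    # fees comparable across time
```

The key design point the sketch captures is that the cost of a write is charged when the data is created, so a user who joins late is not charged for maintaining everyone else’s legacy data.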

Sui’s storage fund pays out to the validators in the system, i.e., those that actually store the blockchain’s data and do the “maintenance work” of data upkeep. As the maintenance work increases, validator payouts also increase, so that more people are incentivized to become validators rather than token delegators who stake their coins and forget about them [13]. As more people become validators, the total amount of storage space available also increases.

Therefore, the Sui tokenomics model is specifically designed to address the question of “what happens next” after high scalability is attained. It is a paradigmatic tokenomics model for a highly scalable chain that uses game theory to ensure that gas prices will remain relatively constant throughout the blockchain’s lifetime.

Evaluation

From Byzantine Consistent Broadcast and the Narwhal-Tusk consensus mechanism to the Sui Move language and the storage-fund tokenomics, Sui is a project with major innovations in every single part of a blockchain stack, a breadth unique even compared to other new L1 chains such as Aptos. All of these innovations compound into a cohesive narrative: an end-to-end optimization for mass adoption at the scale of modern social media and other Web 2 projects.

Out of this whole range of technical innovations, perhaps the biggest highlight of Sui is still its “unlimited” upper bound for single-writer transactions (thanks to the Byzantine Consistent Broadcast algorithm). Projects that heavily leverage single-writer transactions will therefore realize the biggest scalability gains from using the Sui network.

Sui provides several examples of these “single-writer” DApps, including regular peer-to-peer token transactions, a public bulletin board, private messaging, and the mass minting and distribution of game assets [14]. Essentially, the optimization for single-writer transactions means that the most common functions of social media apps can be onboarded onto Sui without any scalability worries, and the full extent of these apps’ network effects can be put into play thanks to Sui’s scalability solutions. One example is ComingChat, which aims to create a holistic instant-messaging service (think Telegram or WhatsApp with wallet integration) and plans to deploy on Sui and Aptos [15].

Moreover, Sui’s technical designs are also able to realize NFTs’ full potential as mass commodities. Right off the bat, Sui solves the price-spike problem common with mass NFT distributions on Ethereum (such as with BAYC). Looking beyond this, by making mass minting affordable and scalable, Sui opens the floodgates for NFTs that have low intrinsic value but high social purpose. From on-chain coupons to decentralized IDs to credit cards and boarding passes, everything that Apple Wallet or any physical wallet can hold can be put on-chain as NFTs thanks to Sui’s scalability solutions.

Taking this one step further, Sui could even act as a permanent, blockchain-based version of MongoDB. Recall that Sui Move’s language features allow Sui to cleanly define and transact “resources” of different custom types, with a great degree of polymorphism and modularity. These can be used to construct MongoDB-like artifacts and allow for a novel way of structurally storing objects on a blockchain, with all of the blockchain’s security and permanence guarantees.

I believe that Sui’s biggest contribution to the space is in constructing a viable blueprint for an L1 blockchain that can handle Web 2-level scalability. More importantly, I believe the development of Sui is symbolic of Web 3’s growing maturity, and of growing confidence in the field that the long run holds the promise of a billion-user scale. In this sense, whether or not the Sui ecosystem thrives in the long run, I believe the project itself, with its innovative re-visioning of the blockchain, already represents a resounding success.

🐩 @0xfishylosopher

📅 9 November 2022

Many thanks to Web3.com Ventures for consultation and sponsorship of this article.

References

[1] https://sui.io/resources-sui/raise-announcement/

[2] https://medium.com/mysten-labs/recap-7-21-sui-ama-unlocking-opportunities-on-sui-with-adeniyi-abiodun-f4fa96a15d7c

[3] https://sui.io/resources-move/why-we-created-sui-move/

[4] https://docs.sui.io/learn/how-sui-works

[5] https://www.cloudflare.com/learning/ssl/transport-layer-security-tls/

[6] https://docs.sui.io/learn/how-sui-works#simple-transactions

[7] https://boxmining.com/sui-blockchain/

[8] https://docs.sui.io/learn/architecture/consensus

[9] https://arxiv.org/abs/2105.11827

[10] https://www.arweave.org/technology

[11] https://docs.sui.io/learn/tokenomics

[12] https://docs.sui.io/learn/tokenomics/storage-fund

[13] More on delegators and validators: https://coinguides.org/validators-delegators-staking/

[14] A list of single-writer app examples: https://docs.sui.io/learn/single-writer-apps

[15] https://coming.chat/
zk-Sync v2.0 and the Future of zk-Rollups

Web3.com Ventures Original Research Analysis

0xFishylosopher

Introduction

“zk-Rollups” has probably been the hottest Web 3 buzzword of the year. With zk-Sync’s v2.0 “baby alpha” mainnet launch in just the past few days, this excitement has reached its zenith [1]. But behind all these buzzwords, what do “zk-Rollups” really refer to? And where does zk-Sync come into play? In this article, I will dive deep into the principles and practice of zk-Rollups, explicate the key technical hallmarks of zk-Sync v2.0 as a project, and explore the potential future implications of this long-awaited technology.

Principles of zk-Rollups

Why do we need zk-Rollups in the first place? Sure, Ethereum is great. But in its current state, the network is fundamentally a diseconomy of scale. As network activity increases, gas prices become prohibitively expensive, particularly if there is a surge of network activity all at once. As Ethereum has gained popularity and traction over the past few years, its limited scalability has become the network’s Achilles’ heel. This is where “rollups” come in: Ethereum rollups are essentially a “plugin” that provides Ethereum with extra magnitudes of scalability, thus fixing its inherent diseconomy of scale.

The intuition behind the idea is simple. Imagine you have 5 items that you need to carry from point A to point B. The “regular” way of doing so would be to carry Item 1, then Item 2, and so on, one after another. But this is obviously slow and cumbersome. A “rollup” essentially “rolls” all 5 items into a single bag, allowing you to make a single trip instead of 5. But there are two caveats:

How do we make sure the rollup can “fit” everything?

How do we make sure the rollup is not spoofed?

zk-Rollups are one of the two leading types of rollup technology (the other being Optimistic Rollups), leveraging “zero-knowledge proofs” to solve these two issues.

To address these issues, a zk-Rollup bundles a certain number of transactions together, does the computation on the L2, and submits both the state changes and a “validity proof” to a verifier on the L1, showing that the computations were done with integrity. This “validity proof” takes the form of a “Zero-Knowledge Proof,” a mathematical way of telling someone you know something without telling them what you know.

A simple example of a Zero-Knowledge Proof is a code autograder (for CS homework). The autograder is a “verifier” that gives you a bunch of randomly generated test cases, and you are a “prover” that must pass all the test cases to prove that you have correct code. All the while, you never share your code with the autograder directly. And voilà, you have just conducted a “Zero-Knowledge Proof,” proving that you know something without saying what you know [2].

The code autograder above uses an “interactive” Zero-Knowledge Proof, where the autograder and code provider directly “interact” with one another. In contrast, most zk-Rollups use a more mathematically complicated non-interactive proof (such as a zk-SNARK, or Zero-Knowledge Succinct Non-interactive ARgument of Knowledge), which saves both time and space compared to an interactive proof. While the technical details of zk-SNARKs are beyond the scope of this article, the underlying principle of test-case passing is the same.

The holy grail of zk-Rollups is a Zero-Knowledge Ethereum Virtual Machine (zk-EVM) that allows developers to port any Ethereum smart contract onto a zk-Rollup chain without modification. But this is hard. Because every “problem” requires a different set of “test cases,” developing a “proof algorithm” that can handle every imaginable test case is a technical bottleneck of Zero-Knowledge Proofs and zk-Rollups.
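
The autograder analogy can be sketched as a small interactive protocol: the verifier issues random challenges, the prover answers using its secret solution, and the verifier checks the answers against its own reference without ever seeing the prover’s code. This is a hedged toy illustration of the interactive structure only; a real zero-knowledge proof requires cryptographic machinery this sketch deliberately omits, and all names below are invented.

```python
# Toy version of the autograder analogy: the verifier sends random test
# cases (challenges), the prover answers using its secret solution, and
# the verifier checks the answers without ever seeing the prover's code.
# Illustrates the interactive structure only; not a real zero-knowledge
# proof, which needs cryptography this sketch intentionally leaves out.

import random

def reference_solution(x: int) -> int:
    return x * x          # the verifier's own trusted answer key

class Prover:
    def __init__(self):
        self._secret_code = lambda x: x * x  # never revealed to the verifier

    def answer(self, challenge: int) -> int:
        return self._secret_code(challenge)

def verify(prover: Prover, rounds: int = 20) -> bool:
    rng = random.Random(42)   # fixed seed so the run is reproducible
    for _ in range(rounds):
        challenge = rng.randrange(1_000_000)
        if prover.answer(challenge) != reference_solution(challenge):
            return False      # one wrong answer: proof rejected
    return True               # all challenges passed: accept with confidence

assert verify(Prover()) is True
```

A non-interactive proof such as a zk-SNARK compresses this whole back-and-forth into a single short proof object, which is what makes it practical to post on-chain.
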
As Vitalik Buterin himself states: In general, my own view is that in the short term, optimistic rollups are likely to win out for general-purpose EVM computation and ZK rollups are likely to win out for simple payments, exchange and other application-specific use cases, but in the medium to long term ZK rollups will win out in all use cases as ZK-SNARK technology improves. [3] Thus, historically zk-Rollups have only been established technologies for application-specific use cases, where “test-cases” are well-defined and limited in scope. However, several projects are rapidly advancing towards the “castle on the hill” — a generic EVM-compatible zk-Rollup algorithm. [4] zk-Sync v2.0 zk-Sync v2.0 is just one of the many projects in currently in play in developing a zk-EVM (others include StarkNet, Polygon Hermez, and Scroll). Unlike zk-Sync v1.0, which required users to re-build large sections of their codebases to port from the EVM to zk-Sync, in zk-Sync v2.0 programmers can deploy their applications with little-to-no changes — or as zk-Sync may like to claim. In practice, not all zk-EVMs are created equal. There is a distinct tradeoff between composability (how close it is to original EVM contracts) and performance (how fast the zk-Rollups will run) [6]. Within this tradeoff, zk-Sync chose to completely optimize for performance, thus sacrificing on composability. In Vitalik Buterin’s perspective, there are four distinct types of zk-EVMs, summarized in the following chart: As Vitalik states, in its current state zk-Sync v2.0 is a Type 4 zk-EVM, which is able to compile contracts written in Solidity and high-level languages using its own compiler, which is distinct from the EVM. Because zk-Sync has full control over the design of its compiler, they are able to aggressively optimize for speed and throughput. The cost of this is that some DApps and EVM debugging toolchains may be incompatible with zk-Sync v2.0. 
Essentially, zk-Sync is the same car shell as Ethereum but with an engine swapped out [5]. Indeed, in its developer documentation Matter Labs claims that while smart contract “read” operations can be integrated without any changes in code, smart contract “write” operations need “additional code” because of “fundamental differences between L1 and L2” [6]. Actually, this is slightly misleading. It is not so much due to a “fundamental difference” between L1 and L2 but more so because of the type of zk-Rollup that Matter Labs has decided to pursue — the Type 4 rollup. Because zk-Sync is fundamentally a Type 4 rollup that uses a different compiler and bytecode, this means that smart contracts have different addresses, and the debugger infrastructure that relies on the analysis of bytecode may not be able to work on zk-Sync v2.0 [7]. In the future, zk-Sync may add in more native support for EVM byte-code, allowing the system to slowly transition to a Type 3 rollup that supports a wider range of these “edge cases.” But for zk-Sync’s Type 4 or Type 3 zk-Rollup to succeed in comparison with Polygon Hermez and Scroll Labs’ Type 2 rollup, which essentially trades speed for wider compatibility, there must be two important preconditions. First, there is only a tiny fraction of unimportant projects that are incompatible with zk-Sync’s custom compiler. Secondly, there is a qualitative difference in zk-Sync’s execution speed compared with a Type 2 zk-EVM. Unfortunately, I personally believe this is unlikely to be the case. Any advanced development ecosystem relies on a mature “scaffolding” infrastructure, including convenient, modularized, debugging and testing tools. If, as Vitalik postulates, a large portion of EVM-native debugging tools will be unable to port to zk-Sync because of differences in bytecode, then zk-Sync will have to develop its own suite of testing and debugging tools. 
This is extra overhead that may ultimately hinder the adoption speed of zk-Sync as a L2 solution, compared to its more composable Type 2 zk-EVM competitors such as Polygon Hermez and Scroll. The Future for zk-Rollups With many competitive players in the battle for zk-EVMs, arguably it is only a matter of time before we see a fully-functional zk-EVM. But what’s next? A road is only useful so long as there are buildings on the road; the long-term strength of a zk-Rollup comes from the projects using that solution. Right now, DeFi, GameFi, and mobile applications are the main beneficiaries of zk-Rollup infrastructure. Both DeFi and GameFi are fundamentally economies of scale, as they thrive in an environment where there are lots of people are using them. Mobile applications such as mobile wallets also open the floodgates to the mass consumer that is too lazy (or can’t afford) a Desktop PC. Using zk-Rollups for these situations therefore makes a lot of sense. But this is by no means the limit of zk-Rollups usefulness. If anything, this is just the beginning. zk-Rollups are to Ethereum what 5G is to the Internet. Just as 5G can enable a new world of IoT applications and systems, zk-Rollups may also open the floodgates to a “Blockchain of Things,” allowing the digital appliances of our physical world — fridges, watches, traffic lights and all — to be intergrated with smart contracts secured on Ethereum. One of the largest arguments against the IoT is that it will allow Big Tech to overstep into our everyday life. But with a “Blockchain of Things,” we can enjoy the conveniences of IoT without worrying about our smart-appliances being compromised on a centralized database. Instead of convenience OR privacy, we can have convenience AND privacy. That’s the world that zk-Rollups can promise us. 🐩 @0xfishylosopher 📅 31 October 2022 This information is purely educational and should not be taken as financial advice. 
All views expressed are those of the author’s and are not necessarily endorsed by Web3.com Ventures. References [1] https://blog.matter-labs.io/baby-alpha-has-arrived-5b10798bc623 [2] Adapted from https://pages.cs.wisc.edu/~mkowalcz/628.pdf [3] https://vitalik.ca/general/2021/01/05/rollup.html [4] https://www.coindesk.com/tech/2022/07/20/the-sudden-rise-of-evm-compatible-zk-rollups/ [5] https://cryptobriefing.com/the-race-scale-ethereum-zkevm-rollups/ [6] https://docs.zksync.io/dev/contracts/#porting-smart-contracts [7] https://vitalik.ca/general/2022/08/04/zkevm.html

zk-Sync v2.0 and the Future of zk-Rollups

Web3.com Ventures Original Research Analysis

0xFishylosopher

Introduction

“zk-Rollups” has probably been the hottest Web3 buzzword of the year. With zk-Sync’s v2.0 “baby alpha” mainnet launch in just the past few days, this excitement has reached its zenith [1]. But behind all these buzzwords, what do “zk-Rollups” really refer to? And where does zk-Sync come into play? In this article, I will endeavor to dive deep into the principles and practice of zk-Rollups, explicate the key technical hallmarks of zk-Sync v2.0 as a project, and explore the potential future implications of this long-awaited technology.

Principles of zk-Rollups

Why do we need zk-Rollups in the first place? Sure, Ethereum is great. But in its current state, the network is fundamentally a diseconomy of scale. As network activity increases, gas prices become prohibitively expensive, particularly if there is a surge of network activity all at once. As Ethereum has gained popularity and traction over the past few years, its limited scalability has become the network’s Achilles’ heel.

This is where “rollups” come in — Ethereum rollups are essentially a “plugin” that provides Ethereum with orders of magnitude more scalability, thus fixing its inherent diseconomy of scale. The intuition behind the idea is simple. Imagine you have 5 items that you need to carry from point A to point B. The “regular” way of doing so would be to carry Item 1, then Item 2, and so on, one after another. But this is obviously slow and cumbersome. A “rollup” is essentially “rolling” all 5 items into a single bag, thus allowing you to make a single trip instead of 5.
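The amortization at the heart of this analogy can be sketched in a few lines of Python. Every constant below is a made-up illustrative number, not a real Ethereum gas figure:

```python
# Toy model of rollup amortization. All constants are hypothetical
# illustrative numbers, not real gas costs.
L1_COST_PER_TX = 21000      # cost of posting one transaction directly on L1
BATCH_OVERHEAD = 50000      # fixed cost of posting one rolled-up batch
COST_PER_TX_IN_BATCH = 500  # marginal cost of each transaction inside a batch

def direct_cost(n_txs: int) -> int:
    """Cost of carrying n items one trip at a time."""
    return n_txs * L1_COST_PER_TX

def rollup_cost(n_txs: int) -> int:
    """Cost of rolling n items into a single bag and making one trip."""
    return BATCH_OVERHEAD + n_txs * COST_PER_TX_IN_BATCH

for n in (1, 5, 100):
    print(n, direct_cost(n), rollup_cost(n))
```

With a single item the batch overhead dominates, but already at 5 items the single “bag” is cheaper than 5 separate trips, and the gap widens from there.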

But there are two caveats:

How do we make sure the rollup can “fit” everything?

How do we make sure the rollup is not spoofed?

zk-Rollups are one of the two leading types of rollup technologies (the other being Optimistic Rollups), leveraging “zero-knowledge proofs” to solve these two issues. A zk-Rollup bundles a certain number of transactions together, does the computation on the L2, and submits both the state changes and a “validity proof” to a verifier on the L1, showing that the computations were done with integrity. This “validity proof” takes the form of a “Zero-Knowledge Proof,” a mathematical way of telling someone you know something without telling them what you know.
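The bundle–execute–prove loop can be sketched as below. This is only a toy sketch: a hash commitment stands in for the actual zk-SNARK, and the function names and transfer format are invented for illustration:

```python
import hashlib
import json

def h(obj) -> str:
    """Stable hash of any JSON-serializable object."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

def l2_execute(state: dict, batch: list) -> dict:
    """Apply a batch of simple transfers off-chain, on the L2."""
    new_state = dict(state)
    for tx in batch:
        new_state[tx["from"]] -= tx["amount"]
        new_state[tx["to"]] = new_state.get(tx["to"], 0) + tx["amount"]
    return new_state

def make_commitment(old_state, new_state, batch) -> str:
    """Stand-in for the validity proof. In a real zk-Rollup this would be a
    zk-SNARK; here it is just a hash the toy verifier can recompute."""
    return h({"old": h(old_state), "new": h(new_state), "batch": h(batch)})

def l1_verify(old_state, new_state, batch, proof) -> bool:
    """Toy L1 verifier. (A real verifier checks the SNARK succinctly,
    without re-executing the batch.)"""
    return proof == make_commitment(old_state, new_state, batch)

state = {"alice": 100, "bob": 20}
batch = [{"from": "alice", "to": "bob", "amount": 30}]
new_state = l2_execute(state, batch)
proof = make_commitment(state, new_state, batch)
assert l1_verify(state, new_state, batch, proof)
```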

A simple example of a Zero-Knowledge Proof is a code autograder (for CS homework). The autograder is a “verifier” that gives you a bunch of randomly generated test-cases, and you are a “prover” that must be able to pass all the test-cases to prove that you have the correct code. All the while, you don’t share your code with the autograder directly. And voila, you have just conducted a “Zero-Knowledge Proof,” proving that you know something without saying what you know. [2]
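A minimal Python sketch of this autograder-style interaction follows. The function names and the specific toy problem are invented; the sketch captures the interactive prover/verifier pattern, not actual cryptographic zero knowledge:

```python
import random

def reference_solution(x: int) -> int:
    # The autograder's own correct implementation (never shared either).
    return x * x + 1

def autograder(submitted, trials: int = 100, seed: int = 0) -> bool:
    """Verifier: probe the submission on random test cases. It only sees
    input/output behavior, never the submitted source code."""
    rng = random.Random(seed)
    return all(
        submitted(x) == reference_solution(x)
        for x in (rng.randint(-1000, 1000) for _ in range(trials))
    )

def my_solution(x: int) -> int:
    # Prover's private code: functionally correct, written differently.
    return (x - 1) * (x + 1) + 2

assert autograder(my_solution)          # correct code passes
assert not autograder(lambda x: x * x)  # wrong code is caught
```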

The code autograder above uses an “interactive Zero-Knowledge Proof,” where the autograder and code-provider directly “interact” with one another. In contrast, most zk-Rollups use a more mathematically complicated non-interactive proof (such as a zk-SNARK, or Zero-Knowledge Succinct Non-interactive ARgument of Knowledge), which saves both time and space compared to an interactive proof. While the technical details of zk-SNARKs are beyond the scope of this article, the underlying principle of test-case-passing is the same.

The holy grail of zk-Rollups is a Zero-Knowledge Ethereum Virtual Machine (zk-EVM) that allows developers to port over any Ethereum smart contract without modification onto a zk-Rollup chain. But this is hard. Because every “problem” requires different sets of “test-cases,” developing a “proof algorithm” that can solve every imaginable test-case is a technical bottleneck of Zero-Knowledge Proofs and zk-Rollups.

As Vitalik Buterin himself states:

In general, my own view is that in the short term, optimistic rollups are likely to win out for general-purpose EVM computation and ZK rollups are likely to win out for simple payments, exchange and other application-specific use cases, but in the medium to long term ZK rollups will win out in all use cases as ZK-SNARK technology improves. [3]

Thus, historically zk-Rollups have only been established technologies for application-specific use cases, where “test-cases” are well-defined and limited in scope. However, several projects are rapidly advancing towards the “castle on the hill” — a generic EVM-compatible zk-Rollup algorithm. [4]

zk-Sync v2.0

zk-Sync v2.0 is just one of many projects currently in play developing a zk-EVM (others include StarkNet, Polygon Hermez, and Scroll). Unlike zk-Sync v1.0, which required users to re-build large sections of their codebases to port from the EVM to zk-Sync, zk-Sync v2.0 lets programmers deploy their applications with little-to-no changes — or so zk-Sync would like to claim.

In practice, not all zk-EVMs are created equal. There is a distinct tradeoff between composability (how close it is to original EVM contracts) and performance (how fast the zk-Rollups will run) [6]. Within this tradeoff, zk-Sync chose to optimize entirely for performance, thus sacrificing composability.

From Vitalik Buterin’s perspective, there are four distinct types of zk-EVMs, ranging from Type 1 (fully equivalent to Ethereum itself, but slowest to generate proofs for) to Type 4 (compiling high-level-language source directly, fastest to prove but least compatible with existing tooling) [7].

As Vitalik states, zk-Sync v2.0 is currently a Type 4 zk-EVM: it compiles contracts written in Solidity and other high-level languages with its own compiler, whose output is distinct from EVM bytecode. Because zk-Sync has full control over the design of this compiler, it can aggressively optimize for speed and throughput. The cost is that some DApps and EVM debugging toolchains may be incompatible with zk-Sync v2.0. Essentially, zk-Sync is the same car shell as Ethereum but with the engine swapped out [5].

Indeed, in its developer documentation Matter Labs claims that while smart contract “read” operations can be integrated without any changes in code, smart contract “write” operations need “additional code” because of “fundamental differences between L1 and L2” [6]. Actually, this is slightly misleading. It is not so much due to a “fundamental difference” between L1 and L2 but more so because of the type of zk-Rollup that Matter Labs has decided to pursue — the Type 4 rollup. Because zk-Sync is fundamentally a Type 4 rollup that uses a different compiler and bytecode, this means that smart contracts have different addresses, and the debugger infrastructure that relies on the analysis of bytecode may not be able to work on zk-Sync v2.0 [7].
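The address point can be illustrated with a CREATE2-style derivation, where a contract’s address commits to a hash of its bytecode. This sketch is illustrative only: Python’s sha3_256 (NIST SHA-3) stands in for Ethereum’s Keccak-256 (the two differ in padding, so the addresses will not match mainnet), and the bytecode strings are placeholders:

```python
import hashlib

def sha3(b: bytes) -> bytes:
    # NB: hashlib.sha3_256 is NIST SHA-3, standing in here for Ethereum's
    # Keccak-256; real on-chain addresses use Keccak and would differ.
    return hashlib.sha3_256(b).digest()

def create2_address(deployer: bytes, salt: bytes, init_code: bytes) -> str:
    """CREATE2-style derivation: the address commits to the bytecode hash."""
    digest = sha3(b"\xff" + deployer + salt + sha3(init_code))
    return "0x" + digest[-20:].hex()

deployer = bytes(20)  # hypothetical deployer address
salt = bytes(32)      # hypothetical salt

evm_bytecode = b"\x60\x80evm-placeholder"      # what an EVM compiler might emit
zksync_bytecode = b"\x01\x02zk-placeholder"    # what a custom compiler might emit

addr_l1 = create2_address(deployer, salt, evm_bytecode)
addr_l2 = create2_address(deployer, salt, zksync_bytecode)
assert addr_l1 != addr_l2  # same source, different compiler => different address
```

Since the two compilers emit different bytecode for the same Solidity source, any derivation that commits to bytecode yields different addresses on the two chains.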

In the future, zk-Sync may add more native support for EVM bytecode, allowing the system to slowly transition to a Type 3 rollup that supports a wider range of these “edge cases.” But for zk-Sync’s Type 4 or Type 3 zk-Rollup to succeed against Polygon Hermez and Scroll’s Type 2 rollups, which essentially trade speed for wider compatibility, two preconditions must hold. First, only a tiny fraction of relatively unimportant projects can be incompatible with zk-Sync’s custom compiler. Second, zk-Sync’s execution speed must be qualitatively better than that of a Type 2 zk-EVM.

Unfortunately, I personally believe this is unlikely to be the case. Any advanced development ecosystem relies on a mature “scaffolding” infrastructure, including convenient, modularized debugging and testing tools. If, as Vitalik postulates, a large portion of EVM-native debugging tools will be unable to port to zk-Sync because of differences in bytecode, then zk-Sync will have to develop its own suite of testing and debugging tools. This is extra overhead that may ultimately hinder the adoption speed of zk-Sync as an L2 solution, compared to its more composable Type 2 zk-EVM competitors such as Polygon Hermez and Scroll.

The Future for zk-Rollups

With many competitive players in the battle for zk-EVMs, it is arguably only a matter of time before we see a fully-functional zk-EVM. But what’s next? A road is only useful so long as there are buildings along it; the long-term strength of a zk-Rollup comes from the projects using that solution.

Right now, DeFi, GameFi, and mobile applications are the main beneficiaries of zk-Rollup infrastructure. Both DeFi and GameFi are fundamentally economies of scale: they thrive in an environment where lots of people are using them. Mobile applications such as mobile wallets also open the floodgates to the mass consumer who is too lazy to use (or cannot afford) a desktop PC. Using zk-Rollups for these situations therefore makes a lot of sense.

But this is by no means the limit of zk-Rollups’ usefulness. If anything, this is just the beginning. zk-Rollups are to Ethereum what 5G is to the Internet. Just as 5G can enable a new world of IoT applications and systems, zk-Rollups may also open the floodgates to a “Blockchain of Things,” allowing the digital appliances of our physical world — fridges, watches, traffic lights and all — to be integrated with smart contracts secured on Ethereum.

One of the largest arguments against the IoT is that it will allow Big Tech to overstep into our everyday life. But with a “Blockchain of Things,” we can enjoy the conveniences of IoT without worrying about our smart-appliances being compromised on a centralized database. Instead of convenience OR privacy, we can have convenience AND privacy. That’s the world that zk-Rollups can promise us.

🐩 @0xfishylosopher

📅 31 October 2022

This information is purely educational and should not be taken as financial advice. All views expressed are those of the author and are not necessarily endorsed by Web3.com Ventures.

References

[1] https://blog.matter-labs.io/baby-alpha-has-arrived-5b10798bc623

[2] Adapted from https://pages.cs.wisc.edu/~mkowalcz/628.pdf

[3] https://vitalik.ca/general/2021/01/05/rollup.html

[4] https://www.coindesk.com/tech/2022/07/20/the-sudden-rise-of-evm-compatible-zk-rollups/

[5] https://cryptobriefing.com/the-race-scale-ethereum-zkevm-rollups/

[6] https://docs.zksync.io/dev/contracts/#porting-smart-contracts

[7] https://vitalik.ca/general/2022/08/04/zkevm.html

On the Move — A Critical Analysis of Aptos

Web3.com Ventures Original Research Analysis

0xFishylosopher

Introduction

With Aptos’ recent mainnet launch and its token launch on Binance and other major centralized exchanges, there’s no better time to write an exploratory analysis of this long-hailed project. The orphaned child of Meta’s attempts at creating the Libra cryptocurrency, Aptos carries on Libra’s ideal of being the blockchain that will onboard the next million users in crypto.

The over-arching focus of Aptos is to deliver a scalable, safe, and reliable blockchain [1]. To do so, Aptos relies on a new smart-contract language, Move, ostensibly a “safer” language than the current “gold standard” of Ethereum’s Solidity. Of course, Aptos is not the first L1 blockchain to market itself as “scalable, safe, and reliable.” Other chains, most notably Solana, have also tried to use this as their slogan in the past, with mixed success [2]. In this essay, I will endeavor to start from the basic attributes of the Move language, analyze Aptos’ technical fundamentals in the context of other similar projects, and deliver a critical assessment of the potential and limitations of this hot new chain.

Move Language

You can’t do an analysis of Aptos without an analysis of the Move Language. Indeed, this is arguably where most of Aptos’ potential comes from. Move was originally developed by Meta’s Libra Team (now named Diem) to provide a paradigmatically different smart contract programming model compared to other smart contract languages; instead of adopting a transaction-based approach, Move adopts a resource-based approach to defining smart contract models [3].

So what does all of this mean? Starting with Bitcoin, the vast majority of notable blockchains have adopted a “transaction-first” approach, where each entry in a block is a transaction (say, between Alice and Bob). A transaction is a relative object: for there to be a transaction, an Alice and a Bob address must both exist. To ensure that Alice is not “double spending” a coin in a transaction-first approach, we must check that her transaction does not conflict with any other transaction, a hassle that inevitably slows things down.

On the other hand, Move’s “resource-based” approach is dramatically different. Instead of recording lines of A to B transactions, Move records the object (or resource) changing hands and merely updates attributes accordingly. Importantly, resources are atomic entities — a resource X existing does not depend on any other resource existing, unlike in the case of transactions, where a transaction T depends on two prior agents, A and B’s existence. Thus, having a resource-based approach allows for a much more atomic, independent structure optimized for parallel execution.
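To make the contrast concrete, here is a minimal Python sketch of the resource-first view (an illustrative analogy, not actual Move code): the asset itself is the record, and a transfer simply reassigns its owner.

```python
class Resource:
    """A toy 'resource': it exists exactly once, so transferring it
    is just reassigning its owner field; no global scan over past
    transactions is needed to rule out double-spending."""

    def __init__(self, owner, value):
        self.owner = owner
        self.value = value

    def transfer(self, new_owner):
        # The resource is atomic and unique; ownership moves wholesale.
        self.owner = new_owner


coin = Resource("alice", 10)
coin.transfer("bob")
print(coin.owner)  # bob
```

Because each resource is independent, two transfers touching different resources never need to check each other, which is exactly what makes parallel execution natural.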

”Resources” in Move are essentially objects that have certain attributes. In technical jargon, they are implemented as structs (i.e. a container data structure) [4]. At this point, people with Ethereum programming backgrounds may raise a question: Solidity also has structs, as well as other data structures. What makes the structs in Move special? I believe the most important difference is that while structs play a relatively peripheral part in Solidity (being mostly used for clarity), structs are the basis of everything in Move. Moreover, Move’s structs have different “abilities,” including “copy”, “drop”, “store”, and “key,” which define whether a value can be duplicated, discarded, placed in storage, or serve as a key into global storage [5]. This sort of struct notation is completely different from Solidity, or indeed the vast majority of common programming languages (e.g. C/C++). However, these “abilities” are deeply intuitive. Some things you can “copy,” like a piece of plain data. Some things you can “drop,” or throw away. Other things, like a coin or a unique digital asset, must never be duplicated or silently discarded. Now consider the following official definition of a “resource” in Move:

“We often refer to struct values as resources if they cannot be copied and cannot be dropped.” — Structs and Resources, Move Language Documentation [4]

So “resources” are essentially unique “structs” that will always be there for you. How romantic.

Move’s complete redefinition of a “struct” allows it to use one data structure to succinctly define tokens, NFTs, smart contracts, and other digital assets in a clean, modular way [6]. This is something Ethereum’s EVM simply cannot match; anyone with experience developing on the EVM knows how frustrating it is to deal with structs (and nested structs) in Solidity.
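The ability system can be mimicked in any language. Below is a hedged Python sketch (the `MoveStruct` class and its mechanics are my own illustration, not the real Move implementation) showing how lacking the `copy` ability makes a value impossible to duplicate:

```python
import copy


class MoveStruct:
    """Toy model of a Move struct: a set of 'abilities' gates what
    operations are legal on the value."""

    def __init__(self, abilities, **fields):
        self.abilities = frozenset(abilities)
        self.fields = fields

    def __copy__(self):
        # Duplication is only allowed if the struct has 'copy'.
        if "copy" not in self.abilities:
            raise TypeError("struct lacks the 'copy' ability")
        return MoveStruct(self.abilities, **self.fields)


# Plain data: freely copyable, droppable, storable.
metadata = MoveStruct({"copy", "drop", "store"}, name="token-info")
copy.copy(metadata)  # fine

# A 'resource': no 'copy' and no 'drop', so it cannot be duplicated.
nft = MoveStruct({"key", "store"}, token_id=42)
try:
    copy.copy(nft)
except TypeError as e:
    print(e)  # struct lacks the 'copy' ability
```

In real Move these checks happen at compile time and in the bytecode verifier, not at runtime, which is what gives the guarantees their strength.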

Crucially, Move’s resource-based approach underwrites its security guarantees. As Move’s whitepaper notes, many languages represent assets only indirectly, which makes scarcity and access control hard to enforce. For example, while Ether’s scarcity is hardcoded and protected by the EVM, other ERC-20 tokens “do not inherit these protections, they must be careful not to introduce bugs that allow duplication, reuse, or loss of assets” [4]. Consider the classic reentrancy attack, which has long plagued EVM projects and often cost them millions of dollars [7]. Because of Move’s resource-based approach, this class of reentrancy attack is not possible in Move. Thus, the barrier to writing safe code in Move is arguably much lower than on the EVM, a crucial factor for mass adoption.
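A hedged Python sketch of why move semantics blocks the classic drain: the vault below (a hypothetical example, not real Move or Aptos code) pops the coin out of storage *before* making any external call, so a reentrant call finds nothing left to withdraw.

```python
class Vault:
    """Toy vault where a balance is a 'resource' that is moved out
    of storage before any external code runs."""

    def __init__(self):
        self.coins = {}  # address -> coin value (the 'resource')

    def deposit(self, addr, amount):
        self.coins[addr] = self.coins.get(addr, 0) + amount

    def withdraw(self, addr, callback):
        coin = self.coins.pop(addr)  # resource moved out FIRST
        callback(coin)               # external call happens after
        return coin


vault = Vault()
vault.deposit("alice", 100)

def attacker(_coin):
    # A reentrant withdrawal attempt: the resource is already gone.
    try:
        vault.withdraw("alice", lambda c: None)
    except KeyError:
        print("reentrancy blocked: resource already moved")

vault.withdraw("alice", attacker)  # prints the message, returns 100
```

In the EVM’s DAO hack, by contrast, the balance was decremented only after the external call, so the attacker could re-enter and withdraw again against stale state.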

Aptos — Technical Principles

Arguably Aptos’ biggest accomplishment is simply bringing Move to market as a production language. Indeed, much of Aptos’ design is taken directly from the Diem (formerly Libra) blockchain developed by Meta. Recall that Aptos is a chain with an emphasis on scalability and security. Two main technical implementations achieve this. The first is its consensus layer, namely the AptosBFT consensus mechanism. The second is its execution layer, the Block-STM execution engine. Let’s examine these in order.

First, AptosBFT is a Byzantine Fault-Tolerant consensus mechanism, which essentially means it can continue to guarantee security so long as no more than 1/3 of the nodes on the network are malicious actors [8]. This is standard practice: Bitcoin’s Proof of Work, Ethereum’s Proof-of-Stake, and others are all Byzantine Fault-Tolerant. So how does AptosBFT actually implement its consensus mechanism? Like other Proof-of-Stake chains, Aptos relies on “staked” validator nodes. It implements a variant of the HotStuff consensus algorithm, a modern BFT algorithm that offers a much cleaner communication protocol between nodes than earlier models (e.g. pBFT) by having nodes communicate with a “rotating leader” rather than with every other node [9]. While AptosBFT introduces several optimizations on top of HotStuff, most notably a “key rotation mechanism” (to enhance security) and reduced overhead (to enhance scalability), there are no major structural changes to HotStuff’s design [10].
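The 1/3 bound can be made concrete in a few lines of Python. For n validators, classical BFT protocols such as pBFT and HotStuff tolerate f = floor((n-1)/3) Byzantine nodes and require a quorum of 2f+1 votes to commit:

```python
def bft_threshold(n):
    """For n validators, return the maximum number of Byzantine
    nodes f that can be tolerated (n >= 3f + 1) and the quorum
    size (2f + 1) needed to commit a decision."""
    f = (n - 1) // 3
    quorum = 2 * f + 1
    return f, quorum


for n in (4, 7, 100):
    f, q = bft_threshold(n)
    print(f"n={n}: tolerates f={f} faulty nodes, quorum={q}")
# n=4:   f=1, quorum=3
# n=7:   f=2, quorum=5
# n=100: f=33, quorum=67
```

Any two quorums of size 2f+1 out of 3f+1 nodes overlap in at least f+1 nodes, so at least one honest node sits in the intersection, which is what prevents two conflicting blocks from both being committed.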

On the other hand, the Block-STM parallel execution engine is a far more original design, and arguably the project’s primary technical breakthrough [11]. Recall that resources are atomic: the changing hands of a resource X does not depend on any other resource. By executing the transactions within a block speculatively and in parallel, Block-STM exploits this independence; its authors report speedups of up to roughly 20x over sequential execution [11].

But how does this compare to Solana and other parallel execution engines? Solana’s approach is to require each transaction to declare up front all of the state (accounts) it will touch, so that the runtime can lock the necessary dependencies before execution and schedule non-conflicting transactions in parallel. Aptos, on the other hand, manages dependencies on the fly: it first runs all transactions “optimistically,” and, in case of conflict, re-runs only those specific transactions that failed validation [12].
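The optimistic strategy can be sketched in a few dozen lines of Python. This toy scheduler is a drastic simplification of the real Block-STM algorithm (the function names and structure are my own): it executes every transaction against a snapshot, then commits them in block order, re-executing any transaction whose reads were invalidated by an earlier commit.

```python
def execute(tx, state):
    """Run one transaction, recording the keys it read and wrote."""
    reads, writes = set(), {}

    def read(key):
        reads.add(key)
        return writes.get(key, state[key])

    def write(key, value):
        writes[key] = value

    tx(read, write)
    return reads, writes


def optimistic_run(txs, state):
    """Execute all txs 'optimistically' (in parallel on real hardware),
    then validate and commit in order, re-running only conflicts."""
    results = [execute(tx, state) for tx in txs]
    dirty = set()  # keys written by already-committed transactions
    for i, tx in enumerate(txs):
        reads, writes = results[i]
        if reads & dirty:                       # stale read detected
            reads, writes = execute(tx, state)  # re-execute
        state.update(writes)
        dirty |= writes.keys()
    return state


def pay_a_to_b(read, write):
    amount = read("a")
    write("b", read("b") + amount)
    write("a", 0)

def bump_c(read, write):
    write("c", read("c") + 1)

print(optimistic_run([pay_a_to_b, bump_c], {"a": 10, "b": 0, "c": 5}))
# {'a': 0, 'b': 10, 'c': 6}
```

Because the two transactions above touch disjoint keys, neither needs re-execution; if the second had read key "a" or "b", it would be detected as stale and re-run against the committed state.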

Let’s use a simple analogy. Ethereum is a one-lane road where all the cars must travel in a single line behind one another, whereas both Solana and Aptos are four-lane expressways. This obviously speeds things up. But the question is, how do the cars travel on the expressway? In Solana’s case, each car is assigned a specific lane in advance and always drives along it. This is very fast and efficient in the best case, but if the assignment goes wrong you get a very bad worst case. In Aptos’ case, the cars are first assigned lanes, but if that doesn’t work they are allowed to swerve into other lanes to prevent accidents. Therefore, even though Aptos may not run as fast as Solana in the best-case scenario, it is more reliable and has a better worst-case scenario [13].

Evaluation — Potential and Limitations

Aptos is currently the first and only major Move-based L1 chain to have launched a mainnet. As such, its mainnet release is an incredibly symbolic event; the money and media attention Aptos has attracted is not necessarily commensurate with its inherent technical innovation.

As mentioned above, the primary technical innovation Aptos itself has made is the Block-STM parallel execution engine. Most of its other technical aspects, including its consensus mechanism and its use of the Move language, are marginally improved legacy products inherited from Meta [14]. And it is unclear whether Block-STM will produce a qualitative difference in user experience between Aptos and Solana; if the experience is the same, users won’t care how it is implemented at the technical level.

As with any infrastructure project, Aptos’ long-run success depends on what projects choose to build on it. Currently, the projects in its ecosystem, such as the Liquidswap DEX, the Topaz NFT marketplace, and the Aptos Names domain service, are necessary products in any blockchain ecosystem but exploit nothing unique to Aptos’ design [15]. They are like the grocery store, school, and hospital of a small village: necessary to the village’s functioning, but giving it no character that cannot be found in any other village.

Looking back, part of Solana’s success can be attributed to projects such as StepN that were actually able to leverage Solana’s parallelism. These GameFi and SocialFi projects often involve large amounts of relatively isolated parallel data, making them uniquely suited to parallel execution engines such as Aptos’ and Solana’s, while also bringing a large consumer base and high transaction volume. Indeed, at one point StepN alone accounted for roughly 20% of Solana’s fee-paying users [16].

Thus, for Aptos to succeed as an infrastructure project, there needs to be a unique array of projects that greatly take advantage of Aptos’ parallel execution engine, projects that couldn’t be done on Ethereum or even Solana. With Aptos’ novel Block-STM design, these could potentially take shape in the form of a new class of SocialFi projects where users interact in small cliques (like group chats). Because these groups can be fluid and dynamic, while also somewhat independent of one another, they are much more suited to Aptos’ parallel execution engine than Solana’s or Ethereum’s. But of course, this is just an abstract speculation.

Nonetheless, Aptos’ mainnet launch is still an incredibly important and symbolic event in the crypto world. It represents the culmination of years of Meta engineers’ efforts to bring to life a qualitatively different paradigm for solving Ethereum’s scalability bottleneck. Move’s unique language features mean there will inevitably be DApps far better implemented in Move than in Solidity or other smart contract languages. But will those projects necessarily be built in the Aptos ecosystem? No one knows. Aptos may enjoy some first-mover advantage from bringing Move into production and tying its name to the language. But it may equally suffer a MySpace-like first-mover fate.

At least one thing is certain: Aptos may be the market’s first experiment with Move, but it certainly will not be the last [17].

🐩 @0xfishylosopher

📅 22 October 2022

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

References

[1] See the main page of Aptos Labs: https://aptoslabs.com/

[2] Introduction to Solana: https://www.forbes.com/advisor/investing/cryptocurrency/what-is-solana/

[3] Move Language Whitepaper: https://diem-developers-components.netlify.app/papers/diem-move-a-language-with-programmable-resources/2020-05-26.pdf

[4] Move Developer Documentation for Structs and Resources: https://github.com/move-language/move/blob/main/language/documentation/book/src/structs-and-resources.md

[5] Details on Struct Abilities: https://move-language.github.io/move/abilities.html

[6] See Move Language Tutorial: https://101blockchains.com/move-programming-language-tutorial/

[7] Reentrancy Attacks and The DAO Hack: https://blog.chain.link/reentrancy-attacks-and-the-dao-hack/

[8] Details of Byzantine Fault Tolerance: https://decrypt.co/resources/byzantine-fault-tolerance-what-is-it-explained

[9] HotStuff Consensus Model: https://hackernoon.com/hotstuff-the-consensus-protocol-behind-safestake-and-facebooks-librabft

[10] AptosBFT Consensus: https://blog.pontem.network/aptosbft-all-you-need-to-know-about-the-bft-consensus-in-aptos-ff4cf22c7a80

[11] See Block-STM technical paper: https://arxiv.org/abs/2203.06871

[12] Aptos vs Solana Execution Engine Comparison: https://www.gofyeo.com/post/aptos-vs-solana-a-technical-comparison-part-1

[13] Aptos worst-case analysis as O(n²), Solana as O(n⁎). Details: https://antiape.substack.com/p/cf40447d-d674-4454-9e09-f003517de6de?s=r

[14] See Aptos Whitepaper: https://aptos.dev/assets/files/Aptos-Whitepaper-47099b4b907b432f81fc0effd34f3b6a.pdf

[15] Aptos Ecosystem: https://coinmarketcap.com/alexandria/article/what-is-aptos-the-ultimate-guide-to-the-aptos-ecosystem

[16] From Messari Crypto’s Twitter: https://twitter.com/MessariCrypto/status/1567880625218019333

[17] See Sui, another Move-based Language: https://sui.io/

Decentralized Identity In Web3.0

What does decentralized identity bring to Web3.0?

Introduction

In our daily lives, we constantly interact with different identities: email addresses, social media accounts, passports, and more. This data is generally stored on centralized servers and controlled by third-party companies, so the owners of these identities have no control over it. Worse, if the organization bans the owner from its services, or the organization ceases to exist, the stored data is lost.

Nowadays, more people are paying attention to the privacy and security of their personal data. Decentralized Identity, a.k.a. DID, has developed rapidly in the past few years, benefiting from the emergence of blockchain technology and from COVID-19’s impact on offline authentication. According to a report from KBV Research, the global decentralized identity market is expected to reach $8.9 billion by 2028, growing at a 78.5% compound annual growth rate over the forecast period. The commercial potential of DID products deserves more attention.

In the development of DID, there is still debate around its specific definition. Currently, the most accepted standard for DID is made by W3C.

The three core elements in this system are:

DID: A Decentralized Identifier is a URI composed of three parts: the scheme “did”, a method identifier, and a unique, method-specific identifier specified by the DID method. A DID resolves to a DID document.

DID Document: DID documents contain information associated with DID. They typically express verification methods, such as cryptographic public keys, and services relevant to interactions with the DID subject.

Verifiable Data Registry: System that supports recording DID and returning data necessary to produce DID documents. Examples: Distributed ledgers, and peer-to-peer networks.
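The three-part DID syntax is easy to see in code. Here is a minimal Python sketch (the example DID is illustrative, in the style used by the W3C spec):

```python
def parse_did(did):
    """Split a DID URI into its three parts: the fixed scheme 'did',
    a method name, and a method-specific identifier."""
    parts = did.split(":", 2)
    if len(parts) != 3 or parts[0] != "did" or not parts[1] or not parts[2]:
        raise ValueError(f"not a valid DID: {did!r}")
    scheme, method, identifier = parts
    return scheme, method, identifier


print(parse_did("did:example:123456789abcdefghi"))
# ('did', 'example', '123456789abcdefghi')
```

The method name (here "example") tells a resolver which DID method, and hence which verifiable data registry, to consult to obtain the corresponding DID document.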

With this DID system, users do not reveal their identity when authenticating. When a DID is created, a third-party agency writes an identity credential into the DID document based on information provided by the user. During authentication, the DID owner only needs to grant the verifier permission to access a specific DID document, without revealing their real-world identity.

Why is DID important to Web3.0?

In today’s Web3.0, the identity certificate we speak of is generally the wallet address. However, since creating a wallet requires no additional verification, a large number of “fake” identities get created. In “Decentralized Society: Finding Web3’s Soul”, Vitalik also points out that the lack of a native Web3 social identity prevents Web3.0 applications from moving beyond Web2.0 business models to develop natively Web3.0 products.

DeFi

This problem greatly impacts the development of DeFi projects: they cannot even provide low-collateral lending services as traditional finance does, which reduces capital efficiency in the crypto market and runs against the idea of providing financial services to people ignored by traditional finance. Due to this lack of liquidity and capital efficiency, even blue-chip DeFi projects perform poorly during the bear market.

Reducing or eliminating collateral will be key to the next wave of DeFi growth, and a mature, effective DID system is a prerequisite for achieving this goal.

DAO

DAO is another concept that has gradually matured with the development of blockchain technology. DAOs typically use token-based governance systems in which voting weight increases with token holdings. Such a system can work stably in an isolated environment but is quite inefficient when DAOs need to cooperate or build a larger ecosystem together.

By recording contributions in DID identities, contributions become portable between DAOs, allowing good contributors to work across different DAOs at the same time. In addition, Web3.0 projects can easily find strong candidates through DID identities instead of relying on less efficient Web2.0 recruitment methods.

NFT

The NFT boom last year led a new wave in the crypto market and helped push $ETH to a new ATH in November 2021. NFTs provide artists with a new monetization channel and enable them to build their communities independently. However, without a unified DID system, plagiarism and fraud significantly damage confidence in the NFT market. It also makes it difficult for artists to communicate effectively with their fans.

With DID infrastructure, creators can protect their rights as content creators, and buyers and sellers in the secondary market can verify the provenance of their collections. It can also deter speculation within the community and enable exclusive content for fans.

Airdrops

Airdrops have always been one of the most popular features in crypto. Projects use airdrops to bootstrap their first user base and community engagement. Such a model naturally attracts speculators: airdrop farmers grind airdrops by creating numerous wallets and interacting with the project. To deter this abuse, projects keep raising the barrier to qualifying for airdrops, but the overly complex quests also prevent new crypto users from joining airdrop campaigns.

With DID authentication, a project can ensure it receives genuine attention during airdrop events while lowering the barrier for users to participate.

How does DID integrate with the current crypto market?

On-chain Credit protocol

The credit model is the cornerstone of traditional finance, as it enables borrowers to receive loans from banks or institutions without having to post collateral. Some DeFi protocols have started to establish credit systems within their ecosystems.

Goldfinch is a lending protocol that connects real-world businesses with the crypto market. The protocol is based on a principle called “trust through consensus”: borrowers demonstrate creditworthiness through the collective assessment of other participants rather than through their crypto assets. Although this credit system is not fully on-chain, since it is combined with off-chain legal agreements, it still expands the boundaries of DID applications in DeFi.

On-Chain Identity Aggregation

The demand for integrated on-chain identities is expected to increase as large numbers of Dapps have been developed in recent years. Interoperability of the personal identity data stored in different Dapps is fundamental to pushing the whole ecosystem forward. Besides interoperability, a unified data structure or identifier is also crucial for the multi-chain ecosystem of the future. With an identity aggregator, protocols can serve users by reading identity data from a single identifier.

Litentry is a DID aggregator on Polkadot that links user identities across multiple decentralized networks. It has established an identity-based network that stores user activity data and handles identity authentication; protocols can access this data after receiving permission from its owners.

Future of DID

Although Decentralized Identity has come a long way, its adoption in Web3.0 is still in its early stages. Just as KYC is the entrance to CEXs, DID infrastructure will become the entrance to Web3.0. This will inevitably put it in direct competition with the current entrance, wallets, but DID itself can hardly capture value, since the value of user identities is determined by the ecosystem. It is more likely that multi-chain wallets will integrate DID systems in the near future and become identity aggregators.

It is still hard to predict how the DID system will take off, but as an indispensable product for the future of Web3.0, it will bring new possibilities to the crypto market.

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

🐩 @SoxPt50

📅 10 October 2022

Link:

[1] https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4105763

[2] https://mirror.xyz/shreyjain.eth/TyBzMOegl3rMNxpAFoJ36MjE0pGfdLcrVCBgy-x3qS8

[3] https://www.nasdaq.com/articles/the-risks-and-benefits-of-on-chain-credit-protocols

CC0, More Possibilities for NFTs

NFT and IP

The NFT market boomed over the past year, reaching an all-time-high (ATH) monthly trading volume of $6B in January 2022.

NFT trading platforms like LooksRare and Magic Eden developed rapidly and threatened OpenSea's dominance. Nansen's April reports also indicated that the NFT market grew faster than the broader crypto market in the first quarter of 2022.

However, the market quickly reversed after Luna's collapse and Celsius's withdrawal freeze. Liquidity in the crypto market shrank sharply after the U.S. Federal Reserve raised its benchmark interest rate by 75 basis points, forcing holders to sell their NFTs to regain cash flow. NFT prices dumped in step with the sharp drop in cryptocurrency prices, and blue-chip NFTs like BAYC and Azuki fell over 50% from their ATHs.

Beyond falling prices, the future direction of the NFT market is still foggy. NFTs were initially created to record creators' copyrights in the digital world using blockchain technology. They provide creators with a new distribution channel that connects them directly with their fanbases and communities.

Compared with a mere screenshot of the same image, the value of any NFT comes from the rights it represents. However, when we ask why one NFT is more expensive than another, we are really talking about its cultural gene, community, and sense of identity. In general, IP is the core value of NFTs.

Since NFTs have high market potential, the debates surrounding IP rights never stop. Over the past few years, NFT projects have taken different approaches to building brands, communities, and content through their IP. Let's take a look at CryptoPunks and BAYC.

Before Yuga Labs acquired the CryptoPunks IP from Larva Labs, CryptoPunks holders were only allowed to use their NFT as a profile picture. Bored Ape holders, on the other hand, were allowed to monetize the Apes they own. BAYC's looser copyright management gave rise to derivatives like Bored Wine, which lets collectors turn their Apes into labels on vintage wine. By giving the community free space for content creation without messy copyright complications, BAYC stimulates creativity, and the project gains an outward reach that expands its community.

Some other NFT projects have gone even further by completely removing copyright protection from their projects. The recent rise of sudoswap, which removes royalties, also suggests that the days of making money from copyrights may be over.

What CC0 brings to NFTs

The Creative Commons Zero license, known as CC0, means "no rights reserved" on intellectual property (IP). A CC0 license allows content creators to make derivative works and profit from NFTs without worrying about legal consequences, which maximizes creative freedom.

The Nouns project, launched in August 2021, is considered the first influential CC0 NFT project. It was an experimental counterpoint to CryptoPunks' copyright-protection approach, bootstrapping identity, community, governance, and a treasury that the community can use. The use of its copyright is determined by NounsDAO, composed of its collectors, which has approved over 130 derivative projects, including a collaboration with Budweiser, a documentary, and even a Nouns coffee brand. The experiment has been a success: it has secured about $45 million in its DAO treasury.

Many other NFT projects have adopted the CC0 license since the Nouns project, but the outcomes are quite different.

The Goblintown project, released with "No roadmap. No utility. No Discord. CC0" stated on its website, took over OpenSea's trading-volume ranking on June 2, 2022, when it and its derivatives achieved over 43.7% of the platform's total trading volume.

Goblintown challenges the creator-led NFT community model, as it completely eliminates the creator's influence. The CC0 license allows the Goblintown community to be fully autonomous and independent from the original NFT collection itself. It enhances two important attributes of NFTs: the cultural gene and meme-ability.

As we just stated, Goblintown's CC0 license drives expansions of the original project that create a more engaged community. Additionally, as new derivatives keep appearing, external attention flows back to the original content and strengthens its position in the collective consciousness. This cycle creates a flywheel effect: each new derivative raises the value of the original and feeds back into further creation. Other famous NFT projects that benefit from CC0 include mfers, Loot, and CrypToadz.

The CC0 license does not necessarily benefit collectors, especially for projects that switch to CC0 after launch. Moonbirds founder Kevin Rose announced the project was going CC0 on August 4, 2022. Though some collectors welcomed the surprise move, many were furious: it was seen as a betrayal, since the developers gave away the IP rights without consulting the community.

After the Moonbirds announcement, the well-known collector and NFT thought leader Pranksy pointed out a crucial distinction between a project that goes CC0 from launch and one that changes its mind afterward. Should the developers be responsible for collectors' losses after the switch? Does taking the copyright away from holders reduce the community's confidence?

These debates lead to a deeper question: whether, and when, an NFT project should go CC0.

Pros and Cons

There is no universal answer to this question, as traits vary between NFTs. Still, we can find some shared characteristics among NFT projects that go CC0.

Nouns, CrypToadz, Moonbirds, and Blitmap are all pixelated NFT collections. In other words, their low-fidelity artwork lowers the barrier for third parties to create derivatives. At the same time, these images lack the detail to express a specific cultural gene or other distinguishing characteristics on their own.

Compatibility and extensibility are key for CC0 NFTs. Collectors can use the NFT anywhere they want, which incentivizes community expansion, and third-party creators can capture most of the IP's benefits with minimal creative barriers. These features are aimed not only at on-chain derivatives but also at a broader brand vision for NFTs: Loot bags are fundamental primitives that make sense for all types of adventure games, and the Nouns glasses can be fitted onto almost any brand.

These same features can become pain points for investors who aim to build commercial products on top of the NFTs they hold, since holders of a CC0 NFT cannot exclude others from creating content around the same IP. Investors must choose wisely when their motivation for purchasing an NFT is not aligned with the value a CC0 license brings.

For now, the developers of CC0 NFT projects mostly profit from royalties generated on secondary markets like OpenSea. CC0 powers a prosperous derivatives ecosystem that brings attention to the original collection and boosts its trading volume. However, the recent rise of sudoswap, which refuses to pay royalties to creators, may change this model.

Such a change seems inevitable as the NFT and crypto markets expand, because not every collector is willing to pay royalties, especially investors who purchase NFTs for commercial purposes. The current revenue model of CC0 NFT projects becomes unsustainable for developers as the NFT market grows bigger: CC0 strengthens the IP overall but shreds the developers' profit from it.

The CC0 license is not a general solution to the struggles of the current NFT market. It only provides an alternative way for creators to bootstrap their projects. We expect to see an NFT-native license emerge alongside the development of blockchain technology and NFT communities.

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

🐩 @SoxPt50

📅 26 August 2022

Link:

[1] https://nftnow.com/news/goblintown-derivatives-have-taken-over-openseas-volume-chart/

[2] https://newsletter.banklesshq.com/p/the-bull-case-for-cc0-nfts?s=r&triedSigningIn=true

[3] https://a16zcrypto.com/cc0-nft-creative-commons-zero-license-rights/

Introduction to Tokenomics

Introduction

It has been over a decade since the Bitcoin whitepaper was released, and blockchain technology has evolved impressively since then. However, the development of the on-chain economy and its business implications has not kept up with the technology.

Designing well-functioning tokenomics is among the most important parts of any on-chain protocol, as it defines the pattern of a cryptocurrency's issuance and supply within the system. Many famous projects have risen and fallen because of their tokenomics.

Before we dive into more discussions about tokenomics, let’s first look at the token itself and how it performs in the system.

From the ground level, a token can be defined as any digital asset issued on and utilizing a blockchain. In this paper, however, I will refer only to cryptocurrencies or digital assets that run on top of a public chain with specific utilities, which excludes base-layer currencies like $BTC and $ETH.

Tokens of this type are worthless without the protocols they stand for. A token's value is determined only by what it represents, but its role can be diverse, from the right to a reduced exchange fee on a DEX to membership representation in a SocialFi project. Thus a general definition of a token can hardly be given without considering the tokenomics behind it.

Tokenomics, or digitized value, emerged because blockchain technology introduced the idea of digital scarcity, creating a new type of digital economy. This economic system is coded by token designers using smart contracts within a self-governed framework, and they generally define the token economy from two major aspects: Monetary Policy and Incentive Behaviors.

Monetary Policy

In traditional finance, monetary policy is used by the central bank to control the overall money supply and promote economic growth and employment. It is about controlling the amount of money available in the economy and the channels through which money is created. Policy can be either expansionary or contractionary depending on the economy's performance.

In tokenomics, monetary policy refers to token issuance and whether the token is deflationary or inflationary. Projects control it with pre-coded smart contracts, mainly from two sides: token burnability and token supply.

Burnability determines whether tokens can be burned to create artificial scarcity and adjust the circulating supply.

Token supply is fundamental to tokenomics, as it determines token dynamics. Similar to the velocity of money in traditional finance, token velocity indicates whether an ecosystem is expanding or contracting: it measures how frequently a token changes hands over a set period.
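By analogy with the velocity of money, token velocity is simply the total transaction volume in a period divided by the average circulating supply. A minimal sketch, with hypothetical figures:

```python
def token_velocity(transaction_volume: float, avg_circulating_supply: float) -> float:
    """Token velocity, by analogy with the velocity of money:
    total transaction volume in a period / average circulating supply."""
    if avg_circulating_supply <= 0:
        raise ValueError("circulating supply must be positive")
    return transaction_volume / avg_circulating_supply


# Hypothetical figures: 50M tokens changed hands this month against an
# average circulating supply of 10M tokens.
print(token_velocity(50_000_000, 10_000_000))  # 5.0: each token moved 5 times on average
```

A rising ratio suggests an expanding, transaction-heavy ecosystem; a falling one suggests holders are sitting on the token.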

Typical classifications are:

Time-based: All tokens are minted at the token generation event (TGE) and distributed afterward according to a pre-defined schedule. This supply strategy is implemented directly through the smart contracts that control the ecosystem. It offers the best transparency but the least flexibility, so it requires a carefully designed ecosystem that can withstand even extreme cases. It is generally used by protocols that treat the TGE as a fundraising opportunity.

Demand-driven: Tokens are created, distributed, or burned according to current demand, which is monitored by a pre-coded algorithm in smart contracts. In this way, supply levels are artificially aligned with the project's progression. This strategy is generally used by projects that intend to control their token's price range and sustain demand. However, a pre-defined algorithm can fail under boundary cases, especially when cryptocurrency volatility is high.

Discretionary: Tokens are created, distributed, or burned without a pre-defined schedule, as determined by the needs of the project's DAO. The DAO passes proposals and mints and distributes tokens according to the project's progression or to deal with unanticipated events. This strategy sacrifices some transparency and trust to gain flexibility and risk resistance.
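As a concrete sketch of the time-based strategy, a linear vesting schedule with a cliff can be encoded directly in the distribution contract. The allocation, cliff, and duration below are hypothetical:

```python
def vested_amount(total_allocation: int, months_elapsed: int,
                  cliff_months: int = 6, vesting_months: int = 24) -> int:
    """Linear vesting with a cliff: nothing unlocks before the cliff,
    then the allocation unlocks pro rata until the schedule completes."""
    if months_elapsed < cliff_months:
        return 0
    if months_elapsed >= vesting_months:
        return total_allocation
    return total_allocation * months_elapsed // vesting_months


# Hypothetical 1,000,000-token allocation checked at a few points in time:
for month in (3, 6, 12, 24):
    print(month, vested_amount(1_000_000, month))
# 3 -> 0, 6 -> 250000, 12 -> 500000, 24 -> 1000000
```

Everything here is fixed at deployment, which is exactly the transparency-for-flexibility trade-off the time-based strategy makes.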

The discretionary strategy provides the most flexibility and risk resistance, provided its governance system works as intended. However, there are always conflicting interests among the parties in an ecosystem: investors may want their tokens' value to increase, while users benefit from lower inflation and reduced volatility. A governance system can easily wreck the tokenomics by passing malicious proposals, so the development of tokenomics also depends on the maturity of DAO governance.

Recent on-chain protocols have more compound token economies that can hardly be defined by a single characteristic, but they still exhibit the features of the types above.

StepN is considered one of the hottest properties in the GameFi track at the moment. It is a move-to-earn (M2E) project that pays its users to walk, with a tokenomics system broadly similar to Axie Infinity's. Players are rewarded in tokens for the distance they walk each day; to play, users must first buy a STEPN NFT. Projects of this type initially offer attractive token rewards that gradually drop in value as more users farm them. The major issue with this kind of tokenomics (before StepN) was that it is unsustainable in the long run, because the source of yields and rewards is generated without real value.

StepN inherits Axie Infinity's dual-token system. GST, the primary reward token, has an unlimited supply so it can be minted to meet unpredictable future demand. Its inflationary pressure is partially hedged by the price of StepN's NFTs, which creates an entry barrier; because players are willing to stay for longer periods, inflation can be held off. GMT, the ecosystem's governance token, has a limited supply and a time-based vesting schedule.

StepN's token economy combines the three classifications of tokenomics mentioned above. GST can be minted at the team's discretion to extend the reward pool as the user base grows, so GST keeps its incentive power; this is demand-driven tokenomics with a discretionary element. GMT, meanwhile, follows time-based tokenomics.
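The contrast between the two supply policies can be sketched in code. This is an illustrative simplification, not StepN's actual minting algorithm; the reward figures are hypothetical, and only the 6-billion cap reflects GMT's stated total supply.

```python
class RewardToken:
    """GST-style: uncapped, demand-driven supply that expands with activity."""

    def __init__(self) -> None:
        self.supply = 0.0

    def mint_daily_rewards(self, active_users: int, reward_per_user: float) -> float:
        """Mint rewards in proportion to activity (demand-driven issuance)."""
        minted = active_users * reward_per_user
        self.supply += minted
        return minted


class GovernanceToken:
    """GMT-style: fixed cap with scheduled unlocks (time-based issuance)."""

    MAX_SUPPLY = 6_000_000_000  # GMT's stated total supply cap

    def __init__(self) -> None:
        self.circulating = 0

    def unlock(self, amount: int) -> None:
        """Release tokens from the reserve; the cap can never be exceeded."""
        if self.circulating + amount > self.MAX_SUPPLY:
            raise ValueError("unlock would exceed the fixed supply cap")
        self.circulating += amount
```

The reward token's supply grows with usage, while the governance token only redistributes a fixed total, which is why the reward token carries the inflationary pressure in a dual-token design.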

Additionally, token transaction velocity is affected by technical constraints, and improvements such as ZK-rollups and Ethereum's proof-of-stake Merge can unleash greater capabilities for tokenomics.

Incentive Behaviors

As we just mentioned, conflicting incentives in an ecosystem can easily lead to a mess. The design of incentivized behaviors is another important part of tokenomics.

Incentives in any on-chain ecosystem can be classified into two types: passive and active. Passive incentives are attributes of the token that enable incentivized behavior and stand as prerequisites for certain activities in the network; unlike active incentives, they do not profit holders directly. Active incentives, on the other hand, are token properties that let holders profit from certain activities in the ecosystem; more precisely, they motivate holders to perform the incentivized behaviors. Notably, negative incentives, such as penalties for holding certain tokens/digital assets, can be considered alternatives to active incentives.

Passive Incentives

Medium of Exchange: The same function as traditional money; some tokens can be used to pay for services or other assets in the protocol.

Governance Rights: For the vast majority of current protocols, governance rights are a common characteristic of their tokens. Holding or staking tokens is a prerequisite for voting on proposals that can greatly influence the performance of the protocol, which is fundamental to the DAO governance system.

Governance rights can also act as an active incentive when holders can profit directly from certain proposals. A clear boundary should be established for a well-defined governance token, or governance attacks can easily destroy the protocol.

Active Incentives

Accessibility: Tokens grant holders the right to access services and content on the platform. This is the most common token incentive for on-chain protocols, and its value is greatly influenced by the performance of the protocol itself.

Speculation: Token holders expect the price of their tokens to rise as the protocol develops. In this case, tokens act as stores of value that holders are willing to keep, which greatly reduces token velocity within the system.

Profit Sharing: Token holders can receive a share of the revenue generated by the protocol by staking/holding their tokens. This incentive can come in two forms: the issuer mints more tokens, or unlocks tokens from its reserve, to pay holders, increasing the circulating supply; or it distributes a portion of its profits in another cryptocurrency, preserving the scarcity of its own token.
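The difference between the two profit-sharing modes can be made concrete with a toy calculation (all numbers below are hypothetical, not drawn from any real protocol):

```python
# Toy comparison of the two profit-sharing modes described above:
# (a) minting new tokens to pay holders, which dilutes the supply, versus
# (b) distributing revenue in another currency, which keeps supply fixed.

def inflationary_payout(supply: float, mcap: float, reward_tokens: float) -> float:
    """Mint `reward_tokens` new tokens; market cap is assumed unchanged."""
    new_supply = supply + reward_tokens
    return mcap / new_supply  # post-payout token price

def external_payout(supply: float, mcap: float) -> float:
    """Pay holders in another currency; supply and price are untouched."""
    return mcap / supply

price_a = inflationary_payout(1_000_000, 2_000_000, 100_000)
price_b = external_payout(1_000_000, 2_000_000)
print(round(price_a, 3))  # 1.818 -- each token is diluted
print(price_b)            # 2.0   -- scarcity preserved
```

Under the (strong) assumption of an unchanged market cap, the inflationary payout spreads the same value over more tokens, which is exactly the dilution pressure the text describes.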

The passive and active incentives listed above are generally combined to create current tokenomics, and other incentives also shape the behavior of token holders in real cases. An on-chain protocol's ecosystem is generally closed compared with traditional finance, and only designated tokens can capture or generate value within it. Accordingly, the expected incentivized behaviors of a token in a specific protocol should be kept limited, so that together with the token supply they create controllable token dynamics.

Conclusions

Tokenomics can reflect economic and social costs when evaluating on-chain projects. I would define tokenomics as a socio-technical system: designing well-functioning tokenomics means shaping effective incentive schemes according to both the financial objectives of the protocol and the expected behavior of its community of token holders.

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

🐩 @SoxPt50

📅 4 August 2022


Security and the Future of DeFi Governance

DeFi is one of the most revolutionary applications of blockchain technology.

The development of conventional financial instruments over the past decade increased the number of services investors can choose from, providing them more financial autonomy. However, investors' governance rights barely improved, and the value of their assets remained determined by institutions.

A paradigm shift started when decentralized finance first boomed in 2020. DeFi, a combination of blockchain technology, algorithmic automation, and community governance, removed the intermediaries from the financial system and allows investors to interact with their funds directly.

DeFi is decentralized not only because it utilizes blockchain technologies and peer-to-peer transactions, but also because it distributes the governance rights to a community of “shareholders”.

Currently, the most popular governance pattern implemented in DeFi protocols is on-chain governance with a single type of governance token. In this system, all protocol changes are voted on directly on the blockchain. Proposals are smart contracts, and they are executed once enough governance tokens are locked in their support.

In this system, governance tokens are initially distributed to users who interact with the protocol for a certain period before the token is listed on exchanges, which incentivizes platform users and helps protocols pass the cold-start stage.
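The vote-counting logic behind this pattern can be sketched in a few lines (this is an illustrative model, not the contract code of any specific protocol; the quorum figure is made up):

```python
# Minimal sketch of token-weighted on-chain voting: a proposal executes once
# the governance tokens locked in its support reach a quorum threshold.

QUORUM = 400_000  # tokens required for a proposal to pass (assumed value)

def proposal_passes(locked_votes: dict[str, int]) -> bool:
    """Each voter's weight is simply the number of tokens they lock."""
    return sum(locked_votes.values()) >= QUORUM

votes = {"alice": 150_000, "bob": 120_000, "carol": 140_000}
print(proposal_passes(votes))  # True: 410,000 >= 400,000 quorum
```

This also makes the attack surface discussed later obvious: anyone who can accumulate or borrow `QUORUM` tokens can pass an arbitrary proposal, malicious or not.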

Value of Governance Token

A governance token captures the value of a DeFi protocol in several ways; its value is determined by governance rights, speculation/expectation, and incentives.

Let’s take a look at the rights of governance tokens in MakerDAO, Uniswap, and Compound.

From the governance perspective, the governance token generally has three major value captures.

Cash Flow: Most DeFi projects regularly distribute a portion of their revenue to a treasury in their ecosystem, which serves as an insurance fund or for dividend payouts. Holders of governance tokens can set the dividend rate they receive by proposing a new on-chain poll.

Protocol Changes: The intrinsic value of a governance token is its power to determine the future of a protocol. Token holders can vote on smart contract modifications, business direction changes, and many other factors that will fundamentally influence the performance of a protocol.

Future Token Distributions: This type of value is usually related to yield farming/liquidity mining services in the DeFi protocol. Governance token holders can determine the rate of minting/vesting of new tokens that will be distributed via yield farming. This activity will influence the future engagement of a DeFi protocol.

Risk of DeFi Governance

The weight carried by a vote is proportional to the number of governance tokens the voter holds. This logic makes sense in general: the more tokens a voter holds, the less likely they are to initiate malicious proposals against their own interests.

However, the skyrocketing TVL of DeFi protocols makes governance attacks highly profitable, and as DAOs grow increasingly popular, such attacks are becoming more common.

Below are several examples of on-chain governance attacks.

Yam Finance

Yam Finance, a DeFi protocol that rose during the DeFi summer of 2020, prevented a governance attack in July 2022 that sought control of its treasury, which contained $3M worth of cryptocurrencies.

On July 7th, the attackers initiated a governance proposal via an internal transaction, hiding an unverified smart contract inside it. The malicious contract would have ceded control of the reserve to the attackers once the proposal executed.

The proposal reached quorum and was in danger of passing before the Yam Finance team was able to freeze it.

Build Finance DAO

Build Finance DAO is a decentralized investment organization that funds other projects through its $BUILD token; its best-known investment is MetricExchange. Although the slow pace of investments left the community inactive, the DAO still had about $500K worth of cryptocurrencies locked in its treasury.

Build Finance DAO has a peculiar governance mechanism that allows the owner of a specific smart contract to mint $BUILD tokens and control its treasury. This vulnerability made Build Finance DAO a prime target for governance attacks.

On February 10th, 2022, attackers quietly acquired enough governance tokens to pass a malicious proposal granting them control of the treasury, without the community noticing. After taking control of the protocol, the attackers minted and sold various tokens from the treasury, leveraging smart contracts to drain funds from liquidity pools on other DEXs that listed $BUILD and $METRIC.

According to The Block’s estimates, the attackers gained the equivalent of 160 $ETH in this governance attack.

Mirror

Mirror Protocol is an on-chain synthetic-assets protocol on the Terra network.

Last Christmas, Mirror was attacked on the Terra network. The attacker disguised a malicious governance proposal as a request for cooperation with the Solana network, when it would in fact have sent $64.2M worth of $MIR tokens to the attacker. Alongside this main attack, the attackers initiated multiple other malicious proposals attempting to drain the treasury and distract the Mirror team.

The Mirror team stopped the attack by initiating a new proposal to alert unwary users about the scam.

The development of DeFi governance systems has not kept pace with their growing TVL. Most protocols rely on centralized, manual protection by the team against governance attacks rather than on a decentralized system.

When voting power is calculated simply from the number of governance tokens a voter holds, the protocol carries higher systemic risk than a governance structure with a staking mechanism. Lacking a quantitative method for analyzing governance proposals, governance tokens can easily be misvalued in a bear market. The gap between a protocol's TVL and the market cap of its governance token widens when the token price is dumped, sharply reducing the cost of a governance attack and attracting speculators to mount one.

The Future of DeFi Governance

It is often said that a bear market is a good time for building, and now is a good time for builders to explore the future of DeFi governance structures.

The veToken model, which originates from Curve Finance, is an innovative governance system for DeFi protocols. It introduces a veToken that replaces the protocol's native token for governance purposes: users lock their native tokens for a chosen period to receive a corresponding amount of veTokens. Since veTokens are non-transferable and have zero intrinsic economic value, Curve separates its utility token from governance and reduces the systemic risk of governance attacks.
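A simplified sketch of Curve's lock formula illustrates how voting power scales with commitment (this ignores Curve's block-by-block linear decay mechanics and epoch rounding; the amounts are illustrative):

```python
# Sketch of a ve-style lock: voting power is proportional to both the amount
# locked and the remaining lock time, reaching 1:1 only at the maximum lock.

MAX_LOCK_YEARS = 4.0  # Curve's maximum lock period for veCRV

def ve_balance(amount: float, lock_years_remaining: float) -> float:
    capped = min(lock_years_remaining, MAX_LOCK_YEARS)
    return amount * capped / MAX_LOCK_YEARS

print(ve_balance(1000, 4))  # 1000.0 -- max lock gives full voting power
print(ve_balance(1000, 1))  # 250.0  -- shorter lock, proportionally less
```

Because voting power decays as the unlock date approaches and cannot be bought on the spot market, an attacker must commit capital for years to gain influence, which is precisely what raises the cost of a governance attack.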

However, the success of the veToken model in Curve Finance can hardly be replicated by other DeFi protocols. Curve has a well-designed financial system that generates stable revenue for its liquidity providers, and its strong composability has let it extend its user base by integrating with many projects. Only in such a case can the governance token sacrifice a portion of its utility in exchange for stronger governance stability.

The veToken model has been criticized mostly for bribery (the Curve Wars). It is also quite fragile when used in lending or borrowing services, as it allows substandard assets to become collateral.

The future DeFi governance structure needs to prevent malicious attacks on both the technical and the social side. Moreover, dYdX's plan to deploy its own blockchain on Cosmos suggests that the future of DeFi governance also needs a multi-chain solution.

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

🐩 @SoxPt50

📅 19 July 2022

The $stETH and Recent Crypto Market

The major change after the Ethereum Merge is that its consensus mechanism switches from Proof-of-Work to Proof-of-Stake. Compared to PoW, PoS is more energy-efficient and increases the scalability of Ethereum.

In a Proof-of-Stake network, validators are responsible for checking the validity of new blocks propagated over the network, as well as creating and propagating new blocks themselves.

In Ethereum 2.0, users must deposit at least 32 ETH into the deposit contract to become a validator. The network randomly selects validators from these stakers in each epoch (in Ethereum 2.0, 1 slot = 12 seconds and 32 slots = 1 epoch). Stakers who perform honestly in an epoch receive rewards in $ETH from the network.
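Working out the slot and epoch figures quoted above:

```python
# Epoch timing from the slot parameters above: 12-second slots, 32 per epoch.
SLOT_SECONDS = 12
SLOTS_PER_EPOCH = 32

epoch_seconds = SLOT_SECONDS * SLOTS_PER_EPOCH
print(epoch_seconds)       # 384 seconds per epoch
print(epoch_seconds / 60)  # 6.4 minutes per epoch
```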

However, both the rewards and the principal are locked until the Merge, or even several months after it, considering that state transitions and transfers will not be immediately available once the Merge happens.

Introducing Lido

This illiquidity is a severe problem for investors who want to earn stable and attractive rewards from Ethereum 2.0 while maintaining exposure to yields from other DeFi protocols. Additionally, a considerable number of retail investors are unable to meet the 32 ETH requirement.

Lido was introduced to solve these issues by providing liquidity for stakers and allowing participation with any amount of $ETH. For each $ETH staked through Lido, the staker receives the same amount of $stETH, which represents the value of the staked $ETH on the Beacon Chain. The $stETH balance increases in step with the rewards received on Ethereum 2.0, and holders will be able to redeem $stETH for $ETH on a 1:1 basis once the transfer function becomes available.

This mechanism “pegs” $stETH to $ETH and provides liquidity to stakers, who can use $stETH in DeFi protocols much as if they were holding $ETH.
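The growing-balance mechanism can be sketched with share-based pool accounting, which is the general idea behind rebasing tokens like stETH (this is a simplified illustrative model; Lido's actual contracts also take protocol fees and handle many more cases):

```python
# Sketch of share-based accounting behind a rebasing staking token: balances
# are derived from a holder's shares of the total pooled ETH, so every
# balance grows automatically as staking rewards accrue.

class PooledStaking:
    def __init__(self) -> None:
        self.total_pooled_eth = 0.0
        self.total_shares = 0.0
        self.shares: dict[str, float] = {}

    def submit(self, user: str, eth: float) -> None:
        # First depositor gets shares 1:1; later deposits at the current rate.
        if self.total_shares == 0:
            new_shares = eth
        else:
            new_shares = eth * self.total_shares / self.total_pooled_eth
        self.shares[user] = self.shares.get(user, 0.0) + new_shares
        self.total_shares += new_shares
        self.total_pooled_eth += eth

    def accrue_rewards(self, eth: float) -> None:
        # Rewards raise the pooled total; shares stay fixed, balances rebase up.
        self.total_pooled_eth += eth

    def balance_of(self, user: str) -> float:
        return self.shares.get(user, 0.0) * self.total_pooled_eth / self.total_shares

pool = PooledStaking()
pool.submit("alice", 32.0)
pool.accrue_rewards(1.6)          # ~5% of rewards land on the Beacon Chain
print(pool.balance_of("alice"))   # 33.6 -- the balance rebases upward
```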

$stETH Is A Derivative

However, the “peg” between $stETH and $ETH is not strict, and Lido’s system design is far from the swap mechanism between $UST and $LUNA.

To better understand the relationship between $stETH and $ETH, a brief look at the bond market helps. In conventional finance, companies use corporate bonds to raise capital from investors, repaying them principal plus interest at the bond’s maturity. Bond investors can also buy and sell bonds on a secondary market to gain liquidity, so before maturity a bond’s price is driven by the demand and liquidity of the market rather than by the issuer.

Traditionally, bonds are preferred by risk-averse investors because they are debt investments: under absolute priority, a company must pay off debtholders before preferred and common shareholders in a dissolution or bankruptcy event. Although there is no bankruptcy protection for crypto investors, $stETH does offer a risk-averse investment opportunity thanks to the robust usage of $ETH and the reputation of its community.

Drawing the analogy between $stETH and a bond: the staked $ETH is the principal, the ~4% APY provided by Lido is the interest rate, and DeFi protocols are the secondary markets. The major difference between $stETH and a corporate bond is that its maturity date (the Merge) is unknown. (It is not a perpetual bond, since a perpetual never repays principal.) Maturity determines a bond’s price discount as a risk parameter: because investors can redeem a bond for principal plus interest at maturity, its price approaches face value as maturity nears, and vice versa. Other discount factors, such as Lido’s contract risk and credibility, also apply.

Thus $stETH should always be priced below $ETH before transfers are functional on Ethereum 2.0, and trading below “parity” should be expected at the current stage.
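The bond analogy can be made numeric with a standard present-value discount (the yield and time-to-redemption below are illustrative assumptions, not market data):

```python
# Present-value sketch for the bond analogy: a claim redeemable 1:1 at a
# future date trades below parity today, with the discount set by a required
# yield and the expected time until redemption.

def discounted_price(face_value: float, annual_yield: float, years: float) -> float:
    return face_value / (1 + annual_yield) ** years

# If the market demands ~6%/yr for contract and credibility risk, and expects
# withdrawals roughly one year out, fair parity sits around 0.943 ETH:
print(round(discounted_price(1.0, 0.06, 1.0), 3))  # 0.943
```

The longer the expected wait or the higher the perceived risk, the deeper the discount, which matches the observation that $stETH should trade below parity until withdrawals go live.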

For reference, we can check the performance of $BETH, a similar product introduced by Binance several weeks after Lido: its parity to $ETH has never hit 1.0 in the past 15 months.

There are also worries that a “depegged” $stETH will lead to a dump of $ETH. But since on-chain DEXs never promise that users can redeem $stETH for $ETH 1:1, the price of $stETH is determined by the liquidity of the pool rather than by a fixed algorithm, meaning there is no way to generate an infinite amount of $stETH out of nowhere, as arbitrageurs did with $LUNA. The ceiling on the amount of $stETH in the market is also restricted by the total supply of $ETH. The next part explains why the price of $stETH will not fall below a certain level.
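The point that the pool, not an algorithm, sets the price can be shown with a constant-product AMM sketch (pool sizes and trade size below are illustrative, not the actual Curve pool, which uses a flatter stableswap curve):

```python
# In a constant-product AMM (x * y = k), selling stETH into the pool moves
# the price along the curve instead of redeeming at a fixed 1:1 rate.

def sell_into_pool(steth_reserve: float, eth_reserve: float, steth_in: float):
    k = steth_reserve * eth_reserve
    new_steth = steth_reserve + steth_in
    eth_out = eth_reserve - k / new_steth
    implied_price = eth_out / steth_in  # average ETH received per stETH sold
    return eth_out, implied_price

# A 100k/100k pool starts at parity; a 20k stETH sale fills at ~0.83 ETH each.
eth_out, price = sell_into_pool(100_000, 100_000, 20_000)
print(round(price, 3))  # 0.833
```

A large seller simply pays a worse and worse price against finite liquidity; nothing in the mechanism mints new supply the way the $UST/$LUNA swap did.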

The crypto market in general

The struggles of 3AC and Celsius last week showed us the power of liquidity in a crypto winter. When the Fed raised its benchmark interest rate by three-quarters of a percentage point last week to fight skyrocketing inflation, USD liquidity tightened further. High inflation, high interest rates, and high gas fees are forcing retail investors to sell risk assets to pay their bills, and selling pressure on cryptocurrencies is rising across the board.

Institutions have to exit positions in illiquid assets and hold enough cash to survive a potential bank run when the macro market panics. For $stETH specifically, since its liquidity pools on Curve, Aave, and Uniswap are supplied mainly by institutions like FTX and Celsius, the exiting institutions easily broke the price parity in the AMM DEXs.

The declining price of $stETH also triggered margin calls for leveraged stakers who use Aave to loop ETH into Lido repeatedly. They are forced either to sell other cryptocurrencies at depressed prices to pay off the debt, or to let their $stETH be liquidated and dumped onto the market. Either way, it adds selling pressure on the crypto market, as most $stETH has been deposited into the Aave pool.
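The leverage loop and its liquidation point can be sketched as follows. The 70% loan-to-value and 75% liquidation threshold are hypothetical parameters for illustration, not Aave's actual risk settings:

```python
def loop_stake(initial_eth: float, ltv: float, rounds: int):
    """Recursive staking: stake ETH for stETH, post it as collateral,
    borrow `ltv` of it back as ETH, and stake again.

    Returns (stETH exposure, ETH debt)."""
    exposure, debt, stake = 0.0, 0.0, initial_eth
    for i in range(rounds):
        exposure += stake              # stETH held as collateral
        if i < rounds - 1:             # every round but the last re-borrows
            debt += stake * ltv
            stake *= ltv
    return exposure, debt

def liquidation_parity(exposure: float, debt: float, threshold: float) -> float:
    """stETH/ETH price at which collateral * threshold equals the debt."""
    return debt / (exposure * threshold)

exposure, debt = loop_stake(100, 0.70, rounds=10)
print(round(exposure, 1), round(debt, 1))                  # 323.9 223.9
print(round(liquidation_parity(exposure, debt, 0.75), 3))  # 0.922
```

Under these assumed parameters, even a modest slide of parity below roughly 0.92 forces liquidations, which is exactly the feedback loop of forced selling described above.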

At the macro level, financial markets (both conventional and crypto) partially overreacted to monetary tightening last month and have already priced in a Fed hike of 75 basis points in July. Risk assets like cryptocurrencies may face another dump in July as USD liquidity tightens, but the Fed's independence today is not comparable to the Volcker moment. With Biden's approval rating recently hitting a record low of 36%, the Democrats will most likely lose both houses in the upcoming midterm election. Political pressure may push the Fed back toward monetary easing very soon to please voters. The market is expected to settle into a relatively stable state after three more rate hikes, when the interest rate reaches about 3.25%.

Conclusion

The Lido/$stETH system is nothing like Terra's, and a $stETH "depeg" should be expected. The price drop of $stETH is essentially an imbalance between the supply and demand of liquidity, but the price should hold above a certain negative premium as long as demand for ETH persists. Downside pressure on the crypto market will remain for a while, until the Fed interest rate reaches 3% or inflation returns to 2%.

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

🐩 @SoxPt50

📅 8 July 2022

DAO tooling & infrastructure: upon which DAOs are built

The DAO, as an emerging crypto vertical, has made great progress in recent years. As of 21 June 2022, there are 1,085 DAOs with more than 3 million members, according to data from deepdao.io. To enable these organizations to flourish, a vibrant tooling and infrastructure stack has been created, driving a thriving DAO ecosystem. These projects provide communities and projects with templates, frameworks, and tools to pool resources and launch their first DAO. They typically provide smart contracts and interfaces that facilitate on-chain operations for decentralized communities, making it easy for anyone with limited technical skills to launch a DAO.

In this article, we will introduce some important DAO infrastructure and tools from the four most fundamental aspects: treasury management, governance and voting, contribution and reputation, and compensation.

Treasury Management

A DAO isn't a DAO without a treasury, a shared wallet that can be used to send and receive crypto funds on the blockchain. A DAO treasury serves as the lever for actions within the organization: making purchases, enabling fundraising, interacting with token or NFT sales, rewarding contributions, and so on. A crypto treasury is also the most fundamental stake by which community engagement is incentivized.

Gnosis Safe

Gnosis has been building blockchain-based financial infrastructure products since 2016. Gnosis Safe is a decentralized asset-management platform that sets a security standard for managing funds and can hold a wide range of asset types, including NFTs.

Gnosis Safe has become the first choice for those seeking a lightweight multi-signature solution for DAOs. It supports offline signatures and hardware wallets, and lets users configure how many signatures a transaction requires. Gnosis Safe currently supports multiple public chains, such as Ethereum, Polygon, BSC, and Avalanche. In addition, Gnosis makes ether-less transactions on Ethereum possible through a transaction relay mechanism.
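The m-of-n idea at the heart of a Safe reduces to a few lines. This is a toy model of the approval rule only, with made-up owner names, not Gnosis Safe's actual contract logic:

```python
def can_execute(approvals: set, owners: set, threshold: int) -> bool:
    """A transaction executes only once at least `threshold` distinct
    owners have signed; signatures from non-owners do not count."""
    return len(approvals & owners) >= threshold

owners = {"alice", "bob", "carol"}                    # a 2-of-3 Safe
print(can_execute({"alice", "bob"}, owners, 2))       # True
print(can_execute({"alice", "mallory"}, owners, 2))   # False: outsider ignored
```

The configurable `threshold` is what lets a DAO trade convenience against security, e.g. 2-of-3 for a small working group versus 5-of-9 for a main treasury.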

Juicebox

The Juicebox protocol is a programmable treasury. Projects can use it to configure how their tokens should be minted when funds are received, and under what conditions those funds can be distributed to preprogrammed addresses or reclaimed by the community. These rules can evolve over funding cycles, allowing people to bootstrap open-ended projects and add structure, constraints, extensions, and incentives over time as needed.

Juicebox also acts as a payment terminal and programmable treasury for projects, allowing DAOs to manage treasury actions such as crowdfunding, employee salaries and automated payouts when a project closes. It is light enough for a group of friends, yet powerful enough for a global network sharing thousands of ETH, ERC20 tokens or other assets.

Governance & Voting

DAOs must maintain transparency and visibility into how decisions are made and how funds flow in and out of the community, so many DAOs use voting to make decisions democratically. Members earn voting privileges in the DAO through contribution and express their opinions by voting on specific actions. Once approved, those votes automatically translate into on-chain actions recorded on the blockchain, such as transferring funds from the treasury to an external entity.

Snapshot

Snapshot is an off-chain, gasless, decentralized governance and voting system. It provides flexibility in how voting power is calculated for a vote, and supports various voting types to cater to organizations' needs, such as single choice, approval voting, quadratic voting, and more. Creating proposals and voting on Snapshot is user-friendly and costs no gas, as the process is performed off-chain. Moreover, proposals and votes are stored as signatures on IPFS, and there are no restrictions on projects or blockchain networks.
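One of the voting types mentioned above, quadratic voting, can be sketched as taking the square root of each token balance. This is a simplified illustration; real Snapshot spaces compose configurable voting strategies, and the balances below are invented:

```python
import math

def quadratic_power(balances: dict) -> dict:
    """Voting power as the square root of token balance, damping whales."""
    return {addr: math.sqrt(bal) for addr, bal in balances.items()}

powers = quadratic_power({"whale": 10_000, "holder": 100})
# With linear (single-choice) weighting the whale has 100x the holder's
# weight; under the quadratic strategy, only 10x.
print(powers["whale"] / powers["holder"])  # 10.0
```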

Over the past year, high gas fees on Ethereum have caused problems across the ecosystem. With voting gradually decoupling from on-chain execution, Snapshot has become a popular tool in DAO governance. To date, Snapshot supports 13 public chains and over 4,000 projects, and the numbers are still growing rapidly.

Tally

Tally is a powerful platform for analyzing and voting on crypto protocols. Anyone can connect a wallet and see all the DAOs they are involved with in one place, and it is effortless to draft and submit proposals. Tally supports Ethereum, Polygon, and Avalanche. For DAOs that run proposal submission and voting, members can use Tally to create and pass proposals, enable delegation, and power votes.

Another feature worth mentioning is that Tally’s SafeGuard allows a certain number of token holders to revoke a transaction initiated by multi-signature or reclaim funds if they believe the signer’s decision was improper or unfair.

Contribution & Reputation

To become a contributor, one generally goes through the following journey: from being completely unaware of DAOs, to lurking on social platforms like Twitter or Discord, to connecting with other members, and finally to making a first contribution to a DAO and creating value on a consistent basis.

Ideally, members' sense of belonging will be strengthened along this contributor journey, prompting them to grow with the community, contribute more, and move toward a common vision. Achieving this ideal is a matter of process design, the essence of which lies in choosing the right tools and maximizing their value. Good use of tools can help solve macro problems such as:

How to provide the right opportunities and information to the right people at the right time

How members’ contributions are quantified, motivated and appropriately rewarded

How to foster trust through roles and reputation mechanisms

Gitcoin

Gitcoin is a platform where coders and developers can get paid to work on open-source software in a wide variety of programming languages. Its core work involves tracking and verifying off-chain and self-reported tasks.

Users can also submit their own project ideas to the Gitcoin platform in order to crowdsource funding from contributing donors. Aside from direct community crowdfunding, Gitcoin employs a unique system known as quadratic funding to help match community funding efforts to accelerate development of the projects. Overall, Gitcoin is a platform designed to foster the development of meaningful, open-source projects and better align the interests of donors and developers.
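The quadratic-funding matching rule can be sketched in its textbook form, where the match grows with the square of summed square roots of contributions. Gitcoin's production rounds add further adjustments (such as pairwise bounding), so treat this as a simplification with invented donation amounts:

```python
import math

def qf_match(contributions: list) -> float:
    """Raw quadratic-funding match: (sum of sqrt(c))^2 minus the direct total.
    Broad support from many small donors out-matches a single large donor."""
    total = sum(contributions)
    return sum(math.sqrt(c) for c in contributions) ** 2 - total

print(qf_match([1.0] * 100))  # 9900.0: 100 donors of $1 each
print(qf_match([100.0]))      # 0.0: one $100 donor, same direct total
```

This is why quadratic funding "better aligns the interests of donors and developers": the matching pool rewards breadth of community support, not raw dollar weight.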

Gitcoin uses its token GTC to incentivize contributions from contributors all around the world. Contributions to Gitcoin could come in different forms — code changes, writing docs, answering questions from users and anything helping keep the infrastructure running, etc. All can be recorded and marked as contributions.

RabbitHole

RabbitHole is a channel through which project treasuries distribute their tokens to incentivize participation and attract future contributors. It is the first crypto-native paid-acquisition channel for projects and protocols. Through gamified approaches, RabbitHole helps decentralized networks attract the right users, letting people discover and use new Dapps, protocols, or crypto platforms in exchange for token rewards.

By distributing tokens, RabbitHole helps projects acquire real users and participants who are knowledgeable and engaged. It also offers crypto tutorials on topics such as NFT and DeFi with proof-of-attendance protocols so users can demonstrate their knowledge of specific topics.

Compensation

Creating a suitable compensation mechanism for members who write, moderate, and work for the DAO is paramount. Recognition and compensation of people who add value to the DAO will be essential for long-term success.

Every contributor should be fairly compensated for their input and treated with respect and dignity by the members of the DAO. Moreover, a DAO should make its contributors feel that its rules, policies, and procedures are clear, fair, and transparently communicated.

DAO contributors have significantly greater power of choice than their corporate counterparts exactly because DAOs have permeable organizational boundaries like communities. Thus, it becomes critical for DAOs to attract and retain contributors with a reasonable compensation system.

Coordinape

Coordinape is a Web3 native platform that enables DAOs to easily distribute compensation to contributors through a peer-to-peer fair distribution. Community grants, internal salaries, and special projects can all be incentivized and rewarded by the community itself. Instead of cumbersome voting or black box committees, contributors themselves can quickly and transparently get rewarded for the value they created.

Coordinape moves more decision making to the community. The simple premise is that if you ask everyone in the community who is doing good work, their collective answers will give a better sense of where the value is and who should be most rewarded. Over time, this also provides valuable insights for the DAO about what kinds of work are prioritized, what the community finds most valuable, and who are the key contributors in different areas. The shared visual display also enables everyone in the community to see who is working on what and find opportunities for collaboration or reduce duplication of effort.
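The peer-allocation premise above can be modeled as each member handing out a budget of GIVE and the epoch's reward pool being split pro rata by GIVE received. The member names, allocations, and 3,000-unit pool are invented for illustration and simplify Coordinape's actual mechanism:

```python
def split_rewards(gives: dict, pool: float) -> dict:
    """Each member allocates GIVE to teammates; the epoch's reward pool is
    split in proportion to total GIVE received. Self-allocations are ignored."""
    received = {}
    for giver, allocs in gives.items():
        for recipient, amount in allocs.items():
            if recipient != giver:
                received[recipient] = received.get(recipient, 0.0) + amount
    total = sum(received.values())
    return {member: pool * r / total for member, r in received.items()}

gives = {
    "alice": {"bob": 60, "carol": 40},   # each member hands out 100 GIVE
    "bob":   {"alice": 100},
    "carol": {"alice": 50, "bob": 50},
}
print(split_rewards(gives, pool=3000))   # alice 1500.0, bob 1100.0, carol 400.0
```

No committee decides the split; the aggregate of everyone's peer judgments does, which is exactly the "collective answers" idea described above.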

Coordinape's current implementation is off-chain, though the team plans to develop an on-chain version using an EVM-compatible L2 or sidechain.

Utopia

Trusted by notable DAOs such as Friends with Benefits, PleasrDAO, and Yield Guild, Utopia is a collaborative payroll and expense-management tool. It helps manage payments, accounting, and reporting for DAOs. Users can automate payroll in fiat or crypto, keep track of DAO expenses, and assign roles to contributors to ease organization. One impressive feature is that Utopia can batch-execute all invoices and recurring payroll in a single multi-sig transaction without paying any gas.

Before the introduction of Utopia, the vast majority of DAOs were operating token distribution and payments based on raw dumps of data. Utopia has built invoices and reimbursements workflows so that DAOs can receive, pay out and manage these requests. Additionally, Utopia’s all-in-one transactions allow DAOs to batch payments across multiple people and coins.
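The batching idea can be sketched as collapsing a payment list into one payload that the signers approve once. This is a toy encoding with placeholder addresses, not Utopia's or Safe's actual multisend format:

```python
from dataclasses import dataclass

@dataclass
class Payment:
    to: str
    token: str
    amount: float

def batch(payments: list) -> dict:
    """Collapse a payroll run into one transaction payload: signers approve
    a single transaction instead of one per recipient and per token."""
    return {
        "kind": "multisend",
        "calls": [(p.token, p.to, p.amount) for p in payments],
        "count": len(payments),
    }

payroll = [Payment("0xAAA", "USDC", 4000.0), Payment("0xBBB", "ETH", 1.5)]
print(batch(payroll)["count"])  # 2
```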

Summary

Ultimately, the DAO community's behavior will determine its success. What matters most is that the community keeps caring about what it is doing: the best tooling and infrastructure today is built to promote social cohesion and prioritize long-term coordination.

The DAO infrastructure tools mentioned above are only a snapshot of an ever-growing landscape of products and services that aim to support fast-growing communities as they scale in members and mission. The time is ripe for the next wave of DAO infrastructure and tools, which will undoubtedly reshape how DAOs are managed and operated.

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

🐩 @chestersigned

📅 1 July 2022

Links:

https://medium.com/1kxnetwork/organization-legos-the-state-of-dao-tooling-866b6879e93e

https://identityreview.com/the-ultimate-guide-to-dao-tooling/

https://www.alchemy.com/overviews/the-definitive-guide-to-starting-a-dao#dao-13

https://mirror.xyz/twhittle.eth/GwRXJ1_D9bK4P_LKfz3U-HvnvZpcx4ttkP6AvVvxh9s

Why equity or mixed investment is better for Web3 Gaming

TLDR

While the future of Web3 gaming is promising, defects have emerged in the token financing used by many companies and projects. Equity or mixed investment may be a possible solution, but concerted industry effort is also indispensable for a fair market environment.

Market Status and Forecast

In the second half of 2021, the concept of Web3 games/metaverse began to emerge, and its superior short-term profitability won over both capital and traditional game developers. More and more VCs are looking for early-stage, high-quality projects in the market. Over the first five months of this year, leading VC institutions have invested in 170 Web3 game and metaverse projects, for a total of 6.13 billion USD.

The proportion of investment going into Web3 games has remained prominent during the bear market. For example, a16z launched two Web3 game funds in May, raising up to 5.1 billion US dollars. Such investment behavior reflects optimism about the Web3 game market.

As implied by the concept of ownership in blockchain, most Web3 games will have open economies in which in-game assets can be traded on secondary markets without permission. Traditional in-game virtual items are estimated to be a $50bn industry; this figure mostly covers primary sales, since secondary trading of game items is typically forbidden. The black market for secondary game items was estimated [1] at $5bn in 2015.

According to Vida Research, traditional gaming will converge into Web3 gaming, accelerating the industry's growth as a whole, and the open-economy nature of Web3 games will attract speculative/financial capital, cumulatively increasing the game NFT market size to above $100bn.

Current flaws

At present, most Web3 financing in the market is token financing, including seed rounds and private rounds. Only a small number of projects conduct equity financing or take strategic investment. For project parties and potential users, token financing does have advantages, such as value distribution to holders and rapid realization of revenue.

However, token financing also has several drawbacks, especially for investors.

Chaotic distribution of implicit revenue

As defined by the Web3 Index and FutureMoney Research [2], revenue can be divided into explicit and implicit. The latter is commonly found in x-to-earn and Web3 games. It resembles supply-side revenue in DeFi but is even closer to a token sale: participants use the protocol to obtain speculative income, contributing "revenue" in the form of ETH or SOL, receiving tokens, and selling them at a profit later.

There are two typical examples: LooksRare charges users' transaction fees denominated in ETH [3], and Stepn charges fees denominated in SOL. This implicit revenue is essentially another public token sale, which greatly increases the liquid supply of capped governance tokens. For investors, whose tokens generally unlock linearly over 1–3 years, it is almost impossible to share in this profit; it can even reduce their own investment return.

This income is captured by the project party alone. LooksRare does exactly this; Stepn is slightly better, having announced a buyback plan in the first quarter, but unfortunately there has been no follow-up. Buybacks are the most common way of returning value to token holders (including early investors).

Risk from undisclosed issuance income

Under GameFi's dual-token system, the utility token is the main instrument for rewarding players. Its supply has no upper limit and it is generally not repurchased. The intent of this design is good: it helps maintain the stability of the game economy, insulated from investors.

On the other hand, it also introduces a Trojan horse: the project party may repeatedly trade the utility token through multiple addresses to reap huge profits. The centralized rule-makers are not required to disclose the flow of the utility token to the community or investors.

Although the governance token is designed to be deflationary, secret profits from the utility token continue to drain value from the entire game economy, while investors are kept in the dark the whole time.

Tokens may not carry the value of positive externalities

Gaming is a content-consumption industry whose ultimate value comes from IP. Every game product has a life cycle; IP and production capability, however, are the core assets that can be passed down. In current Web3 games, most tokens carry the profits of individual products and cannot be bound to IP or production capacity.

It is conceivable that if a product fails, the game's token will hardly be reused, and investors' returns will be lost. Even if a Web3 game product succeeds, there is no IP value for investors to share.

Suggestion

Equity investment or mixed investment may solve the above problems.

Regularize opaque income distribution

Equity investments usually come with dividend rights, obliging the project party to share its implicit revenue with investors under clearly written rules.

As an example, a16z general partners David Haber and Jonathan Lai will join the Carry1st board as observers.

Committed to long-term common interests

Through equity, investors cooperate with project parties over the long term, helping grow the user base and improve products in many ways.

Investors use their own resources to give the project party legal, financial, technical, marketing, and other support in the early stage. After the product succeeds, the project party in turn shares other external benefits such as IP and content.

Binance's strategic investment in Stepn in April may serve as inspiration.

Help Web3 gaming grow healthily

Under asymmetric risk, bad products often drive out good ones (Gresham's Law). This is why most Web3 games today devote themselves to financing and Ponzi mechanics, with very little effort spent on content and on being fun to play.

As an important initiator in the market, investors have the responsibility to use more reasonable solutions to encourage the healthy growth of this industry.

Of course, some companies do focus on the product itself; to avoid giving investment advice, we will analyze a selection of good games independently in the future.

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

🐩 @Nemo_eth

📅 21 June 2022

Reference:

[1] https://www.mds.deutsche-boerse.com/resource/blob/1335780/224223c1b0e8cf02d8948bcea3258d05/data/Changing-the-game-for-the-world-of-games.pdf

[2] https://mirror.xyz/0xfu.eth

[3] https://thedefiant.io/looksrare-opensea-protocol-revenue

DeFi Lending In A Nutshell

DeFi lending platforms offer crypto loans in a trustless manner, allowing users to enlist their crypto assets on the platform for lending. On these decentralized platforms, a borrower can take a loan directly, known as DeFi P2P lending, while the lending protocol lets the lender earn interest.

Why we need DeFi Lending

Getting profit from assets while still maintaining exposure to their upside.

A long-term crypto hodler who doesn’t want to sell their crypto investment but still needs to pay for their bills may take out a loan on their crypto so they can pay for their daily needs. BTC mining farm operators may want to take out loans to buy new equipment instead of selling their BTC when the price is low.

Leveraged trading

A leveraged trade is a trade made with borrowed money. Margin refers to the collateral used to make the leveraged trade. Margin trading not only amplifies the investor’s loss or gain made on trade but also enables risk hedging like diversifying concentrated portfolios.

Earn Interest

Lending protocols offer holders better interest rates than traditional bank accounts, giving more customers access to this passive-earning opportunity.

Advantages of DeFi Lending

Transparency

DeFi lending is facilitated by blockchain technology, so it inherits the advantages of blockchain as well. Because it is deployed on a public blockchain, the exact content of the smart contracts is freely available and auditable by anyone.

In addition, users' historical interactions with protocols and their lending and borrowing positions are transparently recorded on the blockchain. Market information is public to everybody.

Transparency ensures the security and authenticity of the system.

Liquidity

In conventional finance, liquidity in lending markets is sub-optimal. Factors such as lending period and credit rating leave liquidity on both the demand and supply sides poorly served, and this inefficiency means an oversupply of liquidity in one submarket cannot be promptly transferred to meet demand in another.

In DeFi Lending, funds supplied to a lending protocol are pooled together and can be utilized efficiently. Because of smart contracts and blockchain, lending can be performed inexpensively and instantaneously.

Trustless & Permissionless

In traditional finance, regulations keep many people away from lending services, and centralized intermediaries impose high costs, causing market friction and inefficient use of market liquidity.

In DeFi lending, lenders rarely need to trust borrowers' solvency, as smart contracts automatically enforce liquidation when default risk appears. By eliminating the middleman, DeFi lending can reach people around the world without limit.
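The automatic liquidation trigger described above can be sketched as a simple health-factor check (an illustrative simplification; real protocols use per-asset liquidation thresholds and price oracles, and the figures below are hypothetical):

```python
def health_factor(collateral_value_usd: float,
                  liquidation_threshold: float,
                  debt_value_usd: float) -> float:
    """Risk-adjusted collateral value divided by outstanding debt.

    A position with a health factor below 1.0 is eligible for liquidation.
    """
    if debt_value_usd == 0:
        return float("inf")
    return collateral_value_usd * liquidation_threshold / debt_value_usd

# $10,000 of collateral at an 80% threshold against $7,000 of debt is safe:
assert health_factor(10_000, 0.80, 7_000) > 1.0
# If the collateral's market value drops to $8,000, liquidation is enforceable:
assert health_factor(8_000, 0.80, 7_000) < 1.0
```

Because the rule is pure arithmetic over on-chain values, no party has to trust the borrower; any keeper can verify the condition and trigger liquidation.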

Popular DeFi Lending Platforms

Aave

Aave is a dual-token DeFi protocol on the Ethereum network and is considered one of the fastest-growing DeFi protocols. While it provides comprehensive financial services, Aave is most prominent for its lending and borrowing. It has the protocol-native asset $LEND and the interest-bearing asset aToken.

Aave has a hybrid lending system that offers users both short-term stable and floating interest rates. The floating rate is determined by supply and demand in Aave, while the short-term stable rate [1] remains unchanged until the average borrow rate falls below 25% APY and the utilization rate exceeds 95%.
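Under stated assumptions, the stable-rate condition above is a conjunction of two thresholds; a minimal sketch (function and parameter names are hypothetical, the 25% and 95% figures come from the paragraph above):

```python
def stable_rate_can_be_rebalanced(avg_borrow_rate: float,
                                  utilization: float) -> bool:
    """A stable-rate position stays fixed until BOTH conditions hold:
    the pool's average borrow rate is below 25% APY AND the pool's
    utilization exceeds 95%, per the description above."""
    return avg_borrow_rate < 0.25 and utilization > 0.95

# Low rates alone are not enough; the pool must also be heavily utilized:
assert not stable_rate_can_be_rebalanced(0.20, 0.90)
assert stable_rate_can_be_rebalanced(0.20, 0.97)
```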

In Aave, lenders do not enter into deals with specific borrowers; instead, they deposit assets directly into a liquidity pool. In return they receive aTokens, redeemable 1:1 for the deposited asset, and their aToken balance gradually increases as interest accrues.

Borrowers must first deposit collateral, which determines the maximum loan amount. The ratio is regulated by the LTV (Loan-to-Value) coefficient [2]. For example, if the LTV is 70%, a user who deposits 100 $ETH can borrow at most 70 $ETH.
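The LTV arithmetic in the example is simply a multiplication (a sketch; in practice protocols apply the LTV to the USD value of the collateral, and the function name here is hypothetical):

```python
def max_borrow(collateral_amount: float, ltv: float) -> float:
    """Maximum loan permitted against a collateral deposit at a given LTV."""
    return collateral_amount * ltv

# Depositing 100 ETH at a 70% LTV allows borrowing at most 70 ETH,
# matching the example above.
borrow_limit = max_borrow(100, 0.70)
```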

Maker / Oasis

Maker is a dual-token system built on Ethereum using $MKR as a governance token, and the stablecoin $DAI for issuing loans. $DAI is created to function like actual money. As a stablecoin, it is soft-pegged to the US Dollar.

Unlike Aave and most other DeFi Lending protocols, Oasis only provides a borrowing service, and users can borrow its stablecoin $DAI from the protocol.

The protocol is governed by the Collateralized Debt Positions (CDP) system, which is a set of smart contracts that hold onto the collateral deposited by users to generate $DAI for borrowing. The presence of the debt locks the user’s collateral assets within the smart contract until the user is ready to pay back the amount of the $DAI they borrowed.
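A toy model of the CDP flow described above might look as follows (illustrative only: the class and the 150% minimum collateralization ratio are assumptions made for the sketch, not Maker's actual contracts or parameters):

```python
class MiniCDP:
    """Toy Collateralized Debt Position: lock collateral, generate DAI debt,
    repay to unlock. The 150% minimum ratio is an illustrative assumption."""

    MIN_COLLATERAL_RATIO = 1.5

    def __init__(self, collateral_usd: float):
        self.collateral_usd = collateral_usd
        self.dai_debt = 0.0

    def generate_dai(self, amount: float) -> None:
        # New debt is allowed only while the position stays over-collateralized.
        if self.collateral_usd / (self.dai_debt + amount) < self.MIN_COLLATERAL_RATIO:
            raise ValueError("would under-collateralize the position")
        self.dai_debt += amount

    def repay_dai(self, amount: float) -> None:
        # Paying back DAI reduces the debt; collateral is freed at zero debt.
        self.dai_debt = max(0.0, self.dai_debt - amount)

cdp = MiniCDP(collateral_usd=1_500)
cdp.generate_dai(1_000)   # allowed: exactly 150% collateralization
```

As in the real system, the collateral stays locked while any debt is outstanding, and the contract, not a counterparty, enforces the ratio.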

For borrowers, Maker offers stability in the volatile crypto market, since $DAI is soft-pegged to the US Dollar and backed by a surplus of collateral locked in the protocol. When the market is highly volatile, Maker lets its users store value without abandoning their crypto positions.

TrueFi

TrueFi is a DeFi Lending protocol for uncollateralized on-chain lending.

Prior to TrueFi, most DeFi lending protocols used over-collateralized lending, which is antithetical to the idea of borrowing and limits the mainstream adoption of DeFi.

The introduction of credit scoring to crypto is considered a paradigm shift for DeFi and will no doubt improve its mainstream adoption.

In the TrueFi protocol, lenders can add their assets ($BUSD, $USDC, and $USDT) directly into a lending pool used to fund new loans, for which they earn interest and $TRU, the network's native asset. Any unused capital is sent to high-yield DeFi protocols (like Aave) to maintain returns.

Borrowers (currently limited to institutions) are whitelisted through a rigorous onboarding process that involves a deep review of their business, the signing of an enforceable lending agreement, and the $TRU community's approval. Once approved, borrowers submit a request for capital at an interest rate and credit limit determined by their credit score, subject to further $TRU community approval. The borrower must return the principal and interest on or before the term expires. Delinquent borrowers face legal action pursuant to the loan agreement signed during onboarding.

For each loan, the $TRU community is required to signal its opinion by voting "Yes" or "No". Stakers are incentivized to vote carefully, as their staked $TRU may be liquidated to protect lenders in case of default.

Notional

The Notional protocol provides a fixed interest rate for its investors. Its working principle is similar to a zero-coupon bond in conventional financial markets, which is issued and priced at a deep discount to its face value at maturity. Since the face value is predetermined, investors lock in a fixed return by purchasing the bond at the discounted price and redeeming it later at full face value.
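The zero-coupon relationship the paragraph describes can be written out directly (generic bond math as a sketch, not Notional's actual pricing code; function names are hypothetical):

```python
def zero_coupon_price(face_value: float, annual_rate: float, years: float) -> float:
    """Discounted price of a zero-coupon bond at a fixed annual rate."""
    return face_value / (1 + annual_rate) ** years

def implied_fixed_rate(price: float, face_value: float, years: float) -> float:
    """Fixed annual return locked in by buying at `price` and redeeming at face value."""
    return (face_value / price) ** (1 / years) - 1

# Buying 100 units of face value one year out for 95 locks in roughly a 5.26% rate.
rate = implied_fixed_rate(95, 100, 1)
```

Because both the purchase price and the face value at maturity are fixed at trade time, the return is known in advance, which is exactly what makes the rate "fixed".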

The fCash token and its AMM liquidity pool are the two major elements that compose the Notional protocol.

fCash is the zero-coupon bond of the Notional protocol, representing a claim on a positive or negative cash flow at a specific point in the future.

In the Notional protocol, lenders and borrowers don't trade against each other directly; they trade against liquidity providers. These liquidity providers ensure that there is always cash and fCash available for either lenders or borrowers at any point in time, and they receive trading fees in return.

Conclusions

DeFi lending evolved rapidly after the DeFi summer. These protocols continue to extend the boundaries of their financial services to cover customers abandoned by conventional finance. The paradigm shift from off-chain to on-chain financial services is still in its early days, but as the on-chain credit-scoring system introduced by TrueFi shows, the potential of DeFi is vast. The maturation of DeFi lending, and of financial systems incorporating DeFi, could lead to widespread adoption by fintech firms and governments, ultimately producing an open and simple financial system.

Update After Celsius Suspended Fund Withdrawals (June 14, 2022)

Last month's collapse of Terra ended the fanaticism around algorithmic stablecoins, and Celsius now seems to cast doubt on the future of CeFi.

Celsius offers varied interest rates based on the LTV ratio of borrowers' collateral: the more collateral provided, the lower the rate.

Celsius managed its investors' deposits privately, distributing them into yield-farming (lending) protocols like Anchor. Rumors say Celsius lost substantial funds during the collapse of Luna, and as a CeFi company, it is impossible for investors to verify what is happening inside it.

Hearing bad rumors in a panicky bear market, investors ran on Celsius and forced it to suspend fund withdrawals.

The necessity of transparent fund management has drawn more attention as the Federal Reserve tightens policy and inflation soars. Vigilant bear-market investors now seek protocols that give them more control rather than higher yields.

The emergence of lending aggregators such as Fuji DAO improves capital efficiency while preserving DeFi's transparency, and will further erode CeFi's market share.

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

🐩 @SoxPt50

📅 15 June 2022

Reference

[1] https://medium.com/aave/aave-borrowing-rates-upgraded-f6c8b27973a7

[2] https://docs.aave.com/risk/asset-risk/risk-parameters
Investment DAOs: New players in the crypto investment market

The fascination of the crypto world lies in using blockchain technology to disrupt the traditional markets people are used to: DeFi for finance, NFTs for art, and DAOs for traditional corporations.

A DAO, or Decentralized Autonomous Organization, runs on uniform, automatically executed rules; it is transparent, not controlled by any centralized organization, and lets stakeholders express their interests. These features can solve various problems faced by investment funds in traditional markets, such as opaqueness and vulnerability to manipulation by decision-makers. Thus Investment DAOs were born: DAO-based investment groups began to emerge in the crypto world, truly representing the interests of the majority of their members.

This article provides an overview of the current mainstream Investment DAOs in three categories, classified by investment direction: NFT-focused, primary-market-focused, and secondary-market-focused.

NFT focused DAOs

With the explosion of NFTs, collecting and investing in them has become a new craze, and many wealthy crypto whales spend considerable money buying NFTs. It is difficult for individual investors to participate in the purchase of sky-high NFTs, but DAOs formed by community members pooling crowdfunded money make such purchases possible; indeed, many DAOs were established precisely through the process of bidding for NFTs together.

Flamingo DAO

Flamingo is an NFT-focused DAO designed to explore emerging blockchain-based asset investment opportunities. It lets members develop and deploy NFT-focused investment strategies, with member management at its core, relying on DApps and associated smart contracts to facilitate NFT purchases.

Membership in Flamingo is currently limited to qualified investors as defined by U.S. law, with a total membership cap of 100. Membership seats sell for 60 ETH, each carrying 1% of Flamingo's voting rights and the right to a 1% pro-rata share of investment returns.

Flamingo members have the power to decide to collect, buy, hold, or acquire NFTs. They can create pools of money to make purchase, collection, or other acquisition decisions. For example, they vote on the NFTs they want to purchase; if the vote passes, service providers or members can purchase them on Flamingo's behalf.

Flamingo uses smart contracts such as MolochDAO's v2 to govern the community, covering collection of members' initial contributions, voting, delegation of voting rights to third parties, funding of investments, and distribution of proceeds.

PleasrDAO

PleasrDAO was founded in March 2021 to buy, at auction, the animated NFT (x*y=k) created by crypto artist pplpleasr to celebrate the release of Uniswap V3; the auction proceeds went as a charitable donation to Asian American and Pacific Islander communities and other minority groups.

The more than 40 members brought together at the auction did not disband after the successful bid; they developed into a decentralized organization that purchases high-value NFTs through smart contracts. PleasrDAO distributes ownership of the DAO in the form of tokens, and each member participates in governance through a group chat. Beyond investing in NFTs, the team's vision includes acting as an angel investor and incubator for early-stage NFT projects.

Whale DAO

WhaleShark, a major player in the NFT market, issued the token WHALE in May 2020, backed by its NFT holdings. Revenue generated by the WHALE project is invested directly into accumulating more valuable NFT assets, driving capital appreciation and community engagement. The community was gradually decentralized after its inception, and Whale DAO is responsible for managing the assets in the treasury.
The NFT assets in the WHALE funding pool include Gods Unchained, Sandbox and so on. WHALE token holders can purchase NFTs created by the WHALE community, rent artwork from the funding pool, participate in WHALE DAO voting and decision making, take part in WHALE liquidity mining, and attend WHALE organized conferences. Primary market focused DAOs While it is difficult for individual investors to invest directly in the primary market, the emergence of Investment DAO brings such an opportunity to individual investors, where members of the community can come together to invest in the primary market in the form of a DAO. Some DAOs, which focus on the crypto primary market, have not only proven the viability of the solution in its development, but also have connected more projects to community members. The LAO Launched in late April 2020, The LAO co-founder Aaron Wright drafted the now-passed Wyoming DAO Act, which gives DAOs legal recognition in Wyoming. the LAO is centered on membership management and relies on DApps and related smart contracts to facilitate investment in projects that use digital assets. Membership in The LAO is currently limited to qualified investors under U.S. law, with a total membership cap of 99. Investors can join by purchasing “LAO Units” representing ownership of The LAO at a price of 310 ETH per unit, which gives members 0.9% of The LAO’s voting rights and the right to receive 0.9% of the investment income. Each member can only purchase a maximum of 9 LAO Units. The LAO’s initial service provider, OpenLaw, has partnered with MolochDAO to develop the smart contract of MolochDAO V2. OpenLaw is a project that uses smart contract to implement legal agreements, providing an easy way for everyone to reference and trigger Ethereum-based smart contracts to manage contractual commitments. MolochDAO V2 has a governance model and game-theoretic design, it is a voting-weighted and multi-signature smart contract with a mechanism called “rage quit”. 
Members can choose to rage quit at any time if they are not satisfied with LAO’s performance or management. MetaCartel The MetaCartel community has existed since September 2018, and on June 5, 2019 MetaCartel DAO’s smart contract was officially deployed to Ethereum mainnet with initial backers including Matic Network (now Polygon), NuCypher, Gnosis and The Graph, as well as more than a dozen other individual investors. MetaCartel DAO went live in July 2019 and now boasts more than 80 DAO participants, as well as more than 800 community members. MetaCartel Ventures (MCV) is a for-profit investment and legal entity of MetaCartel DAO, investing in early-stage decentralized applications. Membership in MetaCartel requires support and evaluation by internal MCV members with submitted proposals and votes on the chain. In MetaCartel DAO, there are three main membership categories, the first category is “Mages” which contains members who are not in the status of qualified investors, the second category is “Goblins” which contains qualified investor members, and the third category is “Summoners” which are operating representatives in the DAO who are responsible for the approval of Mages, legal oversight and guidance, finances and coordination related services. It is worth noting that the operating representatives are not necessarily members. Unlike many DAOs that believe “code is law”, MCV has not abandoned the legal framework. To better resolve disputes over membership withdrawal, MCV has adopted the Grimoire legal framework, a voluntary and legally binding agreement among DAO members. In addition, MCV uses the MolochDAO v2 smart contract developed by OpenLaw and MolochDAO for on-chain governance, which provides flexibility and freedom to process withdrawals so that MCV members can immediately receive their individual assets in proportion to their share of the DAO. 
While MCV’s fundraising and asset management occurs on-chain, many of its decisions are coordinated through off-chain community communication channels, such as group chats, video conferences, and face-to-face meetings, allowing for some consensus among members prior to on-chain proposals. Secondary market focused DAOs In addition to some Investment DAOs that are directly involved in the early stage financing of projects, there are also DAOs that provide liquidity and earn revenue by exploring investment opportunities in various projects in the DeFi space, raising funds as DAOs and voting in accordance with community members. AladdinDAO AladdinDAO is a decentralized network, launched in June 2021, aiming to transform crypto investment from a form of venture capital to one that brings together the wisdom of the masses through collective value discovery. The core of AladdinDAO is the “Boule Committee”, the first Boule members are recommended by Genesis members and elected by the DAO. The founding members of AladdinDAO include Polychain, DCG, 1kx, Multicoin Capital, CMS, Nascent and Alameda etc. For other Boule members, 80% are nominated by the first headhunters and 20% are nominated directly by the community and elected by the AladdinDAO community through a decentralized governance process. AladdinDAO is constantly screening and recruiting talented individuals. The Boule members vote to identify, analyze and deliver high-quality DeFi projects to the community, and receive AladdinDAO tokens for their participation. This mechanism is designed to encourage responsible voting by DeFi experts, who will be rewarded for voting yes on high-performing projects and no on underperforming projects. Force DAO Force DAO is a quantitative hedge fund of DeFi. It is a DAO of DeFi investment strategies designed to generate incentives and rewards through liquidity mining on Ethereum and Polygon, following a strategy proposed by the community. 
Holders of Force DAO’s governance token are able to initiate proposals and votes. Force DAO uses a template from Aragon (a DAO solution provider) as the basis for token holders to manage the organization, but due to the high gas cost of maintaining on-chain governance on Ethereum, voting is temporarily available on the Snapshot page. The token holders can make a request in the forum, which becomes a proposal after reaching the required 40 votes. Summary As we can see from the above, Investment DAOs have been growing faster and faster in the past few years, and many communities have emerged to invest through DAO. Moreover, DAOs which focus on NFT investment have witnessed explosive growth. Below is a conclusive table of the DAOs introduced above in terms of exit mechanism, investment direction, governance mechanism etc. The model adopted by most Investment DAOs is still based on having a pool of funds and then members voting on how the funds are invested, but DAOs vary in terms of membership barriers and governance mechanisms. In addition, not all DAOs have governance tokens. Some DAOs with higher barriers to entry can ensure community member participation without tokens, while in some organizations with lower barriers, it is easier for the community to proliferate and grow with token holders as members. Solo travel is fast, but the crowd travels far. In the future, it will be very exciting to see Investment DAOs provide ordinary investors with more and more opportunities, bring a paradigm shift to crypto venture capitals, and even exert overturning impact on traditional investment industry. Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision. 
🐩 @chestersigned 📅 10 June 2022 Links: https://www.flamingodao.xyz/ https://pleasr.org/ https://www.whale.me/ https://www.thelao.io/ https://metacartel.xyz/ https://www.aladdin.club/zh https://www.forcedao.com/

Investment DAOs: New players in the crypto investment market

The fascination of the crypto world lies in how it uses blockchain technology to disrupt the traditional markets people are used to: DeFi for finance, NFTs for art, and DAOs for the traditional corporation.

A DAO, or Decentralized Autonomous Organization, operates on uniform rules that execute automatically, runs transparently without control by any centralized body, and gives every stakeholder a way to express their interests.

These properties address problems that investment funds face in traditional markets, such as opacity and vulnerability to manipulation by decision makers. Investment DAOs were born out of this: DAO-based investment groups have emerged in the crypto world to genuinely represent the interests of the majority of their members.

This article provides an overview of the current mainstream Investment DAOs in three categories classified by investment direction: NFT focused, primary market focused, and secondary market focused.

NFT focused DAOs

With the explosion of NFTs, collecting and investing in them has become a new craze, and many wealthy crypto whales spend considerable money buying them. For individual investors, however, it is difficult to participate in the collection of some sky-high NFTs. DAOs formed by community members pooling crowdfunded money make such purchases possible, and many DAOs were established precisely through the process of bidding for NFTs together.

Flamingo DAO

Flamingo is an NFT-focused DAO designed to explore emerging blockchain-based asset investment opportunities. It gives members the ability to develop and deploy NFT-focused investment strategies, with member management at its core, relying on DApps and associated smart contracts to facilitate the purchase of NFTs.

Membership in Flamingo is currently limited to qualified investors as defined by U.S. law, with a total membership cap of 100. Membership seats sell for 60 ETH, and each seat carries 1% of Flamingo’s voting rights and the right to a 1% pro-rata share of investment returns.

Flamingo members decide which NFTs to collect, buy, hold or otherwise acquire. They can create pools of money for purchase, collection or other acquisition decisions. For example, members vote on the NFTs they want to purchase; if the vote passes, service providers or members purchase them on behalf of Flamingo.

Flamingo uses smart contracts such as MolochDAO’s v2 to govern the community. These contracts handle collecting members’ initial contributions, voting, delegating voting rights to third parties, funding investments, and distributing proceeds.
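The Moloch-style flow Flamingo relies on (contribute for ownership shares, share-weighted votes, treasury disbursement on a passing proposal) can be sketched in a few lines of Python. This is an illustrative model of the pattern, not the actual contract; the class, member names, and figures are all hypothetical.

```python
# Hypothetical sketch of a Moloch-v2-style proposal flow: members receive
# shares for their initial contributions, votes are share-weighted, and a
# passing proposal releases funds from the shared treasury.

class MolochStyleDAO:
    def __init__(self):
        self.shares = {}        # member -> share count
        self.treasury_eth = 0.0

    def join(self, member, tribute_eth, shares_granted):
        """Record a member's contribution in exchange for shares."""
        self.treasury_eth += tribute_eth
        self.shares[member] = self.shares.get(member, 0) + shares_granted

    def vote_on_funding(self, votes, amount_eth):
        """Share-weighted yes/no vote on releasing `amount_eth` from the
        treasury. `votes` maps member -> True (yes) / False (no)."""
        yes = sum(self.shares[m] for m, v in votes.items() if v)
        no = sum(self.shares[m] for m, v in votes.items() if not v)
        if yes > no and amount_eth <= self.treasury_eth:
            self.treasury_eth -= amount_eth   # funds released for the purchase
            return True
        return False

dao = MolochStyleDAO()
dao.join("alice", 60.0, 100)
dao.join("bob", 60.0, 100)
dao.join("carol", 60.0, 100)
passed = dao.vote_on_funding({"alice": True, "bob": True, "carol": False}, 50.0)
print(passed, dao.treasury_eth)  # True 130.0
```

The same skeleton also covers the other uses listed above: delegation would reassign a member’s share weight, and distribution would pay the treasury out pro rata.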

PleasrDAO

PleasrDAO was founded in March 2021 to buy, at auction, the animated NFT (x*y=k) created by crypto artist pplpleasr to celebrate the release of Uniswap V3; the auction proceeds went to the Asian American and Pacific Islander communities and other minority groups as a charitable donation.

The more than 40 members brought together by the auction did not disband after the winning bid; they developed into a decentralized organization that purchases high-value NFTs through smart contracts. PleasrDAO distributes ownership of the DAO in the form of tokens, and each member participates in governance through a group chat. Beyond investing in NFTs, the team’s vision includes acting as an angel investor and an incubator for early-stage NFT projects.

Whale DAO

WhaleShark, a major player in the NFT market, issued the WHALE token in May 2020, backed by its NFT holdings. Revenue generated by the WHALE project is reinvested to accumulate more valuable NFT assets, driving capital appreciation and community engagement. The community was gradually decentralized after its inception, and Whale DAO now manages the assets in the treasury. The NFT assets in the WHALE funding pool include Gods Unchained, The Sandbox and others.

WHALE token holders can purchase NFTs created by the WHALE community, rent artwork from the funding pool, participate in WHALE DAO voting and decision making, take part in WHALE liquidity mining, and attend WHALE-organized conferences.

Primary market focused DAOs

While it is difficult for individual investors to invest directly in the primary market, Investment DAOs bring them exactly this opportunity: community members can come together and invest in the primary market as a DAO. DAOs focused on the crypto primary market have not only proven the viability of this model but have also connected more projects with community members.

The LAO

The LAO launched in late April 2020. Its co-founder Aaron Wright drafted the now-passed Wyoming DAO Act, which gives DAOs legal recognition in Wyoming. The LAO is centered on membership management and relies on DApps and related smart contracts to facilitate investment in projects that use digital assets.

Membership in The LAO is currently limited to qualified investors under U.S. law, with a total membership cap of 99. Investors join by purchasing “LAO Units” representing ownership of The LAO at 310 ETH per unit; each unit carries 0.9% of The LAO’s voting rights and the right to 0.9% of the investment income. Each member may purchase at most 9 LAO Units.
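The membership terms above imply the maximum possible stake for any single member, a quick back-of-the-envelope calculation:

```python
# Worked numbers from The LAO's stated membership terms.
unit_price_eth = 310     # price of one LAO Unit
voting_per_unit = 0.9    # percent of voting rights (and income) per unit
max_units = 9            # per-member cap on units

max_stake_cost = max_units * unit_price_eth     # 2790 ETH
max_voting_share = max_units * voting_per_unit  # 8.1 percent
print(max_stake_cost, max_voting_share)
```

So even a member who buys the full allowance of 9 units pays 2,790 ETH and still controls only 8.1% of the vote, which keeps any single investor well short of decisive control.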

The LAO’s initial service provider, OpenLaw, partnered with MolochDAO to develop the MolochDAO V2 smart contract. OpenLaw is a project that uses smart contracts to implement legal agreements, providing an easy way for anyone to reference and trigger Ethereum-based smart contracts to manage contractual commitments. MolochDAO V2 has a game-theoretic governance design: it is a voting-weighted, multi-signature smart contract with a mechanism called “rage quit”. Members who are unsatisfied with The LAO’s performance or management can choose to rage quit at any time.
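The “rage quit” exit can be illustrated with a small Python sketch: a dissatisfied member burns their shares and immediately withdraws a pro-rata slice of the treasury. The function and figures below are assumptions for illustration, not the actual MolochDAO V2 code.

```python
# Illustrative rage-quit mechanics: burning shares entitles the leaver to
# their proportional share of the treasury, no permission required.

def rage_quit(shares, treasury_eth, member):
    """Burn `member`'s shares and pay out their pro-rata treasury share."""
    total = sum(shares.values())
    payout = treasury_eth * shares[member] / total
    del shares[member]                      # shares are burned on exit
    return payout, treasury_eth - payout    # (member payout, remaining treasury)

shares = {"alice": 100, "bob": 100, "carol": 200}
payout, remaining = rage_quit(shares, 1000.0, "carol")
print(payout, remaining)  # 500.0 500.0
```

The game-theoretic point is that because exit is always available at fair value, the majority cannot expropriate a dissenting minority, which disciplines how proposals are made.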

MetaCartel

The MetaCartel community has existed since September 2018. On June 5, 2019, MetaCartel DAO’s smart contract was officially deployed to Ethereum mainnet, with initial backers including Matic Network (now Polygon), NuCypher, Gnosis and The Graph, as well as more than a dozen individual investors. MetaCartel DAO went live in July 2019 and now counts more than 80 DAO participants and more than 800 community members.

MetaCartel Ventures (MCV) is the for-profit investment and legal entity of MetaCartel DAO, investing in early-stage decentralized applications. Membership requires support and evaluation by existing MCV members via submitted proposals and on-chain votes.

MetaCartel DAO has three main membership categories: “Mages”, members who are not qualified investors; “Goblins”, members who are qualified investors; and “Summoners”, operating representatives of the DAO responsible for approving Mages, legal oversight and guidance, finances, and coordination services. Notably, the operating representatives are not necessarily members themselves.

Unlike many DAOs that hold that “code is law”, MCV has not abandoned the legal framework. To better resolve disputes over membership withdrawal, MCV adopted the Grimoire legal framework, a voluntary and legally binding agreement among DAO members. In addition, MCV uses the MolochDAO v2 smart contract developed by OpenLaw and MolochDAO for on-chain governance, which processes withdrawals flexibly so that MCV members can immediately receive their individual assets in proportion to their share of the DAO. While MCV’s fundraising and asset management occur on-chain, many decisions are coordinated through off-chain channels such as group chats, video conferences, and face-to-face meetings, allowing members to reach rough consensus before on-chain proposals.

Secondary market focused DAOs

In addition to Investment DAOs directly involved in the early-stage financing of projects, there are DAOs that provide liquidity and earn revenue by exploring investment opportunities across the DeFi space, raising funds as a DAO and deploying them according to community members’ votes.

AladdinDAO

AladdinDAO is a decentralized network, launched in June 2021, that aims to transform crypto investment from a form of venture capital into one that harnesses the wisdom of the crowd through collective value discovery.

The core of AladdinDAO is the “Boule Committee”. The first Boule members were recommended by genesis members and elected by the DAO. AladdinDAO’s founding members include Polychain, DCG, 1kx, Multicoin Capital, CMS, Nascent and Alameda. Of the remaining Boule members, 80% are nominated by the first headhunters and 20% directly by the community, all elected by the AladdinDAO community through a decentralized governance process.

AladdinDAO continuously screens and recruits talented individuals. Boule members vote to identify, analyze and deliver high-quality DeFi projects to the community, receiving AladdinDAO tokens for their participation. The mechanism is designed to encourage responsible voting by DeFi experts, who are rewarded for voting yes on projects that go on to perform well and no on those that underperform.
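The incentive described above, rewarding experts whose votes match later project performance, can be sketched as a simple scoring rule. The exact reward formula is an assumption for clarity; only the yes-on-winners / no-on-losers principle comes from the description above.

```python
# Illustrative scoring of Boule-style votes against realized outcomes:
# an expert earns tokens for each vote that matched how the project
# actually performed. The reward amount is a made-up parameter.

def boule_rewards(votes, performed_well, reward_per_correct=10):
    """votes: expert -> {project: True (yes) / False (no)};
    performed_well: project -> bool (did it perform well?)."""
    rewards = {}
    for expert, ballots in votes.items():
        correct = sum(1 for p, yes in ballots.items()
                      if yes == performed_well[p])  # vote matched outcome
        rewards[expert] = correct * reward_per_correct
    return rewards

votes = {
    "expert_a": {"proj1": True, "proj2": False},
    "expert_b": {"proj1": False, "proj2": False},
}
outcomes = {"proj1": True, "proj2": False}
print(boule_rewards(votes, outcomes))  # {'expert_a': 20, 'expert_b': 10}
```

Because a no vote on a bad project pays the same as a yes vote on a good one, experts have no incentive to wave everything through.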

Force DAO

Force DAO is a quantitative hedge fund for DeFi: a DAO of DeFi investment strategies designed to generate incentives and rewards through liquidity mining on Ethereum and Polygon, following strategies proposed by the community.

Holders of Force DAO’s governance token can initiate proposals and votes. Force DAO uses a template from Aragon (a DAO solution provider) as the basis for token-holder governance, but because of the high gas cost of on-chain governance on Ethereum, voting is temporarily held on its Snapshot page. Token holders can make a request in the forum, which becomes a formal proposal after reaching the required 40 votes.
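The two-stage flow described above, a forum request that is promoted to a proposal at 40 supporting votes and then decided by a gas-free, token-weighted off-chain tally, can be sketched as follows. The function names and balances are illustrative, not Force DAO’s or Snapshot’s actual APIs.

```python
# Minimal sketch of a two-stage governance pipeline: a fixed forum-vote
# threshold gates proposal creation, then a Snapshot-style off-chain tally
# weights each ballot by the voter's token balance.

PROPOSAL_THRESHOLD = 40  # forum votes needed before a request becomes a proposal

def becomes_proposal(forum_votes):
    return forum_votes >= PROPOSAL_THRESHOLD

def snapshot_tally(ballots, balances):
    """Token-weighted yes/no tally at a balance snapshot, computed off-chain
    so voters pay no gas. ballots: voter -> True/False."""
    yes = sum(balances[v] for v, choice in ballots.items() if choice)
    no = sum(balances[v] for v, choice in ballots.items() if not choice)
    return yes > no

print(becomes_proposal(39), becomes_proposal(40))  # False True
balances = {"a": 1000, "b": 300, "c": 300}
print(snapshot_tally({"a": True, "b": False, "c": False}, balances))  # True
```

Note that token weighting means a single large holder (here "a") can outvote several smaller ones, which is the usual trade-off of balance-weighted governance.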

Summary

As the above shows, Investment DAOs have grown rapidly in the past few years, and many communities have emerged to invest through DAOs. DAOs focused on NFT investment, in particular, have seen explosive growth. Below is a summary table of the DAOs introduced above in terms of exit mechanism, investment direction, governance mechanism and more.

Most Investment DAOs still follow the same basic model: a pool of funds whose deployment members vote on. They vary, however, in membership barriers and governance mechanisms. Moreover, not all DAOs have governance tokens: some with higher barriers to entry can ensure member participation without tokens, while organizations with lower barriers find it easier to proliferate and grow with token holders as members.

Solo travel is fast, but the crowd travels far. In the future, it will be exciting to watch Investment DAOs offer ordinary investors ever more opportunities, bring a paradigm shift to crypto venture capital, and perhaps even disrupt the traditional investment industry.

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

🐩 @chestersigned

📅 10 June 2022

Links:

https://www.flamingodao.xyz/

https://pleasr.org/

https://www.whale.me/

https://www.thelao.io/

https://metacartel.xyz/

https://www.aladdin.club/zh

https://www.forcedao.com/

A brief overview of DAO

Definition

DAO, short for Decentralized Autonomous Organization, is an organization represented by rules encoded as a computer program. It is controlled and supervised by its originators, shareholders, or token holders, and is not influenced by any central authority. A DAO keeps itself running through smart contracts, encoding its transactions and rules on the blockchain to achieve openness, fairness, uninterrupted operation, and autonomy.

In contrast to traditional organizations, the actions and financial flows that occur in a DAO are visible to all, greatly reducing the risk of corruption and censorship. Whereas most shareholders of a company have no access to its current financial status, a DAO keeps its balance sheet on a public blockchain, where every transaction is recorded and transparent.

DAOs are internet-based organizations, collectively owned and organized by their members. DAOs give us a way of collaborating with other people all over the world without ever meeting them. Members can establish their own rules and vote on key decisions using blockchain, and without bureaucracy or hierarchy. As DAOs are built onto public blockchains, they are open source. Anyone can inspect their treasuries as all financial transactions are stored on the blockchain.

From a high level, DAOs can be thought of as social coordination mechanisms to solve a common goal. Paired with an on-chain treasury management system responsible for dictating how decisions, funding and upgrades are made, DAOs provide a technical framework to pool funding in a secure fashion.

Development

In 2013, Daniel Larimer first proposed the concept of the DAC (Decentralized Autonomous Corporation), which is considered an early prototype of the DAO. The following year, Vitalik Buterin explained the DAO in detail, giving the term a clear definition for the first time.

Vitalik described DAO as follows:

“Instead of a hierarchical structure managed by a set of humans interacting in person and controlling property via the legal system, a decentralized organization involves a set of humans interacting with each other according to a protocol specified in code, and enforced on the blockchain.”

After that, the idea of the DAO spread quickly. By modern standards, the first DAO is Dash DAO, which dates back to August 2015 and allowed stakeholders to vote through its governance mechanism. This marked the DAO's official march from a general concept to a concrete implementation. The year 2015 also saw the official launch of the Ethereum mainnet and the emergence of the smart contracts that underpin DAOs.

After years of development, as of now (May 24, 2022), there are 1,078 DAOs managing more than $10.4 billion in assets, with nearly 2 million members. These range from DAOs that help manage some of the largest crypto protocols to smaller DAOs organized around causes such as investments, social communities, media, and philanthropy. Of these, 91 DAOs have a market cap of more than $1 million, representing 8.4 percent of all DAOs, and 16 DAOs have a market cap of over $100 million, representing 1.5 percent of the total. Some of the top-ranked DAOs include Uniswap DAO, Gnosis DAO, BitDAO, UXDProtocol DAO, and OlympusDAO, most of which are built on Ethereum.

Advantages

The advent of Web3 is driving a transformational change in human activity and value, and collaboration among distributed communities and stakeholders is becoming increasingly important. DAOs bridge the gap between participants around the globe who share the same mission or goal; the real meaning behind the DAO is that capital and talent from all over the world can collaborate at scale in an instant.

Fast and borderless convenience

DAOs are more efficient and convenient than traditional organizations in facilitating globalization. Instead of operating in one or a few jurisdictions, DAOs seek to expand globally, bringing together thousands of members, regardless of their geographic location, background, or beliefs. DAOs offer a solution that allows everyone to work on equal terms by adhering to the same set of standard rules, and provide a system of parity for the formation and operation of organizations.

Democratic and highly participatory voting

Traditional organizations tend to make important decisions in meetings of a few people, and the chosen outcomes do not necessarily represent the majority opinion of the organization. DAOs change this by allowing anyone in the organization to vote on the issues they care about. Moreover, rather than ignoring or disregarding the opinions of members, DAOs ensure that all votes are counted and displayed to everyone.

Autonomy and non-tamperable rules

A DAO is often seen as a decentralized enterprise or company. While it is certainly more commercially and practically useful to understand DAOs at the company level, we need to be clear that a DAO is not a specific organizational entity. Unlike the traditional "top-down" management model, DAOs are flat and autonomous in nature. The policies and rules of a traditional organization determine how it operates, but they are often changed by carving out exceptions. In a DAO, the rules are coded so that they apply to everyone, and the rules established within a DAO cannot be tampered with unless the voter base agrees to change them.
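As an illustrative sketch only (not any real DAO's contract, and the quorum parameter is a made-up assumption), the idea that encoded rules apply uniformly and change only when a vote passes might look like:

```python
# Toy model of rule changes gated by a vote; quorum and rules are illustrative.
class MiniDAO:
    def __init__(self, rules: dict, quorum: float = 0.5):
        self.rules = rules      # encoded rules, applied to everyone equally
        self.quorum = quorum    # fraction of support required to change them

    def vote_on_change(self, key, new_value, votes: list) -> bool:
        # Every vote is counted; the tally is visible to all members.
        support = sum(votes) / len(votes)
        if support > self.quorum:
            self.rules[key] = new_value
            return True
        return False

dao = MiniDAO({"membership_fee": 10})
passed = dao.vote_on_change("membership_fee", 5, [True, True, False])
print(passed, dao.rules)  # True {'membership_fee': 5}
```

With two of three members in favor, the rule changes; a failed vote leaves the encoded rules untouched.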

Classification

In recent years, DAOs with different functions have gradually emerged: infrastructure, incubator, protocol, investment, service, collector, social, and media DAOs, among others. They have entered an exploratory period, probing the utility and value of DAOs across different purposes, application scenarios, and ecosystems.

There are many different types of DAOs, all of which serve as governance mechanisms to ensure that important proposals are discussed, put forward, and implemented in accordance with the community's best interests. Below, a few key types of DAOs are listed, along with some specific and notable examples.

Protocol DAOs

Protocol DAOs transition power from a core team into the hands of the community, spawning a new way for projects to issue fungible tokens into the market. These DAOs are used as an ownership and governance mechanism for lending platforms, yield optimizers and more, to maintain and evolve the underlying platform in a fair and decentralized manner.

Some of the most prominent protocol DAOs are those behind DeFi platforms such as:

Uniswap DAO — As the largest decentralized exchange on Ethereum, Uniswap launched its governance system and token in September 2020. UNI holders vote, or delegate their votes, to control the protocol's direction, fees, treasury, and so on.

Maker DAO — One of the original DAOs of the DeFi ecosystem. As the governance mechanism for the well-known DAI stablecoin, Maker uses a DAO framework to vote on things like adjusting the DSR interest rate, the stability fee, or even triggering an emergency shutdown.

Investment DAOs

With Protocol DAOs bringing new tokens into the world, it only seemed logical that groups band together to invest in them. Investment DAOs are similar to traditional investment funds that operate with pooled capital. However, rather than a single centralized party calling the shots, investment DAO token holders can vote on what the pool of funds is invested in.

Examples of investment DAOs include:

MetaCartel Ventures — A “sub-DAO” under the broader MetaCartel DAO. It is one of the first for-profit DAOs and invests in nascent DApps. It is also one of the first experiments where non-accredited individuals can have a real say in early-stage investments, which were previously restricted to accredited investors, venture capitalists and angel investors.

The LAO — Structured as a member-directed venture capital fund in the US, the LAO is an interesting take on DAOs. It is registered as a Delaware limited-liability company and compliant with US laws, but carries out its functions via a DApp and smart contracts.

Social DAOs

While social media turned everybody into a media company, social DAOs turn every group chat into a digital business. Members gain entry to a social DAO by paying a membership fee, usually by purchasing a certain number of the DAO's tokens. A social DAO can take the form of any social circle that shares a common interest, such as:

Friends with Benefits DAO — An exclusive social club, which one can only join by owning a certain number of the FWB token. Consisting primarily of web3 artists, operators and enthusiasts, Friends With Benefits DAO members gain exclusive access to token-gated events, mastermind discussions and more.

Bored Ape Yacht Club DAO — To gain entry into the club, you must own a BAYC NFT, each of which is linked to a unique "bored ape" profile picture. Although the actual DAO and governance portion of the project is still in the pipeline, several celebrities have recently purchased BAYC NFTs, bringing the club into the public eye.

Others

Space does not permit introducing every type of DAO one by one. Other examples include Grants DAOs, which fund and foster new ventures or projects; Collector DAOs, which seek to purchase and curate valuable collections, including NFTs; Media DAOs, which aim to reshape how content producers and consumers interact with media; and Service DAOs, which explore the future of work and reshape the way people work.

Limitations

Despite its booming growth, the DAO still faces huge challenges and has a long way to go. Its major shortcomings are not negligible and need to be addressed urgently.

Lack of legal and regulatory clarity

Given that DAOs do not exist in any single jurisdiction and do not operate like corporations, they cannot fully fit into existing regulatory frameworks. DAOs must address a variety of thorny regulatory and legal issues.

Lack of effective coordination mechanism

The reason centralized organizations don't involve every member in every decision is simple: it is very inefficient. Many of today's DAOs operate under somewhat crude governance structures, which can lead to chaotic decision-making processes and unsatisfactory results.

Lack of developed infrastructure

Traditional organizations benefit from highly developed operational infrastructure, which gives them clear structures and efficient decision-making processes. DAOs, in their current phase, must build from scratch an infrastructure comparable to that of traditional organizations for governance, compensation, treasury management, communication, and more.

Immaturity and security risks

The immutability of smart contracts is both an advantage and a disadvantage for DAOs, given the immaturity of the technology itself and the existence of security risks. Once the contract rules are encoded and the system is up and running, they are difficult to change. In addition, any known security issue can be exploited, because the code is visible to everyone yet difficult to patch.

Conclusion

The emergence of DAOs in the Web3 world has innovatively solved many of the problems of corporate governance, demonstrating clear advances over traditional centralized organizations. These advantages were previously unimaginable: the elimination of hierarchy and power consciousness, community-driven maintenance, shared decision-making, and full benefit sharing all mark a major breakthrough.

With the emergence and evolution of different types of DAOs in recent years, we firmly believe that DAO applications will develop from the bottom up, continuing to adapt and improve until they are strong enough to make a huge difference to society as a whole.

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

🐩 @chestersigned

📅 30 May 2022

Links:

https://mirror.xyz/bitwhalecapital.eth/JziaIjDP9ca3tTPLrE-RmtpxFWT1kL9EesaRU2X_uhQ

https://defirate.com/daos/

https://medium.com/iosg-ventures/overview-of-decentralized-autonomous-organization-dao-f9ac47051d07

https://coopahtroopa.mirror.xyz/_EDyn4cs9tDoOxNGZLfKL7JjLo5rGkkEfRa_a-6VEWw

Luna, Back to Earth

Timeline

On May 8th 2022, Mudit Gupta, the CISO of Polygon Network, tweeted about a series of token movements that had caused $UST to slip slightly off its peg, down to $0.985, on Saturday.

He also noted the suspicious timing of a transaction that bridged $84M worth of $UST to Ethereum and dumped it right after LFG's $UST removal, which may have triggered the market sell-off.

While $UST briefly recovered to $1 on Sunday, it dropped all the way to $0.88 on May 9th 2022.

Such a huge decline forced the Luna Foundation Guard (LFG) Council to issue $1.5 billion in loans ($750 million $BTC & $750 million $UST) to save its stablecoin.

The depegging of $UST over the weekend greatly impacted the price of $Luna, which tanked to $28.03 on Monday.

The decline of $Luna sparked a domino effect, triggering a large wave of liquidations on the Anchor protocol, which pushed $UST further off its dollar peg, down to $0.61 on Tuesday morning.

As things got worse, LFG emptied its bag and unloaded 28K $BTC to buy up $UST.

More bad news followed on Tuesday: the Federal Reserve Board voiced its worries about stablecoins in its latest Financial Stability Report [1], and Binance announced that it had suspended Terra Network withdrawals.

The price of $UST dropped to $0.3 early on Wednesday, and $Luna was in free fall at $0.47. However, LFG and Do Kwon have not accepted defeat: Kwon endorsed community proposal 1164 [2] in an attempt to save the community.

Some Thoughts

The crash of $UST exposes the instability and vulnerability of algorithmic stablecoins, and the incompleteness of Terra's mechanisms.

The liquidity issue came first in the fall of $UST. On May 7th, after LFG withdrew liquidity from Curve in preparation for the 4pool migration, $UST's exit liquidity became vulnerable: it could be cleared with just $350 million of $UST, and that is exactly what happened next. Someone cleared the liquidity from Curve, dumped $UST on CEXs, and caused the depeg.

This would not have happened if the 4pool had been live, since clearing its liquidity would have required $3 billion, which is unprofitable.

The automated stabilizing mechanism of $UST is pretty simple. Here is an example: if $Luna is worth $50, you can always redeem one $Luna to receive 50 $UST, and vice versa.

So, when the price of $UST drops below $1, arbitrageurs are willing to buy it and redeem it for $1 worth of $Luna.

Basically, the more $UST there is in the market, the less $Luna there will be. As the adoption of $UST increases, holders of $Luna will want to diamond-hand it, which explains why the market cap of $Luna should stay higher than that of $UST. In that case, every $UST in the market is backed by $Luna. On the other hand, if the market cap of $Luna drops below that of $UST, trouble occurs: there will be FUD that $UST is undercollateralized.

Let's consider an edge case. Suppose the price of $Luna was quite high at some point, and someone burned their $Luna for a large amount of $UST. Then the price of $Luna suddenly crashed, and this person could buy back tons of $Luna for pennies (in $UST). Now a great amount of fresh $Luna joins the circulating supply, ready to be dumped. That is basically how the price of $Luna crashed over these past days.
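The mint-and-burn arbitrage, and the way a falling $Luna price inflates its supply, can be sketched numerically. This is a toy model; all prices and amounts below are illustrative, not actual on-chain figures.

```python
# Toy model of Terra's mint-and-burn arbitrage (all prices/amounts illustrative).

def redeem_ust_for_luna(ust_amount: float, luna_price: float) -> float:
    """Burn UST and mint $1 worth of Luna per UST burned."""
    return ust_amount / luna_price

def arbitrage(ust_amount: float, ust_price: float, luna_price: float):
    cost = ust_amount * ust_price          # buy discounted UST on the market
    minted = redeem_ust_for_luna(ust_amount, luna_price)
    proceeds = minted * luna_price         # sell the freshly minted Luna
    return proceeds - cost, minted

# UST at $0.98, Luna at $80: burning 1M UST nets ~$20k and mints 12,500 Luna.
profit, minted = arbitrage(1_000_000, 0.98, 80.0)

# The same trade with Luna crashed to $0.80 mints 100x more Luna (~1.25M
# tokens), flooding the circulating supply: the death-spiral dynamic above.
_, minted_crash = arbitrage(1_000_000, 0.98, 0.80)
print(profit, minted, minted_crash)
```

The arbitrage keeps the peg healthy when $Luna is strong, but the same redemption path mass-produces $Luna once its price collapses.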

To further explain the reasons behind the crash of $UST, I have to introduce two other cornerstones of the Terra ecosystem: the Anchor protocol and the Luna Foundation Guard.

Anchor protocol is the lending protocol on Terra, offering an APY of around 20% to users who lock up their $UST. The Luna Foundation Guard is an organization created by Do Kwon, the founder of Terraform Labs, that aims to support and protect the Terra ecosystem, and we have seen several efforts by LFG this week to prop up the price of $UST.

Previously, LFG was replenishing Anchor's funds by burning through its own reserves to maintain the high APY. The plan was to expand the adoption of $UST, make the system self-sustaining, and boost the price of $Luna to the moon before the reserves ran out.

In April 2022, LFG bought over $1.5B worth of $BTC as extra insurance for $UST: if the price of $UST falls below $1, sell the $BTC to buy $UST, thus restoring the peg. That is exactly what LFG did on Monday.

I believe LFG's reaction was a blunder, especially amid a downward trend in the whole crypto market. It set $UST up as a vulnerable target for market makers to attack: predators could easily profit by simply shorting the $UST/$BTC pair. Massive bearish sentiment caused the purchasing power of the $BTC reserve to fall, so less $UST could be bought back. Panicked retail investors then started underselling their $UST, which made Terra's condition even worse.

The automated stabilizing mechanism also showed its weakness at this point. Normally, people can swap 1 $UST for $1 worth of $Luna, within a daily swap cap. Once the cap is exceeded, a worse exchange rate is offered to dissuade investors from panic selling. But in a bank run, this means the excess $UST is burned much more slowly, which hinders $UST from regaining its peg.
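The cap-and-spread idea can be sketched as follows. This is a hedged sketch of the concept only; the cap size and spread below are made-up illustrative parameters, not Terra's actual ones.

```python
# Illustrative sketch of capped redemptions with a widening spread
# (daily_cap and spread values are assumptions, not Terra's parameters).
def dollars_of_luna_received(burned_so_far: float, ust_amount: float,
                             daily_cap: float = 100e6,
                             spread: float = 0.3) -> float:
    within = max(0.0, min(ust_amount, daily_cap - burned_so_far))
    beyond = ust_amount - within
    # $1 of Luna per UST inside the cap; a haircut beyond it.
    return within * 1.0 + beyond * (1.0 - spread)

print(dollars_of_luna_received(0, 50e6))     # all 50M UST redeemed at par
print(dollars_of_luna_received(90e6, 50e6))  # 10M at par, 40M at a haircut
```

The worse rate beyond the cap deters ordinary panic selling, but during a genuine bank run it throttles the very burning that would shrink the excess $UST supply.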

Chain scalability also played a crucial role in this event. When the price of $Luna was dumped quickly, people who had borrowed $Luna from the Anchor Protocol could not reduce their exposure or increase their collateral due to network congestion, which caused more cascading liquidations.

As of the time of this research, the price of $Luna is $0.096 and $UST is $0.512, and I would say the system is already bankrupt at this point. I'm not sure whether proposal 1164 can somehow save $UST, but $Luna is certainly dead now. If $UST is to re-peg while keeping redemptions at $1 worth of $Luna per $UST, the exit liquidity for $Luna is the major issue: LFG still needs billions in exit liquidity to save it.

The failure of $Luna has also raised worries about other stablecoins, especially after the Federal Reserve's report. $USDT's market cap exceeds the combined market cap of the second- through fifth-largest stablecoins, so if regulation were to hit $USDT unexpectedly, it would have a catastrophic impact on crypto.

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

🐩 @SoxPt50

📅 12 May 2022

References & Source:

[1] https://www.federalreserve.gov/publications/files/financial-stability-report-20220509.pdf

[2] https://agora.terra.money/t/proposal-help-ust-pegging-increase-estimated-minting-capacity-to-1200m/6287

https://www.coindesk.com/markets/2022/05/10/usts-bitcoin-reserve-too-late-in-coming-to-save-dollar-peg/

https://www.bloomberg.com/news/newsletters/2022-04-19/crypto-news-do-kwon-of-luna-terra-fame-has-10-billion-bitcoin-plan

https://www.reddit.com/r/terraluna/comments/umcabh/postmortem_on_what_happened_with_ust_depegging/
From IPFS to Ceramic: The future of the Web3 data network

The flaw of IPFS

When it comes to decentralized data storage, the InterPlanetary File System (IPFS) is a project that cannot be bypassed. As one of the most notable decentralized storage projects, IPFS uses the Merkle DAG (Directed Acyclic Graph) data structure, a modification of the Merkle tree. With this data structure, IPFS implements content addressing and fragmented downloading of files.

Specifically, IPFS assigns a unique hash value to each file, similar to a file fingerprint. Each root file points to multiple node files, and once the content of a node file changes, its hash value changes accordingly, causing the hash of the root file to change as well. In this way, IPFS stores and finds files by content-based rather than address-based addressing. This means that if you are looking for a file, you don't need to know where it is, only what it contains. IPFS generates a unique hash for each file, and when users need to retrieve the file, they only need to ask IPFS who has this hash. Because hashes prevent duplicate storage, files with the same content are not stored twice by IPFS. This approach optimizes storage and improves network performance.

Content addressing is a major advantage of IPFS, but every coin has two sides, and it also brings a drawback. Once a file is stored in IPFS, it cannot be modified in the system, because modifying the file's content changes its hash, and users can no longer find the changed file by the original hash value. This is a widely criticized pain point: IPFS is not good at storing files that need to be updated and changed from time to time.
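The content-addressing behaviour just described can be sketched in a few lines. Note this is a simplification: a real IPFS CID involves multihash and codec prefixes, but plain SHA-256 captures the principle.

```python
import hashlib

def content_address(data: bytes) -> str:
    # Simplified stand-in for an IPFS CID: the address is derived purely
    # from the content's hash, not from its storage location.
    return hashlib.sha256(data).hexdigest()

v1 = content_address(b"Alice: $10, Bob: $10")
v2 = content_address(b"Alice: $5, Bob: $15")

# Identical content always maps to the same address; any edit yields a new
# address, while the old address keeps pointing at the old content.
print(v1 != v2)  # True
```

This is exactly why mutable data is awkward on IPFS: every update produces a new address, and nothing built into the addressing scheme links the new version to the old one.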
Although IPFS performs well for storing static files, it lacks the computation and state-management capacity for more advanced database-like features such as mutability, version control, access control, and programmable logic, which developers need in order to build fully-featured decentralized applications. There is therefore an urgent need for an efficient, decentralized solution for storing dynamic data. Ceramic solves this issue with a NoSQL-like database that lets developers store structured, mutable content.

Built for mutable content

Ceramic's storage design builds upon IPFS and extends it with a decentralized dynamic storage layer. On Ceramic, every piece of information is represented as an append-only log of commits, called a "Stream", shown as the combination of gray squares in the figure below. A Stream is similar in concept to a Git tree: the initial state (Genesis Commit) and each subsequent change (Commit) are all stored in IPLD (InterPlanetary Linked Data, IPFS's layer dedicated to data structures), and these records combine to form the Stream.

Since Streams record "changes" rather than "snapshots" of the resulting state, it is only necessary to process all the events on a Stream to get the latest state of the log. For example, Ceramic's record pattern works as follows: initially, Alice and Bob each have $10; on the second day, Alice transfers $5 to Bob; on the third day, Bob transfers $3 to Alice. This is very much like a blockchain ledger, where the ledger does not state each user's balance and all intermediate events must be replayed to compute the final balances.

By comparison, the traditional record pattern of IPFS would be: in file a, Alice and Bob each have $10; in file b, Alice has $5 and Bob has $15; and in file c, Alice has $8 and Bob has $12. Here, each record is a snapshot of the resulting state, and a new snapshot must be generated as soon as there is a change.
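The Alice-and-Bob example above can be sketched as a genesis state plus an append-only log of commits, replayed to derive the latest state. This is a conceptual sketch of the Stream model, not Ceramic's actual commit format.

```python
# A simplified "Stream": genesis state plus an append-only log of commits.
genesis = {"alice": 10, "bob": 10}
commits = [
    {"from": "alice", "to": "bob", "amount": 5},  # day 2
    {"from": "bob", "to": "alice", "amount": 3},  # day 3
]

def replay(genesis: dict, commits: list) -> dict:
    # Derive the latest state by applying every recorded change in order.
    state = dict(genesis)
    for c in commits:
        state[c["from"]] -= c["amount"]
        state[c["to"]] += c["amount"]
    return state

print(replay(genesis, commits))  # {'alice': 8, 'bob': 12}
```

Appending a new commit never invalidates earlier ones, which is what lets a content-addressed store underneath hold "mutable" data.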
Ceramic ensures through this design that each log has a unique Stream ID with a globally uniform name, and no name changes occur due to content changes. Each write requires user authorization, and the whole process is similar to blockchain bookkeeping, except that what is written is not transaction data but other mutable content, such as user account information.

Data composability

Ceramic achieves cross-application data composability primarily through a novel abstraction called data models. A data model typically represents a single, logical application feature such as a user profile, a social graph, or a blog. For instance, you can imagine that every decentralized Twitter implementation would run on a few shared data models: one for each user's tweets, one for their social graph, one for their DMs, etc. By adopting the same underlying data models, applications are able to natively interoperate on the same data.

In a way, Ceramic's use of data model standards is comparable to the use of token standards for asset ledgers. On Ethereum, for example, the introduction of the ERC20 fungible token and ERC721 non-fungible token standards gave rise to entire ecosystems of tokens and financial applications that natively interoperate. Ceramic brings this same concept to data.

Ceramic takes a community-driven approach to creating these data models, allowing any developer to easily define, share, and reuse their models with other developers in the ecosystem. As more data models are created by the community, the quantity and variety of applications built with composable data will continuously expand.

Composability done this way also improves the developer experience. Building an application on Ceramic looks like browsing a marketplace of data models, plugging them into your app, and automatically gaining access to all data on the network that is stored in these models.
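The shared data-model idea can be sketched as a schema that two applications agree on. The model and field names below are hypothetical, not an actual Ceramic model definition.

```python
# Illustrative sketch of a shared, community-defined data model
# (schema and field names are made up for illustration).
PROFILE_MODEL = {"name": str, "avatar_url": str}

def conforms(record: dict, model: dict) -> bool:
    # A record is usable by any app that adopted the model if it has
    # exactly the model's fields with the expected types.
    return set(record) == set(model) and all(
        isinstance(record[k], t) for k, t in model.items())

# A record written by application A...
record = {"name": "alice", "avatar_url": "https://example.com/alice.png"}

# ...can be consumed directly by application B, much as ERC20 wallets
# interoperate through a shared token standard.
print(conforms(record, PROFILE_MODEL))  # True
```

The point is that interoperability comes from agreeing on the model up front, so no per-application migration or adapter layer is needed.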
Using Ceramic, developers won't need to worry about bootstrapping their application with their own siloed users and data, and the rate of compounding innovation across developers will accelerate dramatically.

Scalability

Ceramic achieves scalability through a sharded execution environment. All streams on Ceramic maintain their state independently, and network nodes execute stream transactions in parallel. Unlike most blockchains, this approach allows Ceramic to operate at the scale required for decentralized versions of social applications like Twitter or Facebook.

Unlike traditional blockchain systems, where scalability is limited by a single global virtual execution environment and the state of a single ledger shared between all nodes, each Ceramic node acts as an individual execution environment for performing computations and validating transactions on streams; there is no global ledger. This "built-in" execution sharding enables the Ceramic network to scale horizontally, parallelizing the processing of an increasing number of simultaneous stream transactions as the number of nodes on the network grows. Such a design is needed to handle the scale of the world's data, which is orders of magnitude greater than the throughput needed on a financial blockchain.

Another benefit of this design is that a Ceramic node can perform stream transactions in an offline-first environment and later sync its updates with the rest of the network when it comes back online.

DID Solution

Ceramic also offers a flexible and robust identity solution called IDX, the first fully functional decentralized identity (DID) solution. IDX is a cross-chain identity protocol for open applications with decentralized identity and interoperable user data. It lets users build a unified digital identity consisting of all of their data, while enabling developers to break down silos and freely share user data between applications.
As shown in the figure below, it provides a decentralized index which allows structured data to be associated to a decentralized identifier (DID), and data is defined by definitions and stored in records. In addition, IDX can be used with any kind of datastore such as Ceramic, Textile, OrbitDB, IPFS, Sia, Arweave, blockchain registries, or even centralized databases and supports authentication from any kind of Web3 wallet. IDX is great for associating user profiles, portable social graphs, reputation scores, verifiable claims, user-generated content, application data, settings, domain names, blockchain addresses, and social Web2 accounts to a user in a decentralized way. Conclusion In summary, the emergence of Ceramic has greatly empowered the construction of Web3 and unlocked new features for Web3 developers. No matter which public blockchain (Ethereum, BSC, Polygon, Avalanche, etc.) developers are building on, they can simultaneously leverage Ceramic for data-centric functions to make their applications better. Furthermore, through Ceramic’s flexible DID-based account system, Ceramic naturally interfaces with account and key systems of current major blockchains, which provides users with great convenience. It is pleasing to see that there are already many DID and Web3 social platform projects developed on Ceramic. Among them to name a few noteworthy projects: CyberConnect, a social graph middleware platform; Orbis, a Web3 Twitter platform; and The Convo Space, an instant messaging platform, etc. We are looking forward to the new possibilities that Ceramic’s infrastructure of data network can bring to the Web3 application layer. Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision. 
🐩 @chestersigned 📅 8 May 2022 Links: https://developers.ceramic.network/learn/welcome/ https://blog.ceramic.network/what-is-ceramic/ https://multicoin.capital/2022/02/16/the-composable-web3-data-network/ https://blog.ipfs.io/2021-07-13-ceramic-mainnet-launch/

From IPFS to Ceramic: The future of Web3 data network

The flaw of IPFS

When it comes to decentralized data storage, the InterPlanetary File System, known as IPFS, is a project that cannot be ignored.

As one of the most notable decentralized storage projects, IPFS uses the data structure of the Merkle DAG (Directed Acyclic Graph), a generalization of the Merkle tree. With this data structure, IPFS implements content addressing and chunked downloading of files.

Specifically, IPFS assigns a unique hash value to each file, similar to a file fingerprint. A file's root node points to multiple child nodes, and once the content of a child node changes, its hash value changes accordingly, causing the hash of the root to change as well.

In this way, IPFS stores and finds files by content-based rather than location-based addressing. This means that if you are looking for a file, you don't need to know where it is, only what it contains. IPFS generates a unique hash for each file, and when a user wants to retrieve the file, they only need to ask the IPFS network who has that hash. Because identical content always produces the identical hash, files with the same content are never stored twice by IPFS. This approach optimizes storage and improves network performance.
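
The mechanics above can be sketched in a few lines. This toy store uses a plain SHA-256 hex digest in place of IPFS's real multihash-based CIDs, which is a deliberate simplification:

```python
import hashlib

def content_id(data: bytes) -> str:
    # Real IPFS CIDs are multihash-encoded; a plain SHA-256 digest stands in here.
    return hashlib.sha256(data).hexdigest()

store = {}

def put(data: bytes) -> str:
    cid = content_id(data)
    store[cid] = data        # identical content maps to the same key: stored once
    return cid

def get(cid: str) -> bytes:
    # Retrieval needs only the content hash, not a location
    return store[cid]

cid_a = put(b"hello web3")
cid_b = put(b"hello web3")   # duplicate content, same CID, no extra copy
assert cid_a == cid_b and len(store) == 1

cid_c = put(b"hello web3!")  # any change to the content yields a different CID
assert cid_c != cid_a
```

The last assertion is exactly the drawback discussed next: edit the content and the old hash no longer points at it.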

The mechanism of content addressing is a major advantage of IPFS, but every coin has two sides: it also brings a drawback. In IPFS, once a file is stored it cannot be modified in place, because modifying the content changes the file's hash, and the original hash can no longer locate the changed file. This is a widely criticized pain point: IPFS is not good at storing files that need to be updated from time to time.

Although IPFS performs well for storing static files, it lacks the computation and state management capacity for more advanced database-like features such as mutability, version control, access control, and programmable logic, which are required to enable developers to build fully-featured decentralized applications. Therefore, there is an urgent need for an efficient and decentralized solution to store dynamic data — Ceramic solves this issue with a NoSQL-like database for developers to store structured and mutable content.

Built for mutable content

Ceramic’s storage design is based upon IPFS and extends it with a decentralized dynamic storage layer.

On Ceramic, every piece of information is represented as an append-only log of commits, called a "Stream", shown as the chain of gray squares in the figure below. A Stream is similar in concept to a Git history: the initial state (the genesis commit) and each subsequent change (a commit) are all stored in IPLD (InterPlanetary Linked Data, IPFS's layer dedicated to data structures), and together these records form the Stream. Since Streams record "changes" rather than "snapshots" of the resulting state, replaying all the events on a Stream is enough to derive the log's latest state.

For example, Ceramic's record pattern is as follows: initially, Alice and Bob each have $10; on the second day, Alice transfers $5 to Bob; on the third day, Bob transfers $3 to Alice. This is very much like a blockchain ledger, which does not directly state each user's balance; all intermediate transactions must be replayed to compute the final balances.

Comparatively, the traditional record pattern of IPFS is: in file a, Alice and Bob each have $10; in file b, Alice has $5 and Bob has $15; and in file c, Alice has $8 and Bob has $12. Here, each record is a snapshot of the resulting state, and a new snapshot needs to be generated as soon as there is a change.
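
The two record patterns can be contrasted in a short sketch. The Alice/Bob figures come from the example above; the replay function is a hypothetical stand-in for Ceramic's state processing, not its real API:

```python
# Ceramic-style stream: a genesis state plus an append-only log of changes.
genesis = {"Alice": 10, "Bob": 10}
commits = [
    {"from": "Alice", "to": "Bob", "amount": 5},   # day 2
    {"from": "Bob", "to": "Alice", "amount": 3},   # day 3
]

def latest_state(genesis, commits):
    state = dict(genesis)
    for c in commits:                 # replay every change to derive the state
        state[c["from"]] -= c["amount"]
        state[c["to"]] += c["amount"]
    return state

# IPFS-style snapshots: every change produces a whole new immutable file.
snapshots = [
    {"Alice": 10, "Bob": 10},   # file a
    {"Alice": 5, "Bob": 15},    # file b
    {"Alice": 8, "Bob": 12},    # file c
]

# Both patterns end at the same place: Alice has $8, Bob has $12.
assert latest_state(genesis, commits) == snapshots[-1]
```

The stream only ever appends small deltas, while the snapshot approach rewrites the whole record on each change.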

By this design, Ceramic ensures that each log has a unique, globally consistent Stream ID that does not change when the content changes. Every write requires the user's authorization, and the whole process is similar to blockchain bookkeeping, except that what is written is not transaction data but other mutable content, such as user account information.

Data composability

Ceramic achieves cross-application data composability primarily through its use of a novel abstraction, called data models.

Data models typically represent a single, logical application feature such as a user profile, a social graph, or a blog. For instance, you can imagine that every decentralized Twitter implementation would run on a few shared data models: one for each user’s tweets, one for their social graph, one for their DMs, etc. By adopting the same underlying data models, applications are able to natively interoperate on the same data.

In a way, you can compare Ceramic's use of data model standards to the use of token standards for asset ledgers. On Ethereum, for example, the introduction of the ERC20 fungible token and ERC721 non-fungible token standards has given rise to entire ecosystems of tokens and financial applications that natively interoperate. Ceramic brings this same concept to data.

Ceramic takes a community-driven approach to creating these data models, allowing any developer to easily define, share, and reuse their models with other developers in the ecosystem. As more data models are created by the community, you will see a continuous expansion in the quantity and variety of applications that are built with composable data.

Composability done this way also makes the developer experience better. Building an application on Ceramic looks like browsing a marketplace of data models, plugging them into your app, and automatically gaining access to all data on the network that is stored in these models. Using Ceramic, developers won’t need to worry about bootstrapping their application with their own siloed users and data. The rate of compounding innovation across developers is going to accelerate dramatically.

Scalability

Ceramic achieves scalability through a sharded execution environment. All streams on Ceramic maintain their state independently, and network nodes execute stream transactions in parallel. Unlike the approach of most blockchains, this allows Ceramic to operate with the scalability required for decentralized versions of social applications like Twitter or Facebook.

Unlike traditional blockchain systems where scalability is limited to a single global virtual execution environment and the state of a single ledger is shared between all nodes, each Ceramic node acts as an individual execution environment for performing computations and validating transactions on streams — there is no global ledger. This “built-in” execution sharding enables the Ceramic Network to scale horizontally to parallelize the processing of an increasing number of simultaneous stream transactions as the number of nodes on the network increases. Such a design is needed to handle the scale of the world’s data, which is orders of magnitude greater than the throughput needed on a financial blockchain. Another benefit of this design is that a Ceramic node can perform stream transactions in an offline-first environment and then later sync updates with the rest of the network when it comes back online.
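
A minimal sketch of this execution model, with toy stream objects and a thread pool standing in for independent Ceramic nodes: transactions within one stream are applied in order, while different streams are processed in parallel with no shared global ledger:

```python
from concurrent.futures import ThreadPoolExecutor

# Toy streams: each keeps its own log and state; there is no global ledger.
streams = {
    "profile-alice": {"log": [], "state": 0},
    "profile-bob":   {"log": [], "state": 0},
}

def apply_stream(stream_id, deltas):
    # Transactions within one stream are applied in order by a single worker,
    # so different streams can safely be executed concurrently.
    s = streams[stream_id]
    for d in deltas:
        s["log"].append(d)
        s["state"] += d
    return s["state"]

# Two "nodes" process two independent streams in parallel.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(apply_stream,
                            ["profile-alice", "profile-bob"],
                            [[1, 1], [2]]))

assert results == [2, 2]   # each stream's state depends only on its own log
```

Because a stream's state depends only on its own log, adding nodes adds parallel capacity, and a node that processed streams offline can replay and merge its logs when it reconnects.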

DID Solution

Ceramic also offers a flexible and robust identity solution called IDX, the first fully functional identity solution built on decentralized identifiers (DIDs).

IDX is a cross-chain identity protocol for open applications with decentralized identity and interoperable user data, which lets users build up a unified digital identity consisting of all of their data while enabling developers to break down silos and freely share user data between applications. As shown in the figure below, it provides a decentralized index which allows structured data to be associated to a decentralized identifier (DID), and data is defined by definitions and stored in records.

In addition, IDX can be used with any kind of datastore such as Ceramic, Textile, OrbitDB, IPFS, Sia, Arweave, blockchain registries, or even centralized databases and supports authentication from any kind of Web3 wallet.

IDX is great for associating user profiles, portable social graphs, reputation scores, verifiable claims, user-generated content, application data, settings, domain names, blockchain addresses, and social Web2 accounts to a user in a decentralized way.
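
A hypothetical, in-memory sketch of an IDX-style index (the DID string and definition names below are made up for illustration): one DID maps definition IDs to the records holding the actual data:

```python
# index: DID -> definition name -> record; a toy stand-in for IDX's
# decentralized index, which in reality lives in Ceramic streams.
index = {}

def set_record(did: str, definition: str, record: dict) -> None:
    # Associate a piece of structured data with the user's DID
    index.setdefault(did, {})[definition] = record

def get_record(did: str, definition: str):
    # Any application can resolve the same data from the same DID
    return index.get(did, {}).get(definition)

did = "did:key:z6MkexampleAlice"   # hypothetical identifier
set_record(did, "basicProfile", {"name": "Alice"})
set_record(did, "cryptoAccounts", {"eth": "0xabc...", "cosmos": "cosmos1..."})

assert get_record(did, "basicProfile") == {"name": "Alice"}
```

The point of the design is that the profile, accounts, social graph, and so on all hang off one identifier rather than being siloed per application.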

Conclusion

In summary, the emergence of Ceramic has greatly empowered the construction of Web3 and unlocked new features for Web3 developers. No matter which public blockchain (Ethereum, BSC, Polygon, Avalanche, etc.) developers are building on, they can simultaneously leverage Ceramic for data-centric functions to make their applications better. Furthermore, through its flexible DID-based account system, Ceramic naturally interfaces with the account and key systems of today's major blockchains, which is a great convenience for users.

It is encouraging to see that many DID and Web3 social platform projects have already been built on Ceramic. A few noteworthy ones: CyberConnect, a social graph middleware platform; Orbis, a Web3 Twitter platform; and The Convo Space, an instant messaging platform. We look forward to the new possibilities that Ceramic's data network infrastructure can bring to the Web3 application layer.

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

🐩 @chestersigned

📅 8 May 2022

Links:

https://developers.ceramic.network/learn/welcome/

https://blog.ceramic.network/what-is-ceramic/

https://multicoin.capital/2022/02/16/the-composable-web3-data-network/

https://blog.ipfs.io/2021-07-13-ceramic-mainnet-launch/
🐩 @chestersigned

📅 18 April 2022

Links:

https://medium.com/blockchain-capital-blog/wtf-is-data-availability-80c2c95ded0f

https://polynya.medium.com/the-lay-of-the-modular-blockchain-land-d937f7df4884

https://rileygmi.substack.com/p/celestia?s=r&utm_campaign=post&utm_medium=web

https://coinyuppie.com/focus-on-the-data-availability-layer-to-understand-the-new-public-chain-celestia/

https://polygon.technology/blog/introducing-avail-by-polygon-a-robust-general-purpose-scalable-data-availability-layer-98bc9814c048

https://polygon.technology/solutions/polygon-avail/

Specialized data availability layer: A prototype of modular blockchains

Modular blockchain paradigm

In physics, coupling is the phenomenon whereby two forms of motion affect each other through interaction. Decoupling, on the other hand, is the mathematical separation of two forms of motion so that each can be treated independently.

In the field of programming, the idea of modularity is very similar: modular programming is a software design technique that emphasizes separating the functionality of a program into independent, interchangeable modules, each containing everything necessary to execute one aspect of the desired functionality.

Blockchains are also computer programs. Fundamentally, when applying the idea of modularity to a public blockchain, we can break it down into three core components:

Execution — This is the computation required to update the chain, which involves getting the current state, adding a bunch of new transactions and transitioning to the new state.

Consensus or Settlement or Security — This provides security and agreement for transactions and their ordering.

Data Availability (DA) — This is to ensure that the transaction data behind the block header is published and available so that anyone can easily calculate the state and check the state transitions.

Instead of bringing all these three components together in one monolithic chain as current major blockchains do, we can split them into specialized chains or layers, each playing the role of one module, to more efficiently implement the functionality of a blockchain. This is the paradigm of modular blockchain.
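
The three-way split can be sketched as interchangeable components. This is a schematic illustration of the separation of concerns, not any project's real API:

```python
# Each layer does one job: execution computes new state, consensus orders
# transactions, and the DA layer publishes the data so anyone can recompute
# the state and check the transition.

class ExecutionLayer:
    def apply(self, state: dict, txs: list) -> dict:
        new_state = dict(state)
        for sender, receiver, amount in txs:
            new_state[sender] -= amount
            new_state[receiver] = new_state.get(receiver, 0) + amount
        return new_state

class ConsensusLayer:
    def order(self, mempool: list) -> list:
        return sorted(mempool)        # stand-in for real ordering rules

class DataAvailabilityLayer:
    def __init__(self):
        self.published = []
    def publish(self, txs: list) -> None:
        self.published.append(txs)    # anyone can fetch this and re-execute

# Composing the interchangeable modules into one chain:
consensus, execution, da = ConsensusLayer(), ExecutionLayer(), DataAvailabilityLayer()
txs = consensus.order([("alice", "bob", 5)])
da.publish(txs)
state = execution.apply({"alice": 10, "bob": 10}, txs)
assert state == {"alice": 5, "bob": 15} and da.published == [txs]
```

Because each class exposes a narrow interface, any one layer could be swapped for a specialized chain without touching the others, which is the core of the modular paradigm.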

The benefits of modular blockchains

As stated earlier, modular blockchains are the result of separating the core components of a single blockchain and running them on separate layers. So, what are the advantages of modular blockchains over non-modular ones? The main benefits are listed below.

Scalability

Layers that each specialize in one or two core functions allow for much greater scalability, without the tradeoffs that a monolithic blockchain must make. For instance, a modular data availability layer with DA sampling can scale linearly with the number of users.

Interoperability

Blockchains can employ a modular shared security layer to enable trust-minimized bridging between blockchains in the same cluster. This improves both the security and the ease with which several blockchains can communicate with each other.

Bootstrapping

New blockchains can be created with minimal cost and time. Things like Rollup Software Development Kits will not only aid this but deliver a way to bootstrap without needing a consensus mechanism, validators, or token distribution mechanisms.

Experimentation

Blockchains can easily be created and used to test new innovative technologies that bring further scalability and optimizations to the entire stack of chains. Doing this allows new chains to focus directly on the elements they want to innovate on, which can then be spread to the rest of the chains.

The DA layer in rollups

Rollups, Ethereum's layer 2 scaling solution, are fundamentally a modular approach as well. A rollup takes the execution layer out of Ethereum layer 1 and forms an execution-exclusive second layer focused on processing transactions as fast as possible, while Ethereum layer 1 remains in charge of the challenging work of both security and data availability.

So, there are two questions here. First, why is data availability so important? Second, can we use the idea of modularity again to decouple DA from layer 1? Let me address each in turn.

For the first question, the answer differs between the two kinds of rollups. In the case of Optimistic Rollups, if the sequencer misbehaves and settles an incorrect account balance state, anyone can retrieve the original transaction data from layer 1 and reconstruct the correct account state to mount a challenge. The role of DA here is like an archive bureau that provides the original evidence in the event of a challenge, ensuring the challenge is correct and valid.

As for Zero Knowledge (ZK) Rollups, the sequencer cannot forge state, because every batch carries a cryptographic validity proof. But in the extreme case that the sequencer goes down or becomes unavailable for any reason, users need to escape the layer 2 chain and withdraw their money back to layer 1, and DA is essential to ensure that anyone can reconstruct the final state from the data. The role of DA here is like an emergency exit, guaranteeing users a smooth escape in an extreme situation.

For the second question, the short answer is yes. Although DA is so vital that the best place to store it is Ethereum layer 1, the most secure chain is also the most expensive one: the gas cost of settlement transactions and data storage is far higher than on any other chain. That is why Matter Labs and StarkWare each provide users with a cheaper variant of ZK Rollup, namely zkPorter and Validium, which compromise by not storing DA on layer 1.

As a result, a specialized and modular DA layer is beginning to enter the picture.

Projects of specialized DA layer

Celestia

Formerly named LazyLedger, Celestia is a modular consensus and data network, built to enable anyone to easily deploy their own blockchain with minimal overhead. By decoupling the consensus and DA layers from application execution, Celestia modularizes the blockchain technology stack and unlocks new possibilities for decentralized application builders.

Under Celestia-enabled modular architecture, developers can effortlessly define their own virtual execution environments. Each application gets its own sovereign execution space and can be updated without main chain hard forks.

Celestia’s main highlight lies in addressing one of the fundamental constraints of blockchain scaling — the “data availability problem”. This problem asks: how can nodes be sure that when a new block is produced, all of the data in that block was actually published to the network? The dilemma is that if a sequencer or block producer does not release all of the data in a block, no one could detect if there is a malicious transaction hidden within that block.

How does Celestia resolve this? It uses erasure codes to create data availability proofs that guarantee, with high probability, that all that data was posted on-chain.

Specifically, Celestia uses a 2-dimensional Reed-Solomon encoding scheme to encode block data when the sequencer creates a new block. This scheme ensures that a small sample of data is enough to verify with statistical certainty that the entire block has been published. Light clients simply perform an operation called Data Availability Sampling (DAS): randomly downloading a few small pieces of data from the block to detect whether the sequencer is misbehaving and withholding data. Probabilistically, this method is very effective: by the seventh random sample, the chance that a light client fails to detect a misbehaving sequencer drops below 1%. And once any small piece of data is discovered to be incorrectly encoded, the network is notified via a fraud proof. As a result, throughput is not bottlenecked by state execution as in traditional blockchains, allowing Celestia's throughput to scale with the number of participating users.
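
The quoted figure can be checked directly. This simplified model assumes independent samples and a sequencer that, because of the erasure coding, must withhold at least half of the extended block to hide anything:

```python
def miss_probability(samples: int, withheld_fraction: float = 0.5) -> float:
    # Chance that every one of `samples` random draws lands on an
    # available chunk, i.e. the light client fails to notice the withholding.
    return (1 - withheld_fraction) ** samples

for k in (1, 3, 5, 7):
    print(k, miss_probability(k))

# By the seventh sample the failure chance is 0.5^7 ~ 0.78%, below 1%.
assert miss_probability(7) < 0.01
```

Real DAS samples without replacement from a finite block, which makes detection strictly more likely than this independent-draw approximation.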

Initially focusing on the EVM and Cosmos SDK, Celestia does not exclude integration into the ecosystems of Ethereum and Cosmos. Celestia has released a plan called Celestium, which allows any Ethereum L2 chain to use Celestia for data availability and Ethereum for settlement and dispute resolution. Celestiums will provide high throughput data availability for Ethereum L2s, with a higher level of security than other off-chain data availability techniques.

Moreover, Celestia is partnering with Evmos to build a settlement layer for EVM rollups called Cevmos. The settlement chain will be implemented as a Celestia rollup using Optimint, and leverages Celestia as the DA layer to provide a fully EVM-equivalent stack optimized solely for rollups, resulting in lower fees and better scale in rollups.

Polygon Avail

Avail, one of the scaling solutions in Polygon’s toolkit, is a general-purpose, scalable data availability-focused blockchain targeted for standalone chains, side-chains, and off-chain scaling solutions.

Avail provides a robust data availability layer built on an extremely secure mathematical primitive: data availability checks using erasure codes. Its key innovation is the use of Kate polynomial commitments to create a 2D data availability scheme that avoids fraud proofs, requires no honest-majority assumption, and has no reliance on an honest full-node peer to gain confidence that the data is available.

Avail’s approach to the data availability problem is similar to Celestia’s DAS, differing only in the details: Avail uses KZG polynomial commitments instead of fraud proofs to guard against a misbehaving sequencer. KZG commitments, also known as Kate commitments, were introduced by Aniket Kate, Gregory M. Zaverucha, and Ian Goldberg in 2010 and provide a way to commit to polynomials in a succinct manner.

In simple terms, the entire data of a block is arranged as a two-dimensional matrix. Redundancy is introduced by erasure-coding each column of the matrix, doubling its original size. Kate commitments are computed for each row and included in the block header. This makes a data-hiding attempt easy to catch: any light client with access only to block headers can query random cells of the matrix and receive short proofs that can be checked against the header commitments. Because of the redundancy, the sequencer is forced to hide a large part of the block even if it wants to conceal just a single transaction, making it likely to be caught by random sampling. The need for fraud proofs is thus avoided, as the binding nature of Kate commitments makes it computationally infeasible for a sequencer to construct wrong commitments without getting caught.
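The redundancy property both schemes rely on can be shown with a toy one-dimensional Reed-Solomon extension over a prime field. This is only an illustration of why hiding one symbol forces hiding many: real systems use 2D layouts and cryptographic commitments (KZG in Avail, namespaced Merkle trees in Celestia), and every name below is illustrative rather than a real API.

```python
# Toy Reed-Solomon extension: k data symbols become 2k coded symbols by
# evaluating the degree-(k-1) interpolating polynomial at extra points.
# Any k of the 2k symbols then suffice to reconstruct the original row.

P = 65537  # small prime field for the demo

def lagrange_eval(points, x):
    """Evaluate the unique polynomial through `points` at x, mod P."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse of den (Fermat's little theorem)
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

def extend(row):
    """Erasure-code a row of k symbols into 2k symbols."""
    pts = list(enumerate(row))
    return row + [lagrange_eval(pts, x) for x in range(len(row), 2 * len(row))]

row = [11, 22, 33, 44]           # k = 4 original data symbols
coded = extend(row)              # 2k = 8 symbols published on the wire

# Any k of the 2k symbols reconstruct the whole row, so a sequencer must
# withhold MORE than half the row to hide even a single symbol.
sample = [(x, coded[x]) for x in (1, 4, 6, 7)]   # arbitrary k symbols
recovered = [lagrange_eval(sample, x) for x in range(len(row))]
assert recovered == row
```

It is exactly this all-or-nothing reconstruction property that makes random sampling by light clients effective against data withholding.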

Summary

Specialized data availability layers, as a prototype of modular blockchains, aim to fundamentally change how blockchain systems are designed, enabling a modular paradigm in which various execution environments rely on a dedicated layer for data ordering and availability. One might worry that rollups using a specialized DA layer are not as secure as those posting data to the Ethereum mainnet, but this is an essential tradeoff: modular blockchains offer a more flexible, scalable, customizable, and interoperable network. In the long run, specialized DA layers will become the cornerstone of the modular blockchain paradigm, leading the way toward the endgame of blockchain scaling.

Disclaimer: This research is for information purposes only. It does not constitute investment advice or a recommendation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision.

🐩 @chestersigned

📅 18 April 2022

Link:

https://medium.com/blockchain-capital-blog/wtf-is-data-availability-80c2c95ded0f

https://polynya.medium.com/the-lay-of-the-modular-blockchain-land-d937f7df4884

https://rileygmi.substack.com/p/celestia?s=r&utm_campaign=post&utm_medium=web

https://coinyuppie.com/focus-on-the-data-availability-layer-to-understand-the-new-public-chain-celestia/

https://polygon.technology/blog/introducing-avail-by-polygon-a-robust-general-purpose-scalable-data-availability-layer-98bc9814c048

https://polygon.technology/solutions/polygon-avail/