Original author: @Web3 Mario

Introduction: EigenLayer AVS has been live for a while now. Beyond the officially promoted use cases such as EigenDA and Layer 2 integrations, I noticed a very interesting phenomenon: EigenLayer AVS seems to hold strong appeal for projects in the privacy-computing track. Of the nine AVSs already live, three belong to this track: the ZK coprocessor projects Brevis and Lagrange, and the trusted execution environment project Automata. I therefore decided to investigate in detail what EigenLayer AVS means for these products and where the trend is heading.

The appeal of “cheap security” is the key to the success or failure of the EigenLayer AVS ecosystem

With TVL now officially above 15 billion USD, EigenLayer has gotten off to a remarkable start. Most of these funds, I suspect, are chasing potential airdrop rewards, but they nonetheless lay a solid foundation for EigenLayer's next stage. That next stage hinges on the success or failure of the AVS ecosystem, because the scale of AVS fee revenue determines when EigenLayer can transition from its subsidy period to maturity.

There are already many articles covering EigenLayer's technical details, so I will not repeat them here. Simply put, EigenLayer creates a cheap consensus-layer protocol by reusing the consensus capability of Ethereum PoS, an approach also known as restaking. First, I want to discuss EigenLayer's core value, which in my view has three main aspects:

* Decoupling the consensus layer from the execution layer, to better handle large-scale or high-cost data processing and consensus: Mainstream blockchain protocols are generally considered high-cost, low-efficiency execution environments. The high cost comes from "competition for block space", to use the fashionable phrase. Blockchain-based execution environments typically use a market mechanism to allocate node computing resources: whoever bids higher gets executed first, putting all users in competition with one another. As demand grows, the clearing price keeps rising, and execution costs inevitably rise with it. The low efficiency stems from the fact that blockchains were originally designed as electronic-cash settlement systems, where transaction processing is time-sensitive; the execution layer therefore had to be serial, which makes it inefficient for the many scenarios that are not timing-sensitive, such as social networks and AI training.

Decoupling the consensus layer from the execution layer allows application developers, on the one hand, to design a dedicated execution environment, often called an application chain or Layer 3, so their users escape the competition with other applications' users and enjoy lower costs. On the other hand, it lets developers build execution layers tailored to different application scenarios to improve execution efficiency.

* Consensus as a service: by productizing consensus, or turning it into a resource, the market's latent demand can be fully tapped. Anyone who lived through the "hundred schools of thought" era of Layer 1s will sigh in agreement: standing up a consensus layer is usually expensive and hard. Maintaining consensus security requires either hash power or staked funds, and before a protocol generates sufficient revenue it sits in a subsidy stage, at no small cost; the subsidy usually takes the form of token emissions to miners or validators. Only a handful of successful protocols ever transition to sustaining sufficient consensus capacity out of their own revenue, that is, fee income; Ethereum's economic-model transformation is one example. This high startup cost has deterred many innovative applications, because building an execution environment suited to one's own application, or running an application chain outright, is too expensive and too risky. The result is a pronounced Matthew effect in Web3, with technical evolution largely absorbed into Ethereum's roadmap.

By turning consensus into a service or product, innovative applications gain another option: purchasing consensus on demand. For example, suppose an innovative application holds $1 million of user funds in its early stage. Then purchasing more than $1 million worth of PoS consensus is enough to secure its execution environment, because the expected profit from attacking it is negative (see the sketch after this list). As the application grows, consensus can be purchased flexibly and incrementally. This lowers the startup cost for innovative applications, reduces their risk, and fully taps the market's potential.

* A cheap source of consensus: Finally, EigenLayer sources its consensus by reusing Ethereum's PoS stake, which means stakers who previously captured only one layer of yield can earn an extra layer by participating in EigenLayer. This cleverly turns EigenLayer's relationship with the industry leader Ethereum from competitive to symbiotic, lowering its cost of attracting consensus funds. It also gives EigenLayer a pricing advantage over other protocols, for instance in the fees AVS protocols pay to purchase consensus, making it more attractive to innovative applications. A genuinely smart move.
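
To make the "negative profit from doing evil" argument concrete, here is a minimal Python sketch. The $1.2 million security budget and the simple profit model are my own illustrative assumptions, not EigenLayer parameters:

```python
# Toy model of attack profitability under purchased PoS security.
# All numbers are illustrative assumptions, not EigenLayer parameters.

def attack_profit(value_at_risk_usd: float, purchased_stake_usd: float) -> float:
    """Attacker's best case: steal all value at risk, lose the slashed stake."""
    return value_at_risk_usd - purchased_stake_usd

app_tvl = 1_000_000          # funds entrusted to the young application
security_budget = 1_200_000  # PoS consensus purchased (> TVL)

profit = attack_profit(app_tvl, security_budget)
print(f"attack profit: ${profit:,.0f}")  # $-200,000: doing evil has negative payoff
```

As long as the purchased stake stays above the value at risk, the attack is unprofitable, and the security budget can be scaled up incrementally as the application's TVL grows.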

Together, these three points let EigenLayer offer Web3 applications a source of "cheaper security" than other Web3 execution environments: lower execution costs, better scalability, and a more flexible business model. I therefore believe the key to an active EigenLayer AVS ecosystem lies in whether Web3 applications will be won over by this cheap security and migrate in large numbers.

Usage cost is the fundamental constraint on the development of the Web3 privacy-computing track

Having discussed EigenLayer's core value, let's look at the dilemma facing the Web3 privacy-computing track. I am not an expert in this field, so I focused on the current state of the sub-track where the privacy-computing projects among the live AVSs sit: the so-called ZK coprocessors. I believe most cryptographic products built on zero-knowledge proofs face the same dilemma: high usage costs hinder the expansion of their use cases.

Where exactly the term "ZK coprocessor" originated matters little. As the name suggests, products in this track aim to provide coprocessor services for mainstream blockchain systems using zero-knowledge proofs: complex, expensive computation is offloaded off-chain, and the correctness of the results is guaranteed by a zero-knowledge proof. The classic analogy for this modular idea is the relationship between the CPU and the GPU: parallel workloads the CPU architecture handles poorly, such as image processing and AI training, are handed to an independent module, the GPU, to improve execution efficiency.

The technical architecture of a classic ZK coprocessor project looks roughly as follows, using the simplified architecture of Axiom, one of the leaders in this field. Simply put, when a user needs a complex computation, they can use Axiom's off-chain service to compute the result and generate the corresponding ZK proof, after which Axiom calls its on-chain verification contract with the result and proof as parameters. The contract verifies the result using three pieces of data: the execution result, the execution proof, and key chain state, such as transaction Merkle roots, that Axiom maintains on-chain (this maintenance process is itself trustless). Once the on-chain verification algorithm passes, the result is delivered to the target contract via a callback to trigger subsequent operations.
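
As a rough illustration of this flow, the sketch below mimics the off-chain prover, the three-input on-chain check, and the callback in plain Python. Every name here (prove_offchain, VerifierContract, and so on) is a hypothetical stand-in, not Axiom's actual contracts or SDK:

```python
# Hypothetical sketch of the pure-ZK coprocessor flow described above.

from dataclasses import dataclass

@dataclass
class Proof:
    result: int        # claimed output of the off-chain computation
    commitment: str    # stand-in for the actual ZK proof bytes

def prove_offchain(inputs: list[int]) -> Proof:
    """Off-chain prover: runs the heavy computation and produces a proof."""
    result = sum(inputs)  # placeholder for an expensive historical-data query
    return Proof(result=result, commitment=f"zkp({inputs},{result})")

class VerifierContract:
    """Stand-in for the on-chain verifier plus callback dispatch."""
    def __init__(self, chain_merkle_roots: set[str]):
        # key chain state (e.g. transaction Merkle roots), maintained trustlessly
        self.chain_merkle_roots = chain_merkle_roots

    def verify_and_callback(self, proof: Proof, block_root: str, target) -> bool:
        if block_root not in self.chain_merkle_roots:
            return False                      # proof not anchored to real chain data
        if not proof.commitment.endswith(f"{proof.result})"):
            return False                      # placeholder for the real SNARK check
        target.on_result(proof.result)        # callback into the target contract
        return True

class TargetContract:
    def on_result(self, result: int) -> None:
        print(f"verified result delivered: {result}")

verifier = VerifierContract(chain_merkle_roots={"root_0xabc"})
proof = prove_offchain([1, 2, 3])
verifier.verify_and_callback(proof, "root_0xabc", TargetContract())
```

The point to notice is that the expensive step, proof generation, happens off-chain, while the contract only runs the comparatively light, but still nontrivial, verification.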

Proof generation is generally considered compute-intensive, while proof verification is relatively light. From Axiom's documentation, an on-chain ZK proof verification costs roughly 420,000 gas, which at a gas price of 10 Gwei means the user pays about 0.0042 ETH per verification. Assuming ETH trades at $3,000, that is roughly $12. Such a cost is still far too high for ordinary retail users, which greatly limits the potential use cases of this kind of product.
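
The arithmetic behind that estimate, treating the gas figure, gas price, and ETH price as assumptions:

```python
# Back-of-the-envelope verification cost using the figures quoted above.
GAS_PER_VERIFICATION = 420_000   # approximate gas for one on-chain ZK verification
GAS_PRICE_GWEI = 10              # assumed gas price
ETH_PRICE_USD = 3_000            # assumed ETH price

cost_eth = GAS_PER_VERIFICATION * GAS_PRICE_GWEI * 1e-9  # gwei -> ETH
cost_usd = cost_eth * ETH_PRICE_USD
print(f"{cost_eth:.4f} ETH ≈ ${cost_usd:.2f}")           # 0.0042 ETH ≈ $12.60
```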

Take the Uniswap VIP program, a use case often promoted by ZK coprocessor projects, as an example. Uniswap could use a ZK coprocessor to run a CEX-style loyalty program for its traders: when a trader's cumulative volume over some period reaches a threshold, the protocol rebates or discounts that trader's fees. Since computing cumulative volume is a complex operation, Uniswap could offload the calculation off-chain via the ZK coprocessor, cutting computation costs while avoiding large-scale changes to the on-chain protocol.

Let's do a simple calculation. Suppose Uniswap runs a VIP campaign in which anyone who can prove more than $1,000,000 of cumulative trading volume over the past month trades fee-free, and a trader uses Uniswap's 0.01% fee pool. On a single $100,000 trade, the fee is $10, but the verification costs about $12. This undermines the motivation to participate and raises the threshold of the campaign; in the end, only whales benefit.
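
Working through the hypothetical numbers shows the mismatch, and also the trade size at which proving would first break even:

```python
# Fee saved vs. proof-verification cost in the hypothetical Uniswap VIP example.
TRADE_SIZE_USD = 100_000
POOL_FEE_RATE = 0.0001           # the 0.01% fee tier
VERIFICATION_COST_USD = 12.60    # from the gas estimate above

fee_saved = TRADE_SIZE_USD * POOL_FEE_RATE          # $10 waived if VIP status is proven
net_benefit = fee_saved - VERIFICATION_COST_USD     # negative: proving costs more than it saves
break_even = VERIFICATION_COST_USD / POOL_FEE_RATE  # trade size where proving pays off
print(f"fee saved ${fee_saved:.2f}, net ${net_benefit:.2f}, break-even trade ${break_even:,.0f}")
```

Only trades above roughly $126,000 recoup the verification cost in this scenario, which is exactly why the benefit accrues to whales.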

Similar cases are not hard to find among other pure-ZK products. The use cases and technical architectures are compelling, but I think usage cost is the core constraint preventing these products from expanding their scenarios.

From Brevis’ transformation, we can see the siphoning effect of EigenLayer’s “cheap security” on related products

So let's look at how Brevis, one of the first AVSs to go live, has been affected by EigenLayer. I hope this illustrates the obvious appeal of EigenLayer's "cheap security" to related cryptographic products.

Brevis positions itself as a ZK coprocessor. When it launched in early 2023, it was still billed as a full-chain data computation and verification platform, which is essentially no different from a ZK coprocessor; the latter just sounds cooler. For a long time, Brevis operated on the "pure-ZK" model described above, which left it struggling to expand usage scenarios. Then, in a blog post on April 11, it announced a collaboration with EigenLayer and a new "cryptoeconomics + ZK proof" solution, Brevis coChain, in which the verification layer moves from the Ethereum mainnet to a coChain maintained by the AVS.

When a user has a computation request, the client-side circuit computes the result and generates the corresponding ZK proof, then submits the request to Brevis coChain through an on-chain smart contract. The AVS, after observing the request, verifies the correctness of the computation; once it passes, the AVS compresses and packages the relevant data, posts it to the Ethereum mainnet, and asserts that the result is correct. As in other "optimistic verification" schemes, a challenge period follows, during which a challenger can dispute a result and seek to slash the wrongdoer by submitting the corresponding ZK fraud proof. After the challenge period ends, the AVS triggers the target contract's callback through the on-chain contract to complete subsequent operations. Given that most privacy-computing work is about establishing trust through mathematics, I would call this scheme "optimistic trustlessness."
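
The following Python sketch walks through this optimistic flow, with logical block numbers standing in for time. The class and method names, the challenge window, and the fraud-proof check are all hypothetical, not Brevis's actual protocol:

```python
# Hypothetical sketch of the optimistic "coChain" flow described above.

CHALLENGE_PERIOD = 100  # toy challenge window, in blocks

class CoChain:
    """Stand-in for the AVS-maintained coChain plus the mainnet assertion contract."""
    def __init__(self):
        self.assertions = {}

    def assert_result(self, aid: int, result: int, block: int) -> None:
        """AVS operators optimistically assert a computed result on mainnet."""
        self.assertions[aid] = {"result": result, "block": block, "slashed": False}

    def challenge(self, aid: int, block: int, fraud_proof_valid: bool) -> None:
        """Within the window, a valid ZK fraud proof slashes the asserter."""
        a = self.assertions[aid]
        if block - a["block"] < CHALLENGE_PERIOD and fraud_proof_valid:
            a["slashed"] = True

    def finalize(self, aid: int, block: int, target) -> None:
        """After the window, an unchallenged result is delivered via callback."""
        a = self.assertions[aid]
        if block - a["block"] >= CHALLENGE_PERIOD and not a["slashed"]:
            target.on_result(a["result"])

class TargetContract:
    def on_result(self, result: int) -> None:
        print(f"optimistically verified result: {result}")

chain = CoChain()
chain.assert_result(aid=1, result=42, block=1_000)            # AVS asserts the result
chain.challenge(aid=1, block=1_050, fraud_proof_valid=False)  # no valid fraud proof appears
chain.finalize(aid=1, block=1_100, target=TargetContract())   # window elapsed, callback fires
```

The cost saving is structural: the expensive ZK verification only runs on-chain in the dispute path, while the happy path consists of an assertion plus a waiting period.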

Lagrange and Automata have presumably gone through the same mental journey, ultimately turning to optimistic-trustless solutions built on AVS. The advantage is a dramatic reduction in verification cost: obtaining a correct result no longer requires expensive on-chain verification. Instead, one optimistically trusts the results processed by EigenLayer's consensus layer, backstopped by the security of ZK fraud proofs. Of course, shifting from trust in mathematics to trust in human incentives will face challenges in Web3, but I think this is an acceptable trade for the practicality it brings. More importantly, it breaks the grip of verification costs on the expansion of use cases, and I expect many more interesting products to launch soon.

This solution also has a demonstration effect for other privacy-computing products. Given that this track is still a blue ocean, the new paradigm should spread more easily here than in the fiercely competitive rollup-related tracks. I believe the AVS ecosystem's first breakout will come from the privacy-computing track. Since I am not a cryptography specialist, omissions are inevitable, and I welcome corrections from experts.