Editor | Wu Talks about Blockchain

The interviewee in this article is Ande Zheng, a core contributor to Pyth. The content only represents the personal views of the interviewee, does not represent the opinions of Wu Shuo, and does not provide any financial advice.

How does Pyth’s oracle solution differ from Chainlink?

This is indeed a frequently asked question. Although oracles sit at a relatively basic layer of the stack, they are extremely critical to the entire DeFi field, because the vast majority of DeFi protocols (roughly 80%-90%) rely on oracles to obtain external data.

The development of oracles can be roughly divided into two stages. In the first stage, oracles mainly served the basic needs of early DeFi, such as DEXs, simple token swaps, and lending. These early applications usually required low data frequency and low time sensitivity. For example, when a lending protocol liquidates a user's position, it only needs the asset price at a single point in time. Oracles at this stage therefore mainly adopted the push model: the oracle updates the data at a set interval (such as every 10 minutes, every half hour, or every hour), or triggers an update and pushes the data on-chain when the market price moves by more than a certain percentage.
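As a rough illustration only, the TypeScript sketch below shows the push model just described, with a 10-minute heartbeat and a deviation threshold; every name, number, and the simulated price source are hypothetical placeholders rather than any real oracle's code:

```typescript
// Illustrative push-model oracle loop: write the price on-chain on a fixed
// heartbeat, or early when it deviates beyond a threshold. All names and
// numbers here are hypothetical placeholders, not any specific oracle's code.

const HEARTBEAT_MS = 10 * 60 * 1000; // push at least every 10 minutes
const DEVIATION_THRESHOLD = 0.005;   // ...or when the price moves > 0.5%
const POLL_MS = 5_000;               // how often to check the source

let simulated = 65_000;              // stand-in for a real off-chain price source
async function fetchOffChainPrice(): Promise<number> {
  simulated *= 1 + (Math.random() - 0.5) * 0.01; // random walk for the demo
  return simulated;
}

async function pushOnChain(price: number): Promise<void> {
  // Placeholder for the transaction that writes the price to a feed contract.
  console.log(`push: ${price.toFixed(2)}`);
}

async function runPushLoop(): Promise<void> {
  let lastPrice = await fetchOffChainPrice();
  let lastPushed = Date.now();
  await pushOnChain(lastPrice);

  setInterval(async () => {
    const price = await fetchOffChainPrice();
    const deviation = Math.abs(price - lastPrice) / lastPrice;
    if (deviation >= DEVIATION_THRESHOLD || Date.now() - lastPushed >= HEARTBEAT_MS) {
      await pushOnChain(price);
      lastPrice = price;
      lastPushed = Date.now();
    }
  }, POLL_MS);
}

runPushLoop();
```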

This model supported the development of early-stage DeFi protocols, including the lending and oracle-based DEX use cases mentioned above. Since these protocols depend less on time-sensitive data to execute user transactions, early oracles such as Chainlink, as pioneers of the oracle industry, built their solutions around it.

As DeFi applications become more complex and data demands become more frequent and precise, a pure push model may not be able to meet the needs of high-frequency trading and real-time pricing.

In the second phase, which we are in now, DeFi protocols have extremely high demands for time sensitivity and data accuracy. A typical example is contract (derivatives) trading, which involves large volumes, high-frequency execution, and a large number of orders. Such protocols need a data source that can continuously provide high-frequency, accurate price updates. This data must be provided by upstream oracles, which must ensure that the frequency and accuracy of the data meet the requirements of downstream DeFi protocols and thereby support the development of DeFi.

Pyth has grown and innovated alongside this second phase of DeFi. The key oracle innovation at this stage is what we call the pull model. Its distinctive feature is that the oracle does not continuously push data to any target protocol; instead, it updates data at a very high frequency on the application chain it manages. In Pyth's case, the data is updated every 400 milliseconds. When a DeFi protocol on any external chain needs this data, it sends a request to Pyth, which then delivers the data to the corresponding blockchain and DeFi protocol.

Because Pyth updates data every 400 milliseconds, this interval meets the needs of protocols that update orders or prices multiple times per second. The approach also safeguards data accuracy: the oracle must source data from reliable providers, and Pyth has more than 100 first-party data publishers who own the data themselves, rather than data aggregators such as CoinGecko or CoinMarketCap.
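For illustration, here is a hedged TypeScript sketch of what the consumer side of a pull model can look like: the application fetches the latest signed price update from an off-chain price service and submits it on-chain together with the transaction that needs it. The service URL, feed ID, data shape, and submitPriceUpdate function are all hypothetical placeholders, not Pyth's actual API:

```typescript
// Illustrative pull-model consumer: instead of waiting for pushed updates,
// the application fetches the latest signed price update itself and submits
// it on-chain together with the transaction that needs it.
// PRICE_SERVICE_URL, BTC_USD_FEED_ID and submitPriceUpdate are placeholders.

const PRICE_SERVICE_URL = "https://example-price-service.invalid/latest";
const BTC_USD_FEED_ID = "btc-usd"; // placeholder feed identifier

interface SignedPriceUpdate {
  feedId: string;
  price: number;
  confidence: number;
  publishTime: number;
  payload: string; // signed blob to be verified on-chain before use
}

async function fetchLatestUpdate(feedId: string): Promise<SignedPriceUpdate> {
  const res = await fetch(`${PRICE_SERVICE_URL}?feed=${feedId}`);
  if (!res.ok) throw new Error(`price service error: ${res.status}`);
  return (await res.json()) as SignedPriceUpdate;
}

async function submitPriceUpdate(update: SignedPriceUpdate): Promise<void> {
  // Placeholder: in practice this would be a transaction that passes the
  // signed payload to the oracle contract for verification.
  console.log(`submitting update for ${update.feedId} @ ${update.price}`);
}

async function tradeWithFreshPrice(): Promise<void> {
  const update = await fetchLatestUpdate(BTC_USD_FEED_ID);
  await submitPriceUpdate(update); // bring the fresh price on-chain first...
  // ...then execute the trade or liquidation that depends on it.
}
```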

How Pyth participates in the Solana meme craze

We are indeed one of the earlier products in the Solana ecosystem. As part of the infrastructure, especially for data aggregation, we use a Solana-based SVM application chain, Pythnet, to process data. The entire data processing flow is completed on this application chain, and we then send the data on demand to all the blockchains we support, including multiple EVM chains, Solana, and Move-ecosystem blockchains.

As for how we participate in the Solana meme craze: first of all, we are part of the Solana community and one of its contributors, so we are very happy to see memes flourish in the Solana ecosystem. As an infrastructure provider, we participate by supporting the construction of on-chain trading markets. For example, when popular meme coins such as POPCAT emerge on Solana, they may have trading pools on DEXs or be used as collateral on lending platforms.

If on-chain trading markets are established in these cases, they will need oracles to provide asset data so that transactions can be completed. We closely observe which meme coins have real trading demand, and whether we can obtain accurate data from data owners that meets our requirements for accuracy and update frequency. If those conditions are met and market demand is large enough, we will support data feeds for these meme assets.

Once this data is provided, downstream DeFi protocols can innovate and build meme trading markets. Our participation, then, is about creating trading and innovation opportunities by providing data support for meme coins with genuine market demand.

You can check the upcoming data feeds on the Pyth official website, where there may be meme coins not only from Solana but also from other ecosystems. This is roughly how we are connected to the meme craze.

How did the collaboration between Laser Digital, Nomura Securities’ digital asset subsidiary, and Pyth come about?

Nomura Securities' digital asset subsidiary Laser Digital has joined the Pyth Network as a data provider, which is indeed a landmark event in the cooperation between traditional finance and the crypto field. As a top financial institution, Nomura Securities covers investment banking, asset management and other financial services. Their participation is not only due to their attention to the development of the crypto field, but also because they hope to establish their influence in this emerging field.

As a digital asset subsidiary of Nomura, Laser Digital has extensive asset management and trading experience in the digital asset field, so it generates a large amount of high-quality first-hand data. In traditional finance, such data is extremely valuable; large exchanges and financial institutions often earn substantial profits from selling it. Therefore, when Pyth introduced our network and data processing methods to Laser Digital, they saw an opportunity to participate and help advance the industry.

The addition of Laser Digital means they will begin providing unique digital asset data to the Pyth network, which will strengthen the network's data reliability and accuracy. The collaboration benefits both parties: for Pyth, gaining a strong, compliant traditional financial institution as a data provider not only enhances the diversity of the network and the quality of its data, but also increases the credibility and attractiveness of the entire platform; for Laser Digital and its parent company Nomura Securities, it marks a strategic move into crypto assets, expands their business boundaries, and may attract customers who want traditional financial stability combined with emerging-market innovation.

In addition, with the participation of traditional financial institutions, we expect more traditional real-world asset data to be integrated, which will not only bring more asset types to the crypto market but also create greater opportunities for product innovation and market depth across the industry. We are therefore optimistic about the potential of this cooperation and look forward to more traditional financial institutions joining our network.

What is the use of Pyth's new product Entropy?

Entropy is a new product in the Pyth network whose main function is to serve as an on-chain random number generator. Unlike traditional random number generators, Entropy is built entirely on blockchain technology and is designed to improve the reliability of random number generation and ensure that the results are genuinely unpredictable.

The need for random numbers is very common in blockchain applications, especially in the fields of games and DeFi. Games may require random numbers to determine the results of in-game events, and DeFi applications, such as sweepstakes, lotteries, or gambling applications, also require reliable random numbers to ensure fairness and transparency. Although random number services are available on some mainstream blockchains, many protocols still seek reliable random number generation services, which is why Entropy was developed.

The process of using Entropy is quite simple: when a user needs a random number, they send a generation request to Entropy, and it returns a random number on-chain. Because the product is designed specifically for random number generation, it is relatively easy to integrate into other applications through a straightforward API. Users pay a small fee for each random number, which is the source of income for Entropy's operations; currently, the fee for each update is set at 101 Wei (block unit fee).
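To make the request/response idea more concrete, below is a purely illustrative TypeScript sketch of the commit-reveal pattern commonly used by on-chain randomness services of this kind, where the final number combines secrets from both the requester and the provider so that neither side can bias it alone. This is an assumption-laden teaching example, not Entropy's actual contract or API:

```typescript
// Illustrative commit-reveal flow (not Entropy's actual code): the user and
// the provider each commit to a secret, and the random number is derived
// from both reveals, so neither party can bias the result alone.
import { createHash, randomBytes } from "crypto";

const sha256 = (data: Buffer): Buffer => createHash("sha256").update(data).digest();

// 1. Each party draws a secret and publishes only its hash (the commitment).
const userSecret = randomBytes(32);
const providerSecret = randomBytes(32);
const userCommitment = sha256(userSecret);
const providerCommitment = sha256(providerSecret);

// 2. Later, both secrets are revealed and checked against the commitments.
function verify(secret: Buffer, commitment: Buffer): boolean {
  return sha256(secret).equals(commitment);
}

if (!verify(userSecret, userCommitment) || !verify(providerSecret, providerCommitment)) {
  throw new Error("reveal does not match commitment");
}

// 3. The final random number combines both secrets.
const randomNumber = sha256(Buffer.concat([userSecret, providerSecret]));
console.log("random number:", randomNumber.toString("hex"));
```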

How does the Pythnet application chain work?

The Pythnet application chain is an innovation of the Pyth network that solves the scalability problem of providing data to many blockchain environments. The original model required setting up a separate data-push pipeline for each newly supported blockchain, so the workload and cost grow rapidly as the number of supported chains increases.

Pythnet is the first dedicated application chain (SVM) based on Solana, and its core function is to process and aggregate data provided by upstream. Although this application chain is technically a public blockchain, it is dedicated to processing data and will not deploy other protocols or applications. Pythnet's goal is to centrally process all data, making data processing and publishing more efficient and scalable.

The workflow of Pythnet is as follows:

● Data provision: Data providers send their data to Pythnet.

● Data Aggregation: On Pythnet, data about the same asset (e.g. the price of Bitcoin) collected from multiple data sources is algorithmically processed to produce a single, aggregated price (a minimal sketch of this step follows this list).

● Data publishing: The aggregated price is updated every 400 milliseconds and pushed on demand to all supported blockchains. Because the underlying Solana-based chain produces blocks roughly every 400 milliseconds, Pythnet can respond quickly to market changes and update prices in a timely manner.
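To make the aggregation step above more concrete, here is a minimal TypeScript sketch that combines several hypothetical publisher quotes into a single price and confidence value using a simple median. Pythnet's actual aggregation algorithm is more sophisticated, so treat this only as an illustration:

```typescript
// Minimal sketch of the aggregation step: several publishers submit a price
// and a confidence value for the same asset, and a single aggregate is
// produced. A simple median is used here purely for illustration.

interface PublisherQuote {
  publisher: string;
  price: number;      // quoted price
  confidence: number; // publisher's own uncertainty estimate
}

function median(values: number[]): number {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 === 0 ? (sorted[mid - 1] + sorted[mid]) / 2 : sorted[mid];
}

function aggregate(quotes: PublisherQuote[]): { price: number; confidence: number } {
  if (quotes.length === 0) throw new Error("no quotes to aggregate");
  const aggPrice = median(quotes.map((q) => q.price));
  // Illustrative confidence: spread of the quotes around the aggregate,
  // combined with the publishers' own reported uncertainty.
  const spread = median(quotes.map((q) => Math.abs(q.price - aggPrice)));
  const reported = median(quotes.map((q) => q.confidence));
  return { price: aggPrice, confidence: Math.max(spread, reported) };
}

// Example: three hypothetical publisher quotes for the same asset.
console.log(
  aggregate([
    { publisher: "A", price: 64_980, confidence: 20 },
    { publisher: "B", price: 65_010, confidence: 15 },
    { publisher: "C", price: 65_000, confidence: 25 },
  ])
);
```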

The main changes and advantages brought by Pythnet include:

● Enhanced scalability: By centrally processing and publishing data, Pythnet allows the Pyth network to easily scale to support more blockchains without the need to complexly configure and maintain independent data push systems for each new blockchain.

● Improved efficiency: Centralized aggregation of data reduces duplicated work and potential errors and makes data processing more efficient.

● Data consistency: Through a unified aggregation mechanism, the consistency and accuracy of Pyth data on all blockchains are guaranteed, enhancing the credibility of the data.

● Fast response speed: The data update frequency of 400 milliseconds is ideal for high-frequency trading and other applications that are sensitive to real-time data, such as DeFi and games.

In practice, Pythnet serves as a central data processing and distribution hub, and its "pull on demand" model greatly improves the scalability and efficiency of blockchain data services. Multiple blockchains and DeFi protocols fetch the latest data from Pythnet based on actual demand, rather than having data continuously pushed to every chain. This significantly reduces operating costs and avoids the cost and complexity that grow with the number of chains under the traditional push model.

In concrete terms, when a DeFi protocol on any chain needs Pyth data, it can request price updates from Pythnet through a cross-chain protocol such as Wormhole. Pythnet responds to these requests, and the latest aggregated price data is delivered back to the requesting chain. The whole process completes within a few seconds, greatly improving the timeliness of the data and the responsiveness of the application.
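As a hedged sketch of what this looks like from the consumer side on an EVM chain, the snippet below uses ethers v6 to pay the update fee and submit a signed price update via the updatePriceFeeds / getUpdateFee interface documented in the Pyth EVM SDK; the contract address, RPC URL, and the way the signed update bytes are obtained are placeholders and assumptions here, not a definitive integration:

```typescript
// Hedged sketch of a consumer-side "pull" on an EVM chain, using ethers v6.
// The receiver contract address and RPC details are placeholders; the ABI
// fragment reflects the Pyth EVM SDK's documented interface.
import { ethers } from "ethers";

const PYTH_CONTRACT = "0x0000000000000000000000000000000000000000"; // placeholder
const PYTH_ABI = [
  "function getUpdateFee(bytes[] updateData) view returns (uint256)",
  "function updatePriceFeeds(bytes[] updateData) payable",
];

async function pullPriceOnChain(
  rpcUrl: string,
  privateKey: string,
  updateData: string[] // signed update bytes fetched from Pyth's off-chain service
): Promise<void> {
  const provider = new ethers.JsonRpcProvider(rpcUrl);
  const wallet = new ethers.Wallet(privateKey, provider);
  const pyth = new ethers.Contract(PYTH_CONTRACT, PYTH_ABI, wallet);

  // The consumer pays the (usually small) update fee and submits the signed
  // update; the contract verifies it before the price becomes readable on-chain.
  const fee: bigint = await pyth.getUpdateFee(updateData);
  const tx = await pyth.updatePriceFeeds(updateData, { value: fee });
  await tx.wait();
  console.log("price update landed on-chain in tx", tx.hash);
}
```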

In addition, Pythnet's on-demand model also addresses the infrastructure challenges faced by emerging and smaller blockchain ecosystems. Under the traditional model, small or new chains often struggle to obtain the oracle services they need because of cost and resource constraints, which limits their development and application innovation. With Pythnet, even these chains can obtain the data needed to support the deployment and operation of DeFi and other applications, because they only pull data from Pythnet when required rather than relying on continuous pushes.

The implementation of Pythnet has indeed brought about a series of far-reaching changes, especially in improving the scalability of oracle services. By supporting 50 to 60 blockchains in a short period of time, Pythnet has demonstrated its excellent adaptability and expansion capabilities, far exceeding the number of chains that many other oracle protocols can currently support.

What is the Pyth governance model?

Pyth’s governance model involves several key components designed to ensure transparency, efficiency, and community participation across the network. Here is a detailed description of this governance structure:

1. PythDAO

PythDAO is the core governance entity of the entire Pyth network, consisting of all PYTH token holders. These holders can obtain governance voting rights by staking their PYTH tokens. In this way, they can participate in voting on key decisions of the network, such as the approval and execution of governance proposals.

2. Pythian Committee

This is a committee of active governance participants that handles key operational and policy issues in the Pyth network. Committee members are usually elected by the community, and their main task is to ensure that proposals are properly discussed and executed. They also participate in some specific operational decisions, such as updates to the protocol, the addition of new features, or other key changes.

3. Price Feed Committee

This committee focuses on managing and overseeing the quality and accuracy of price data in the Pyth network. Its responsibilities include deciding which new price feeds can be added to the network and whether existing feeds should be removed. It ensures that all data sources maintain high standards of data quality so as to preserve the credibility and efficiency of the Pyth network.

Governance Process

● Proposal Phase: Any PYTH token holder can propose governance proposals, which may involve protocol updates, new feature additions, or other important decisions.

● Discussion and Review: The proposal will be submitted to the appropriate committee (Pythian Committee or Price Feed Committee), which will review and discuss the proposal.

● Voting: After review, the proposal is submitted to all PYTH token holders for a vote. Voting is usually conducted in proportion to token holdings, so voting power matches each holder's share of tokens.

● Implementation: Once proposals are approved, the relevant technical team or committee will be responsible for implementing them.

This governance model is designed to promote broad community participation and ensure that all important decisions reflect the wishes of the majority of token holders. At the same time, the dedicated committees provide professionalism and efficiency, allowing key operational decisions to respond quickly to market and technological changes. The structure preserves decentralization while effectively managing the operational quality and security of the entire network.

Pyth's Ecosystem Incentive Plan

The Pyth network offers several incentive programs to support community members and promote the development of the ecosystem. These programs are designed to motivate and reward individuals and teams who have made positive contributions to the Pyth ecosystem. The following are the main incentive methods:

1. Community Incentive Program

This program is mainly run through Pyth's Discord community, and its purpose is to reward members who actively participate in community activities. Participation includes moderating the community, contributing to community culture, creating artwork, or designing Pyth-related creative content. In this way, active community members can earn PYTH tokens as rewards for their contributions.

2. Research Incentive Program

This program encourages individuals or teams who are interested in oracle technology, the Pyth network and its development to submit research proposals. After the proposal is reviewed and approved by the community, the researcher can receive PYTH token rewards. This program aims to promote the exploration and theoretical research of the technological frontier and promote the development and improvement of Pyth technology.

3. Developer Incentive Program

This program provides funding for developers who want to contribute code or technical solutions to the Pyth network. Developers submit development proposals, and once the community reviews and approves them, they receive funding to support their projects and technical work. The goal is to encourage participation in technological innovation and ecosystem construction.

Ways to Participate

● Community activities: Actively participate in Pyth's community discussions and activities, such as in the Discord channel.

● Proposal submission: Submit a research or development proposal detailing the research topic or development project’s objectives, expected outcomes, and its potential contribution to the Pyth ecosystem.

● Technical contribution: Directly participate in code contributions or provide technical solutions, and demonstrate work results through GitHub or other platforms.

● Monitor updates: Stay tuned to the latest news from the Pyth community, especially through the official Discord channel, community forums, and the official website. New incentive programs and updates are often announced through these channels.

Pyth New Staking Mechanism

The Pythian community is discussing new staking mechanisms that aim to enhance the dynamics of token economics and further incentivize community members to participate in data verification and governance of the network. These changes will appear in the form of community proposals, and the community will vote on whether to implement them, ensuring the democracy and transparency of the entire process.

Main features of the new staking mechanism

● Staking Rewards: Future staking mechanisms may include providing rewards to users who stake PYTH tokens on data provider accounts. This reward is intended to encourage users to support data providers who provide accurate and reliable data.

● Risk and reward sharing: If the supported data provider maintains accurate and reliable data, stakers share in the rewards the provider earns. Conversely, if the provider publishes malicious data or its accuracy is compromised, the PYTH tokens staked on that provider's account face penalties, reducing the incentive for malicious behavior (an illustrative sketch of this accounting follows this list).

● Increase token liquidity: By encouraging users to stake PYTH tokens, the new mechanism will increase the liquidity and usage scenarios of tokens, thereby potentially increasing the value and market demand of tokens.
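The accounting below is a purely illustrative TypeScript sketch of the mechanics just listed, with made-up numbers and no connection to the actual proposal or its parameters: stakers delegate PYTH to a publisher's pool, share rewards pro rata, and are slashed by some fraction if the publisher misbehaves:

```typescript
// Purely illustrative accounting for the proposed staking mechanics
// (hypothetical numbers; not the actual proposal): stakers delegate PYTH to
// a data publisher, share in its rewards, and are slashed if it misbehaves.

interface PublisherPool {
  publisher: string;
  stakes: Map<string, number>; // staker -> amount of PYTH staked
}

function totalStake(pool: PublisherPool): number {
  return [...pool.stakes.values()].reduce((a, b) => a + b, 0);
}

// Distribute a reward pro rata to each staker's share of the pool.
function distributeReward(pool: PublisherPool, reward: number): Map<string, number> {
  const total = totalStake(pool);
  const payouts = new Map<string, number>();
  for (const [staker, amount] of pool.stakes) {
    payouts.set(staker, (reward * amount) / total);
  }
  return payouts;
}

// Slash a fraction of every stake in the pool when the publisher misbehaves.
function slash(pool: PublisherPool, fraction: number): void {
  for (const [staker, amount] of pool.stakes) {
    pool.stakes.set(staker, amount * (1 - fraction));
  }
}

// Example with hypothetical numbers.
const pool: PublisherPool = {
  publisher: "publisher-1",
  stakes: new Map([["alice", 6_000], ["bob", 4_000]]),
};
console.log(distributeReward(pool, 100)); // alice gets 60, bob gets 40
slash(pool, 0.05);                        // 5% slash: alice 5,700, bob 3,800
console.log(pool.stakes);
```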

The new staking mechanism may bring about the following economic effects:

● Increase the value of tokens: With the implementation of the staking mechanism, more tokens are locked for staking, reducing the circulating supply in the market and potentially increasing the scarcity and value of tokens.

● Enhance network security and data quality: Through rewards and penalties for data providers, the new mechanism will incentivize data providers to maintain high standards of data quality and enhance the reliability and credibility of the entire Pyth network.

● Promote community participation: The new mechanism encourages a wider community to participate in the governance and operation of the Pyth network, enhancing the community’s activity and cohesion.

Regarding the large unlocking in May

In fact, we are not particularly worried about this. If you look closely at the token distribution in the tokenomics, you will find there are five main allocation categories.

First, the rewards for data publishers account for 22% of the total number of tokens, which are directly awarded to data publishers. In the future, publishers may need to stake their PYTH tokens to continue publishing data. In this way, the community can also exercise certain supervision over data publishers. Even if publishers need to unlock these tokens to their accounts, they still need to stake them, so we do not think that these tokens will have a significant impact on the market environment.

The second part, what we call ecosystem growth, accounts for 52% of the tokens, meaning more than half of the supply is tied to ecosystem growth. All incentive programs come from this portion, including rewards for ecosystem contributors such as data publishers and downstream protocols and institutions. These incentive plans are gradual, and even when tokens are unlocked, they are not issued all at once.

Most of the ecosystem-growth tokens are not issued to individuals but to ecosystem builders or institutions. These participants have long-term cooperation plans. For example, some DeFi protocols have been using Pyth services and receiving ecosystem growth rewards; in the future, they will also feed value back into Pyth through staking and other means. We do not think they will come under pressure to sell in large quantities.

The other two parts are the tokens for protocol development, which account for 8.5%, and private sales, which account for 10%. So even though a large amount of supply is about to be unlocked, we do not think it will have a significant impact on the market, given the native demand behind these allocations. Even if there is an impact, it will play out over a long period, so it is not something we need to pay special attention to.

Overall, this is a gradual process. At the same time, since unlocking to circulation is a gradual process, not all unlocked tokens will flow into the market on May 20th. In fact, only a small part will enter the market at that time. Therefore, we are more looking forward to how the increase in liquidity after unlocking will support the construction and development of the entire ecosystem. This is our understanding of this token unlocking.