In recent years, with the release of the GPT series of products, artificial intelligence has been transforming every industry. We have watched AI applications enter our daily work and lives, raising productivity, changing habits, and cutting operating costs for enterprises. It is fair to say that AI is becoming the starting point of the next technological revolution.

The AI technology revolution rests on sufficient computing power, capable algorithm models, and large volumes of training data. At present, high-performance GPU compute is scarce and expensive, algorithms are becoming homogenized, and training data raises compliance and privacy-protection issues. Blockchain's decentralization and distributed storage make it well suited to supporting the development, deployment, and operation of AI models.

1. Using the characteristics of blockchain to solve AI computing power problems


To address the GPU shortage and the high cost of compute in AI development, several blockchain projects are experimenting with decentralized solutions.

Render Network is a high-performance distributed rendering platform built by OTOY, the company behind the industry-leading OctaneRender software. It acts as a bridge between those who need GPU computing power and those with idle GPU resources, so that spare GPU capacity can serve compute-hungry fields such as artificial intelligence and virtual reality at lower cost.

Within this ecosystem, owners of idle GPUs connect their devices to the Render Network to complete rendering tasks, and demand-side users pay them in tokens. This decentralized approach maximizes resource utilization, creates value for participants, and lowers the cost of developing and operating AI applications. Last December, Render made a major technical leap, migrating its infrastructure from Ethereum to the high-throughput Solana chain; Solana's performance and scalability give Render capabilities such as real-time streaming and state compression.

Rendering on Render Network
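The provider-and-demander flow described above can be pictured as a simple matching-and-payment loop: idle GPUs are registered, a job is split across them, and each provider is paid in tokens for the hours it contributed. The sketch below is purely illustrative; the class names, the greedy matching, and the flat per-GPU-hour price are hypothetical and are not Render Network's actual protocol.

```python
from dataclasses import dataclass

# Hypothetical sketch of a decentralized GPU marketplace loop.
# Names and pricing are illustrative, not Render Network's real protocol.

@dataclass
class GpuProvider:
    address: str            # provider's on-chain address
    gpu_hours_free: float   # idle capacity offered to the network
    earned_tokens: float = 0.0

@dataclass
class RenderJob:
    client: str                # demander's on-chain address
    gpu_hours_needed: float
    price_per_gpu_hour: float  # tokens the client is willing to pay

def match_and_settle(job: RenderJob, providers: list[GpuProvider]) -> dict[str, float]:
    """Greedily split a job across idle providers and pay each in tokens."""
    payments: dict[str, float] = {}
    remaining = job.gpu_hours_needed
    for p in sorted(providers, key=lambda p: -p.gpu_hours_free):
        if remaining <= 0:
            break
        used = min(p.gpu_hours_free, remaining)
        p.gpu_hours_free -= used
        reward = used * job.price_per_gpu_hour
        p.earned_tokens += reward
        payments[p.address] = reward
        remaining -= used
    if remaining > 0:
        raise RuntimeError("not enough idle GPU capacity on the network")
    return payments

providers = [GpuProvider("node-a", 40), GpuProvider("node-b", 25)]
job = RenderJob(client="studio-x", gpu_hours_needed=50, price_per_gpu_hour=0.8)
print(match_and_settle(job, providers))  # e.g. {'node-a': 32.0, 'node-b': 8.0}
```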

Akash is a decentralized computing platform that aggregates idle CPUs, GPUs, storage, bandwidth, dedicated IP addresses, and other network resources from around the world and rents them to companies and individuals working on artificial intelligence or other compute-intensive tasks, allowing those resources to realize their full value. Providers who rent out GPU resources earn AKT tokens, while the demand side obtains computing power at low cost. Beyond paying for and settling rented resources, the platform token AKT also serves as an incentive for validators to take part in ecosystem governance and network security. The platform charges a transaction fee on resource payments and settlements. Under this model every participant in the ecosystem benefits, supporting the long-term survival and sustainable growth of the platform's business model.

Akash Network’s real-time statistics of network resources
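The token economics described above, where the tenant pays in AKT, the provider earns the rent, and the network retains a fee at settlement, can be illustrated with a back-of-the-envelope calculation. The hourly price and fee rate below are made-up placeholders, not Akash's actual parameters.

```python
# Illustrative settlement of a compute lease on a decentralized marketplace.
# The prices and the fee rate are placeholders, not Akash's real parameters.

hours_leased = 24 * 30            # one month of a rented GPU instance
price_akt_per_hour = 0.05         # hypothetical lease price agreed in the bid
network_fee_rate = 0.02           # hypothetical take rate charged on settlement

gross_cost = hours_leased * price_akt_per_hour   # what the tenant pays
network_fee = gross_cost * network_fee_rate      # retained by the protocol
provider_income = gross_cost - network_fee       # what the provider receives

print(f"tenant pays       : {gross_cost:.2f} AKT")
print(f"protocol fee      : {network_fee:.2f} AKT")
print(f"provider receives : {provider_income:.2f} AKT")
```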

Livepeer is a video infrastructure network for live and on-demand streaming. Users join the network by running the platform software and contributing their computers' GPUs, bandwidth, and other resources to transcode and distribute video. This model improves the reliability of video streaming while cutting transcoding and distribution costs by as much as 50x. Building on this, the Livepeer project is bringing AI video computing tasks into the Livepeer network, using the GPU network run by its orchestrators to generate high-quality AI video and thereby reduce the cost of creating video content.

As the projects above show, blockchain can leverage its decentralized, distributed nature to put idle network resources to work on the current shortage and high cost of AI computing power. If this model is validated and adopted in more real-world scenarios and by more AI start-ups, it could greatly alleviate the computing power problem.

2. Combination of AI and Blockchain Data

Data is the foundation of AI models, and the data used for training is what differentiates one model from another. Compared with many other sources, blockchain data is standardized, verifiable, and publicly accessible on-chain. Combining AI with blockchain data therefore makes it possible to identify the users behind on-chain addresses.

Arkham is a platform that uses AI to provide on-chain data and intelligence analysis in exchange for rewards. Its proprietary AI engine, ULTRA, labels on-chain addresses with the real-world users behind them, so that otherwise anonymous addresses can be linked to the entities that actually control them. Once the AI model has produced a large library of address labels, users can use Arkham to mine an entity's on-chain transaction activity. As anyone who has investigated virtual-currency crime knows, the biggest pain point is identifying the anonymous addresses used to move funds; regulators can use Arkham's label data to trace and investigate money laundering, fraud, and other crimes carried out through virtual currencies.

Arkham platform’s on-chain data visualization graph

In addition, Arkham offers on-chain intelligence trading. Its Intel Exchange makes on-chain addresses and real-world off-chain information tradable: users can commission intelligence on on-chain activity through bounties, and particularly valuable information can be auctioned on the platform (for a detailed product analysis, see the earlier article "Can Arkham become a powerful tool for on-chain supervision?"). Arkham's AI engine ULTRA was developed with support from Palantir, the big-data analytics and intelligence company that provides AI services to the US government, and from the founder of OpenAI. With training data sources and technical backing of that caliber, Arkham has built one of the most extensive on-chain address label libraries in the industry.
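One intuition behind address-labeling engines of this kind is that once a few addresses have trusted labels, those labels can be propagated to unlabeled addresses that are closely connected to them in the transaction graph. The sketch below shows a single round of majority-vote label propagation over a toy graph; it is only meant to convey the idea, and Arkham's proprietary ULTRA models are certainly far more sophisticated.

```python
from collections import Counter

# Toy transaction graph: address -> set of counterparties it transacted with.
graph = {
    "0xaaa": {"0xbbb", "0xccc"},
    "0xbbb": {"0xaaa", "0xddd"},
    "0xccc": {"0xaaa", "0xddd"},
    "0xddd": {"0xbbb", "0xccc"},
}

# Seed labels, e.g. collected from exchange deposit addresses or public tags.
seed_labels = {"0xaaa": "Exchange-A", "0xbbb": "Exchange-A"}

def propagate_once(graph, labels):
    """Assign each unlabeled address the majority label of its neighbours."""
    new_labels = dict(labels)
    for addr, neighbours in graph.items():
        if addr in labels:
            continue
        votes = Counter(labels[n] for n in neighbours if n in labels)
        if votes:
            new_labels[addr] = votes.most_common(1)[0][0]
    return new_labels

print(propagate_once(graph, seed_labels))
# -> all four toy addresses end up labeled "Exchange-A"
```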

As for the high cost of storing the large volumes of data needed for AI model training, blockchain storage projects such as Arweave, Filecoin, and Storj offer solutions of their own. Whether through Arweave's pay-once, permanent-storage model or Filecoin's pay-as-you-go model, both greatly reduce the cost of data storage. Compared with traditional storage, decentralized storage also avoids the single point of failure in which data is damaged or lost in a natural disaster.
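The difference between a pay-once, store-forever model and a pay-as-you-go model is easiest to see with a small cost calculation. All prices below are placeholders chosen for illustration, not the actual rates charged by Arweave or Filecoin.

```python
# Comparing a one-time "permanent storage" payment with pay-as-you-go storage.
# All rates are hypothetical placeholders, not Arweave's or Filecoin's pricing.

dataset_gb = 500                     # size of an AI training dataset
one_time_price_per_gb = 8.0          # pay once, stored indefinitely (USD/GB)
monthly_price_per_gb = 0.004         # pay-as-you-go rate (USD/GB/month)

one_time_cost = dataset_gb * one_time_price_per_gb

for years in (1, 5, 20):
    recurring_cost = dataset_gb * monthly_price_per_gb * 12 * years
    print(f"{years:>2} years: pay-as-you-go ≈ ${recurring_cost:,.0f} "
          f"vs one-time ≈ ${one_time_cost:,.0f}")
```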

Although ChatGPT can improve work efficiency, optimizing the model and improving the accuracy of AI dialogue requires large amounts of user data for training and tuning, so there is a risk of leaking sensitive and personal data. Zama is an open-source cryptography company building state-of-the-art fully homomorphic encryption (FHE) solutions for blockchain and artificial intelligence. Its Concrete ML library can process sensitive data securely, enabling institutions to collaborate on data while keeping it confidential from one another, improving both efficiency and data security. Privacy-sensitive data such as personal medical records can remain encrypted during training and inference, so each user sees only their own final result and never anyone else's sensitive data.
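Conceptually, Concrete ML wraps familiar scikit-learn-style models so that inference can run on encrypted inputs. The sketch below assumes the scikit-learn-like interface described in Concrete ML's documentation (exact method names and the `fhe` flag may differ between versions) and trains on a synthetic dataset rather than real medical records; it illustrates the workflow, not a complete privacy-preserving pipeline.

```python
# Minimal sketch of FHE-protected inference with Zama's Concrete ML.
# Assumes the scikit-learn-style API described in Concrete ML's docs;
# exact argument names (e.g. the `fhe` flag) may vary between versions.
from concrete.ml.sklearn import LogisticRegression
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic stand-in for sensitive records (e.g. medical features).
X, y = make_classification(n_samples=200, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression()      # quantized model wrapper from Concrete ML
model.fit(X_train, y_train)       # training happens on the data owner's side
model.compile(X_train)            # compile the model into an FHE circuit

# Inference runs on encrypted inputs: the server never sees the plaintext.
y_pred = model.predict(X_test, fhe="execute")
print("accuracy on encrypted inference:", (y_pred == y_test).mean())
```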

3. Combination of AI Agents and Blockchain Projects

OpenAI describes an AI Agent as a system that uses a large language model (LLM) as a brain, driving autonomous understanding, perception, planning, memory, and tool use to carry out complex tasks automatically. With the release of OpenAI's GPTs, more and more AI Agent applications are being put into practice.
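That definition, an LLM "brain" that plans, remembers, and calls tools until the task is done, can be made concrete with a stripped-down control loop. In the sketch below, `call_llm` is a stub that stands in for a real language-model API and the two tools are toys; only the loop structure reflects how such agents work.

```python
# Stripped-down AI-agent loop: the LLM "brain" repeatedly decides which tool
# to call, observes the result, and stops when the task is done.
# `call_llm` is a stub standing in for a real large-language-model API.

def get_price(asset: str) -> str:
    return f"{asset} price: 42000 USD"          # toy tool

def web_search(query: str) -> str:
    return f"top result for '{query}'"          # toy tool

TOOLS = {"get_price": get_price, "web_search": web_search}

def call_llm(task: str, memory: list[str]) -> dict:
    """Stub planner: a real agent would prompt an LLM with task + memory."""
    if not memory:
        return {"action": "get_price", "argument": "BTC"}
    return {"action": "finish", "argument": memory[-1]}

def run_agent(task: str, max_steps: int = 5) -> str:
    memory: list[str] = []                        # the agent's working memory
    for _ in range(max_steps):
        decision = call_llm(task, memory)         # plan the next step
        if decision["action"] == "finish":
            return decision["argument"]           # final answer
        tool = TOOLS[decision["action"]]
        observation = tool(decision["argument"])  # act, then observe
        memory.append(observation)                # remember the result
    return "gave up after max_steps"

print(run_agent("What is the current BTC price?"))
```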

Fetch.ai is a self-learning blockchain network that facilitates economic activity between autonomous AI agents. It consists of four parts: AI Agents, Agentverse, the AI Engine, and the Fetch network. Users can create, develop, and deploy their own AI agents on Agentverse using the agent use cases provided by the platform, and can also publish their agents there to promote them to other users. DeltaV is Fetch.ai's AI-based chat interface: users enter requests in the chat window, and the AI Engine reads the input, converts it into actionable tasks, and selects the most suitable AI Agent in Agentverse to carry them out. Germany's Bosch is also working with Fetch.ai to combine AI Agent technology with mobility and smart-home scenarios, aiming to open the door to a Web3-era Internet of Things economy.

Composition of the Fetch.ai Ecosystem
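On the developer side, Fetch.ai ships a Python library called `uagents` for writing agents that can be registered and discovered through Agentverse. The sketch below assumes the basic `uagents` interface from its documentation (an `Agent` with an interval handler); the seed phrase and the agent's behaviour are placeholders rather than a real deployed service.

```python
# Minimal agent written with Fetch.ai's uagents library, as a sketch of the
# "create and deploy your own AI agent" flow. The seed and the behaviour are
# placeholders; real agents would register services and exchange messages.
from uagents import Agent, Context

price_agent = Agent(name="price_agent", seed="replace-with-your-own-seed")

@price_agent.on_interval(period=60.0)
async def report(ctx: Context):
    # A real agent might query a market API here and offer the result
    # as a service discoverable through Agentverse / DeltaV.
    ctx.logger.info(f"{price_agent.name} is alive at address {price_agent.address}")

if __name__ == "__main__":
    price_agent.run()
```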

In addition, the AI Agent application QnA3.AI brings AI question-and-answer bots, technical-analysis bots, and crypto asset trading capabilities into the Web3 world. Through the QnA3 Bot, users can gather information, analyze it, and execute trades on crypto assets via the product's "Q&A", "technical analysis", and "real-time trading" functions, minimizing the interference of subjective emotion in trading decisions.

4. Possible legal risks

1. Data outbound risks

As mentioned above, some decentralized storage projects are solving the data storage problem for AI model training at lower cost. This lowers the barrier for individuals and start-ups pursuing AI ventures, but this form of decentralized storage may carry the risk of data leaving the country.

The Cyberspace Administration of China issued the "Guidelines for Data Outbound Security Assessment and Declaration (First Edition)", which clearly stipulates that data outbound activities include:

(1) Data processors transmit and store data collected and generated during domestic operations overseas;

(2) The data collected and generated by the data processor is stored in China, but can be queried, retrieved, downloaded or exported by institutions, organizations or individuals outside China;

(3) Other data export activities as required by the Cyberspace Administration of China.

So what counts as leaving the country? Article 89 of the Exit and Entry Administration Law of the People's Republic of China provides that exit means traveling from mainland China to other countries or regions, from mainland China to the Hong Kong and Macao Special Administrative Regions, or from mainland China to Taiwan. In other words, whether something has left the country is judged by jurisdiction.

In decentralized storage projects, users store data on distributed networks such as IPFS. Files stored on the network are split into many small chunks, encrypted, and distributed across storage nodes scattered around the world. If a domestic AI start-up stores its model training data on the nodes of such a decentralized project, there is a real risk that the data leaves the country.
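Why the data's physical location is so hard to control becomes clearer when you look at how a content-addressed network such as IPFS handles a file: it is split into chunks, each chunk is identified by a hash of its content, and any node anywhere may end up holding any chunk. The sketch below imitates only the chunk-and-hash step and assigns chunks to hypothetical nodes at random; real IPFS uses multihash-based CIDs and a Merkle DAG rather than raw SHA-256 hex strings.

```python
import hashlib
import random

# Imitates the chunk-and-hash step of a content-addressed network like IPFS.
# Real IPFS builds multihash CIDs and a Merkle DAG; raw SHA-256 is used here
# only to show why chunks are identified by content, not by location.

CHUNK_SIZE = 256 * 1024  # 256 KiB, a typical default chunk size

def chunk_and_hash(data: bytes) -> list[tuple[str, bytes]]:
    chunks = []
    for i in range(0, len(data), CHUNK_SIZE):
        piece = data[i:i + CHUNK_SIZE]
        chunk_id = hashlib.sha256(piece).hexdigest()   # content-derived ID
        chunks.append((chunk_id, piece))
    return chunks

# Hypothetical storage nodes scattered across different jurisdictions.
nodes = ["node-cn", "node-us", "node-de", "node-sg"]

training_data = b"example AI training corpus " * 100_000  # ~2.7 MB of bytes
placement = {cid: random.choice(nodes) for cid, _ in chunk_and_hash(training_data)}

for cid, node in list(placement.items())[:3]:
    print(f"chunk {cid[:12]}... stored on {node}")
print(f"{len(placement)} chunks spread over {len(set(placement.values()))} node(s)")
```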

2. Risk of leakage of sensitive privacy data

In AI Agent applications such as QnA3.AI, users obtain crypto asset trading information by conversing with the AI and then execute trades. If the private data generated in these question-and-answer dialogues is used by the project team for model training and tuning, there is a risk of privacy leakage. And if such trading data were leaked and exploited by bad actors, it could lead to failed investments and even greater losses.

The original article can be found on the WeChat public account: Chuyan.