According to BlockBeats, the AO Ecosystem announced its 'AI on AO' plan on June 21. The initiative aims to launch on-chain open-source Large Language Models (LLMs), with the goal of bringing any AI model, not just LLMs, on-chain. The plan is based on the Apus Network and utilizes Arweave's permanent on-chain storage to build a decentralized, trustless GPU network. This network is dedicated to providing reliable, efficient, and low-cost computing power for AI training and inference.
AI models on AO can be uploaded to Arweave via ArDrive. This is a significant step toward decentralizing AI: by bringing models on-chain and storing them in Arweave's permanent storage, the AO Ecosystem ensures their durability and longevity, making them a reliable resource for future AI training and inference.
The 'AI on AO' plan reflects the AO Ecosystem's commitment to advancing decentralized AI infrastructure. A trustless GPU network is intended not only to lower the cost of AI training and inference but also to add transparency to how models are run. If the initiative delivers, it could change how AI models are developed, distributed, and used.