According to TechFlow, the AO ecosystem has officially launched the "AI on AO" plan, bringing open-source large language models (LLMs) on chain, with the broader goal of putting any AI model (not just LLMs) on chain. Built on Apus Network and leveraging Arweave's permanent on-chain storage, the plan establishes a decentralized, trustless GPU network dedicated to providing reliable, efficient, and low-cost computing power for AI training and inference. AI models on AO can be uploaded to Arweave via ArDrive.