Original Title: Crypto x AI: 10 Categories We're Watching in 2025
Author: Archetype
Compiled by: Deep Tide TechFlow
1. Agent-to-Agent Interaction
Blockchain's inherent transparency and composability make it an ideal platform for seamless interaction between agents, in which agents built by different entities for different purposes can collaborate on shared tasks. There have already been some exciting experiments, such as agents transferring assets to one another and jointly issuing tokens. We look forward to seeing agent-to-agent interaction expand further: on one hand, toward entirely new application scenarios, such as agent-driven social platforms; on the other, toward optimizing existing business workflows, such as platform certification, micropayments, and cross-platform workflow integration, thereby simplifying today's complex and cumbersome operational processes. - Danny, Katie, Aadharsh, Dmitriy
Aethernet and Clanker jointly issued tokens on Warpcast
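To make the idea of agent-to-agent coordination slightly more concrete, here is a minimal TypeScript sketch of two hypothetical agents settling a paid task over a shared ledger. The Ledger, Agent, and TaskRequest names (and the in-memory ledger standing in for a chain) are illustrative assumptions, not a description of any existing protocol.

```typescript
// A minimal sketch of agent-to-agent interaction with micropayments.
// All names here (Ledger, Agent, TaskRequest) are hypothetical.

interface Ledger {
  transfer(from: string, to: string, amount: number): boolean;
  balanceOf(account: string): number;
}

class InMemoryLedger implements Ledger {
  private balances = new Map<string, number>();

  constructor(initial: Record<string, number>) {
    for (const [account, amount] of Object.entries(initial)) {
      this.balances.set(account, amount);
    }
  }

  balanceOf(account: string): number {
    return this.balances.get(account) ?? 0;
  }

  transfer(from: string, to: string, amount: number): boolean {
    if (this.balanceOf(from) < amount) return false;
    this.balances.set(from, this.balanceOf(from) - amount);
    this.balances.set(to, this.balanceOf(to) + amount);
    return true;
  }
}

interface TaskRequest {
  from: string;    // requesting agent's account
  task: string;    // description of the work
  payment: number; // offered micropayment
}

class Agent {
  constructor(readonly id: string, private ledger: Ledger, private pricePerTask: number) {}

  // Ask another agent to perform a task, paying its asking price up front.
  request(other: Agent, task: string): string | null {
    return other.handle({ from: this.id, task, payment: other.pricePerTask }, this.ledger);
  }

  // Accept a task only if the micropayment settles on the shared ledger.
  handle(req: TaskRequest, ledger: Ledger): string | null {
    if (req.payment < this.pricePerTask) return null;
    if (!ledger.transfer(req.from, this.id, req.payment)) return null;
    return `${this.id} completed "${req.task}" for ${req.from}`;
  }
}

// Usage: two agents built by different teams settle a paid task over the same ledger.
const ledger = new InMemoryLedger({ alpha: 100, beta: 100 });
const alpha = new Agent("alpha", ledger, 5);
const beta = new Agent("beta", ledger, 3);
console.log(alpha.request(beta, "summarize today's governance proposals"));
console.log(ledger.balanceOf("alpha"), ledger.balanceOf("beta")); // 97 103
```

The point of the sketch is the interface, not the implementation: because both agents settle against the same transparent ledger, neither needs to trust the other's internal accounting.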
2. Decentralized Agentic Organizations
Large-scale multi-agent collaboration is another exciting research direction: how can multi-agent systems work together to complete tasks, solve problems, and even govern protocols and systems? In his early 2024 article 'The Promises and Challenges of Crypto + AI Applications', Vitalik proposed using AI agents in prediction markets and adjudication, arguing that at scale, multi-agent systems hold remarkable potential for 'truth' discovery and autonomous governance. We look forward to seeing how far the capabilities of such multi-agent systems can be pushed, and what new possibilities 'collective intelligence' will reveal through experimentation.
Collaboration between agents and humans is also worth exploring: for example, how communities interact around an agent, or how agents organize humans to carry out collective action. We want to see more experiments where agents are aimed at large-scale human coordination. Of course, this requires some verification mechanism, especially when tasks are completed off-chain, but the exploration could lead to unexpectedly wonderful results. - Katie, Dmitriy, Ash
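As a toy illustration of the adjudication idea referenced above, the sketch below aggregates independent agent judgments into a collective verdict via a confidence-weighted vote. The agent names, the confidence field, and the weighting rule are all illustrative assumptions, not a proposal for how such a system should actually be built.

```typescript
// A minimal sketch of aggregating independent agent judgments into a
// collective verdict on a yes/no question. Names and weighting are illustrative.

interface Judgment {
  agentId: string;
  verdict: boolean;   // the agent's answer to a yes/no question
  confidence: number; // self-reported confidence in [0, 1]
}

// Weighted majority: each agent's vote counts in proportion to its confidence.
function adjudicate(judgments: Judgment[]): { verdict: boolean; support: number } {
  let yes = 0;
  let total = 0;
  for (const j of judgments) {
    total += j.confidence;
    if (j.verdict) yes += j.confidence;
  }
  const support = total === 0 ? 0.5 : yes / total;
  return { verdict: support >= 0.5, support };
}

// Usage: three hypothetical agents weigh in on a disputed market outcome.
const result = adjudicate([
  { agentId: "news-agent", verdict: true, confidence: 0.9 },
  { agentId: "onchain-agent", verdict: true, confidence: 0.6 },
  { agentId: "skeptic-agent", verdict: false, confidence: 0.7 },
]);
console.log(result); // { verdict: true, support: ~0.68 }
```

A production system would of course need incentives, stake, and dispute resolution on top of this; the sketch only shows the basic aggregation step.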
3. Agentic Multimedia Entertainment
The concept of digital virtual personas has existed for many years. Hatsune Miku (2007) has sold out concerts in 20,000-seat venues; Lil Miquela (2016) has over 2 million followers on Instagram. More recent examples include the AI virtual streamer Neuro-sama (2022), who has surpassed 600,000 subscribers on Twitch, and the anonymous K-pop boy band PLAVE (2023), whose YouTube views have exceeded 300 million in less than two years. With advances in AI and the application of blockchain to payments, value transfer, and open data platforms, these agents are expected to become more autonomous and may open up a brand new mainstream entertainment category by 2025. - Katie, Dmitriy
Clockwise from top left: Hatsune Miku, Luna from Virtuals, Lil Miquela, and PLAVE
4. Generative/Agentic Content Marketing
In some cases, agents themselves are the product; in others, they complement a product. In the attention economy, continuously producing engaging content is key to the success of any idea, product, or company, and generative/agent-driven content gives teams a powerful tool for a scalable, around-the-clock content pipeline. The field has accelerated thanks to the debate over where memecoins end and agents begin: agents are a powerful distribution channel for memecoins, even when they are not yet fully 'agentic'.
Another example is gaming, where studios increasingly pursue dynamism to keep players engaged. A classic approach is to cultivate user-generated content; purely generative content (in-game items, NPCs, or even entirely generated levels) may be the next phase of this trend. We are curious to see how far agents' capabilities will push the boundaries of content distribution and user interaction in 2025. - Katie
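For readers unfamiliar with what "purely generative content" looks like in practice, here is a minimal sketch of seed-driven item generation: the same seed always yields the same item, which keeps generated content reproducible and checkable. The attribute tables and rarity thresholds below are invented purely for illustration.

```typescript
// A minimal sketch of seed-driven procedural content: the same seed always
// yields the same in-game item. The attribute tables are invented for illustration.

// Small deterministic PRNG (mulberry32 variant) so output depends only on the seed.
function mulberry32(seed: number): () => number {
  let a = seed >>> 0;
  return () => {
    a = (a + 0x6d2b79f5) >>> 0;
    let t = Math.imul(a ^ (a >>> 15), 1 | a);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

interface Item {
  name: string;
  rarity: "common" | "rare" | "legendary";
  power: number;
}

function generateItem(seed: number): Item {
  const rand = mulberry32(seed);
  const materials = ["Iron", "Obsidian", "Crystal", "Ember"];
  const kinds = ["Blade", "Staff", "Sigil", "Gauntlet"];
  const roll = rand();
  return {
    name: `${materials[Math.floor(rand() * materials.length)]} ${kinds[Math.floor(rand() * kinds.length)]}`,
    rarity: roll > 0.98 ? "legendary" : roll > 0.8 ? "rare" : "common",
    power: Math.round(10 + rand() * 90),
  };
}

// Usage: anyone holding the seed can regenerate and verify the same item.
console.log(generateItem(42));
console.log(generateItem(42)); // identical output
```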
5. Next-Gen Art Tools/Platforms
In 2024, we launched IN CONVERSATION WITH, an interview series with crypto artists working in music, visual art, design, curation, and other fields. This year's interviews highlighted a trend: artists drawn to crypto tend to be drawn to frontier technology in general and want it woven more deeply into their creative practice, whether AR/VR objects, code-generated art, or live coding.
Generative art and blockchain have a long shared history, which makes blockchain a natural home for AI art as well. Traditional platforms struggle to showcase and present these art forms. ArtBlocks offered an early look at how digital art can be displayed, stored, monetized, and preserved on-chain, dramatically improving the experience for artists and audiences. Meanwhile, AI tools now let ordinary people easily create their own art. We are very much looking forward to seeing how blockchain can further enhance these tools in 2025. - Katie
KC: Since you feel frustrated and disagree with certain aspects of crypto culture, what motivates you to still choose to participate in Web3? What value does Web3 bring to your creative practice? Is it experimental exploration, economic returns, or something else?
MM: For me, Web3 has a positive impact on both myself and other artists in multiple ways. Personally, platforms that support publishing generative art are particularly important for my creation. For instance, you can upload a JavaScript file that runs in real time when someone mints or collects a piece, generating a unique artwork within the system you designed. This real-time generation process is a core part of my creative practice. Introducing randomness into the systems I write and build profoundly influences my way of thinking about art, both conceptually and technically. However, if it is not showcased on platforms specifically designed for this art form or in traditional galleries, it is often difficult to convey this process to the audience.
In a gallery, an algorithm might run in real time via a projection or screen, or works selected from the algorithm's many outputs might be turned into physical pieces for exhibition. But for audiences less familiar with code as an artistic medium, it is hard to grasp the significance of randomness in that process, and that randomness is an essential part of any artistic practice that uses software generatively. When the final form of a work is merely an image posted to Instagram or a printed physical piece, I sometimes find it difficult to convey the core idea of 'code as a creative medium' to the audience.
The emergence of NFTs excites me because they not only provide a platform for showcasing generative art but also help popularize the concept of 'code as an artistic medium', allowing more people to understand the uniqueness and value of this creative approach.
Excerpt from IN CONVERSATION WITH: Maya Man
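The mechanism Maya Man describes, a script that executes at mint time and is seeded so each collector's output is unique yet reproducible, can be captured in a small sketch. The code below is a generic illustration of that pattern under our own assumptions (a token hash feeding a deterministic drawing routine); it is not any specific platform's implementation.

```typescript
// A minimal sketch of mint-time generative art: a token hash seeds a
// deterministic drawing routine, so each mint yields a unique but
// reproducible piece. Purely illustrative; not a platform's actual API.

import { createHash } from "node:crypto";

// Derive a stream of pseudo-random values in [0, 1) from a token hash.
function* randomStream(tokenHash: string): Generator<number> {
  let state = tokenHash;
  while (true) {
    state = createHash("sha256").update(state).digest("hex");
    yield parseInt(state.slice(0, 8), 16) / 4294967296;
  }
}

interface Artwork {
  tokenHash: string;
  palette: string;
  shapes: { x: number; y: number; radius: number }[];
}

function render(tokenHash: string): Artwork {
  const rand = randomStream(tokenHash);
  const next = () => rand.next().value as number;
  const palettes = ["mono", "pastel", "neon", "earth"];
  const shapes = Array.from({ length: 5 }, () => ({
    x: Math.round(next() * 1000),
    y: Math.round(next() * 1000),
    radius: Math.round(10 + next() * 90),
  }));
  return { tokenHash, palette: palettes[Math.floor(next() * palettes.length)], shapes };
}

// Usage: the same token hash always reproduces the same artwork,
// so the collector's mint, not the artist, picks the final output.
console.log(render("0xabc123"));
```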
6. Data Markets
Ever since Clive Humby coined the phrase 'data is the new oil', companies have taken steps to hoard and monetize user data. Users, however, are gradually realizing that their data is the foundation on which these giants are built, yet they have almost no control over how it is used and capture almost none of the value it generates. The rapid development of powerful AI models sharpens this tension: on one hand, the misuse of user data must be addressed; on the other, as ever larger and higher-quality models exhaust the 'resource' of public internet data, new data sources become increasingly important.
To return control of data to users, decentralized infrastructure offers a vast design space. It demands novel solutions across data storage, privacy protection, data quality assessment, value attribution, and monetization. On the supply-shortage side, the question is how to use crypto's structural advantages to build competitive alternatives, for example by creating higher-value data products through better incentive mechanisms and filtering. Especially while Web2 AI still dominates, exploring how smart contracts can be combined with traditional service level agreements (SLAs) is a direction worth digging into. - Danny
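One way to picture the incentive-and-filtering idea is a quality-weighted reward split with an SLA-style floor, sketched below. The Contribution fields, the 0.6 threshold, and the volume-times-quality weighting are all illustrative assumptions, not a description of any existing data market.

```typescript
// A minimal sketch of quality-weighted rewards in a hypothetical data market:
// contributors are paid from a pool in proportion to volume * quality, and
// contributions below an SLA-style threshold are excluded. Illustrative only.

interface Contribution {
  contributor: string;
  records: number;      // how much data was supplied
  qualityScore: number; // from validators or automated checks, in [0, 1]
}

const SLA_MIN_QUALITY = 0.6; // minimum quality a contribution must meet to be paid

function distributeRewards(pool: number, contributions: Contribution[]): Map<string, number> {
  const eligible = contributions.filter(c => c.qualityScore >= SLA_MIN_QUALITY);
  // Weight each eligible contribution by volume * quality.
  const weights = eligible.map(c => c.records * c.qualityScore);
  const totalWeight = weights.reduce((a, b) => a + b, 0);

  const payouts = new Map<string, number>();
  eligible.forEach((c, i) => {
    const share = totalWeight === 0 ? 0 : (weights[i] / totalWeight) * pool;
    payouts.set(c.contributor, Number(share.toFixed(2)));
  });
  return payouts;
}

// Usage: one contributor fails the quality threshold and receives nothing.
console.log(distributeRewards(1000, [
  { contributor: "alice", records: 500, qualityScore: 0.9 },
  { contributor: "bob", records: 800, qualityScore: 0.7 },
  { contributor: "carol", records: 1000, qualityScore: 0.4 }, // below SLA threshold
]));
```

The interesting design work, of course, lives in how qualityScore is produced and verified; the payout arithmetic itself is the easy part.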
7. Decentralized Compute
In AI development and deployment, compute is as critical as data. For years, large data centers with exclusive access to sites, energy, and hardware have dominated deep learning and AI, but physical resource constraints and the progress of open-source technology are starting to break that pattern.
V1 of decentralized AI compute looked much like Web2 GPU clouds, with no real advantage on either the supply or demand side of hardware. In v2, we are seeing teams build out a fuller stack, including orchestration, routing, and pricing for high-performance compute, along with proprietary features designed to attract demand and improve inference efficiency. Some teams focus on optimizing inference routing across heterogeneous hardware through compiler frameworks, while others are building distributed model training frameworks on top of their compute networks.
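To give a feel for what the routing layer in such a stack decides, here is a deliberately naive sketch: among providers that satisfy a request's VRAM and latency constraints, pick the cheapest. The Provider fields, pricing unit, and selection rule are illustrative assumptions, not a description of any live network.

```typescript
// A minimal sketch of inference routing across heterogeneous providers:
// pick the cheapest provider that satisfies the request's latency and
// memory requirements. Fields and scoring are illustrative assumptions.

interface Provider {
  id: string;
  gpu: string;             // e.g. "A100-80GB"
  vramGb: number;
  pricePerMTokens: number; // price per million tokens, in USD
  p95LatencyMs: number;    // observed 95th-percentile latency
}

interface InferenceRequest {
  minVramGb: number;
  maxLatencyMs: number;
}

function route(providers: Provider[], req: InferenceRequest): Provider | null {
  const candidates = providers.filter(
    p => p.vramGb >= req.minVramGb && p.p95LatencyMs <= req.maxLatencyMs
  );
  if (candidates.length === 0) return null;
  // Among feasible providers, choose the cheapest.
  return candidates.reduce((best, p) =>
    p.pricePerMTokens < best.pricePerMTokens ? p : best
  );
}

// Usage: the consumer-grade card is cheapest but fails the VRAM requirement.
const providers: Provider[] = [
  { id: "dc-a100", gpu: "A100-80GB", vramGb: 80, pricePerMTokens: 1.2, p95LatencyMs: 300 },
  { id: "dc-h100", gpu: "H100-80GB", vramGb: 80, pricePerMTokens: 2.0, p95LatencyMs: 180 },
  { id: "home-4090", gpu: "RTX4090-24GB", vramGb: 24, pricePerMTokens: 0.4, p95LatencyMs: 700 },
];
console.log(route(providers, { minVramGb: 40, maxLatencyMs: 400 })?.id); // "dc-a100"
```

Real orchestration layers also weigh reliability, batching, and verification of results, which is exactly where the accounting problems discussed in the next section appear.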
In addition, an emerging market sometimes called AI-Fi is taking shape, using novel economic mechanisms to turn compute and GPUs into yield-generating assets, or using on-chain liquidity to finance hardware for data centers. Whether decentralized compute can truly realize its potential, however, still depends on closing the gap between the ideal and what the market actually needs. - Danny
8. Compute Accounting Standards
Coordinating heterogeneous compute in decentralized high-performance computing (HPC) networks is a real challenge, and the lack of unified accounting standards makes it harder. AI model outputs vary along many axes: model variants, quantization, and the randomness introduced by temperature and other sampling hyperparameters. On top of that, different GPU architectures and CUDA versions produce different hardware-level results. All of this makes accurately accounting for the capacity of models and compute markets in heterogeneous distributed systems an urgent problem.
In the absence of such standards, this year we repeatedly saw model performance and compute capacity misrepresented, in both quality and quantity, across Web2 and Web3 compute markets. That has forced users to run their own benchmarks to verify what an AI system actually delivers, or to simply limit how much they rely on these markets.
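As one illustration of what a standardized record might have to pin down, the sketch below lists the accounting dimensions a benchmark report could capture and a naive check that two reported runs are even comparable. The field names are our own assumptions, not a proposed standard.

```typescript
// A minimal sketch of the dimensions a standardized benchmark record might
// capture, plus a naive comparability check. Fields are illustrative only.

interface BenchmarkRecord {
  model: string;        // e.g. "llama-3.1-70b-instruct"
  quantization: string; // e.g. "fp16", "int8", "int4"
  temperature: number;  // sampling temperature used during the run
  gpu: string;          // e.g. "H100", "A100"
  tokensPerSecond: number;
  outputDigest: string; // hash of outputs on a fixed prompt set (temperature 0)
}

// Two records are only directly comparable if they fix the same model,
// quantization, and sampling settings; otherwise throughput numbers are
// apples to oranges.
function comparable(a: BenchmarkRecord, b: BenchmarkRecord): boolean {
  return (
    a.model === b.model &&
    a.quantization === b.quantization &&
    a.temperature === b.temperature
  );
}

// With temperature 0 a run should be near-deterministic, so matching output
// digests are a crude sanity check that the claimed model actually ran
// (hardware and kernel differences can still cause small divergences).
function sameOutputs(a: BenchmarkRecord, b: BenchmarkRecord): boolean {
  return a.temperature === 0 && b.temperature === 0 && a.outputDigest === b.outputDigest;
}

const claimed: BenchmarkRecord = {
  model: "llama-3.1-70b-instruct", quantization: "fp16", temperature: 0,
  gpu: "H100", tokensPerSecond: 95, outputDigest: "d41d8c...",
};
const observed: BenchmarkRecord = {
  model: "llama-3.1-70b-instruct", quantization: "int4", temperature: 0,
  gpu: "H100", tokensPerSecond: 160, outputDigest: "98f6bc...",
};
console.log(comparable(claimed, observed)); // false: quantization differs
console.log(sameOutputs(claimed, observed)); // false: different outputs
```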
The crypto space has always prized 'verifiability', so we hope the combination of crypto and AI makes system performance more transparent in 2025. Ordinary users should be able to compare the key output characteristics of a model or compute cluster side by side, and thereby audit and evaluate what the system actually delivers. - Aadharsh
9. Probabilistic Privacy Primitives
In 'The Promises and Challenges of Crypto + AI Applications', Vitalik points out a distinctive tension: 'In cryptography, open source is the only way to achieve security, but in AI, public models (even training data) greatly increase the risk of adversarial machine learning attacks.'
Privacy is not a new research direction in blockchain, but the rapid advance of AI is accelerating the application of privacy-focused cryptography. This year brought meaningful progress in privacy-enhancing technologies such as zero-knowledge proofs (ZK), fully homomorphic encryption (FHE), trusted execution environments (TEE), and multi-party computation (MPC), which are being applied to scenarios such as general-purpose computation over encrypted data and private shared state. At the same time, giants like Nvidia and Apple are using proprietary TEE technology to enable federated learning and private AI inference while keeping hardware, firmware, and models consistent.
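Of the techniques listed above, MPC is the easiest to miniaturize into a teaching example. The sketch below shows additive secret sharing, the building block behind many MPC protocols: each private value is split into random shares that individually reveal nothing, and only the aggregate is reconstructed. This is a toy, not a production MPC implementation, and the prime modulus and two-server setup are illustrative choices.

```typescript
// A toy illustration of additive secret sharing over a prime modulus:
// individual shares leak nothing, but summing everyone's shares recovers
// only the aggregate. Teaching sketch, not production cryptography.

const MODULUS = 2_147_483_647n; // a prime; all arithmetic is done mod this value

function randomShare(): bigint {
  return BigInt(Math.floor(Math.random() * 2_147_483_647));
}

// Split a secret into n shares that sum to the secret mod MODULUS.
function share(secret: bigint, n: number): bigint[] {
  const shares: bigint[] = [];
  let sum = 0n;
  for (let i = 0; i < n - 1; i++) {
    const s = randomShare();
    shares.push(s);
    sum = (sum + s) % MODULUS;
  }
  shares.push(((secret - sum) % MODULUS + MODULUS) % MODULUS);
  return shares;
}

// Reconstruct by summing all shares mod MODULUS.
function reconstruct(shares: bigint[]): bigint {
  return shares.reduce((a, b) => (a + b) % MODULUS, 0n);
}

// Usage: two parties each split a private value (say, a local statistic)
// across two "servers"; each server only ever sees one share per party,
// yet the reconstructed total is correct.
const aliceShares = share(42n, 2);
const bobShares = share(58n, 2);
const server0 = (aliceShares[0] + bobShares[0]) % MODULUS;
const server1 = (aliceShares[1] + bobShares[1]) % MODULUS;
console.log(reconstruct([server0, server1])); // 100n
```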
Going forward, we will be watching how privacy can be preserved across probabilistic state transitions, and how these technologies enable practical decentralized AI on heterogeneous systems: decentralized private inference, storage and access pipelines for encrypted data, and fully autonomous execution environments. - Aadharsh
Apple's Apple Intelligence Stack and Nvidia's H100 GPU
10. Agentic Intents and Next-Gen User Trading Interfaces
One important application of AI agents is helping users complete on-chain transactions autonomously. Over the past 12-16 months, however, terms like 'agent intents', 'agent behavior', and 'solvers' have remained loosely defined, as has the line separating them from conventional 'bot' development of recent years.
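To make the distinction slightly more concrete: a bot hard-codes the exact transaction to submit, while an intent declares the desired outcome and constraints and leaves execution strategy to a solver. The sketch below models that split; the SwapIntent fields, the Solver interface, and the toy venues are illustrative assumptions, not any live intent protocol.

```typescript
// A minimal sketch of the intent framing: the user (or their agent) declares
// a desired outcome plus constraints, and a solver decides how to achieve it.

interface SwapIntent {
  owner: string;
  sellToken: string;
  sellAmount: number;
  buyToken: string;
  minBuyAmount: number; // the constraint: the solver must beat this or the intent fails
  deadline: number;     // unix timestamp after which the intent expires
}

interface Route {
  venue: string;
  expectedBuyAmount: number;
}

interface Solver {
  // A solver searches venues and proposes how to fulfil the intent.
  propose(intent: SwapIntent): Route | null;
}

// A toy solver that just picks the best quote from a static list of venues.
const naiveSolver: Solver = {
  propose(intent) {
    const quotes: Route[] = [
      { venue: "amm-a", expectedBuyAmount: intent.sellAmount * 0.98 },
      { venue: "amm-b", expectedBuyAmount: intent.sellAmount * 0.995 },
    ];
    const best = quotes.reduce((a, b) => (b.expectedBuyAmount > a.expectedBuyAmount ? b : a));
    // Respect the user's constraint: no fill below minBuyAmount.
    return best.expectedBuyAmount >= intent.minBuyAmount ? best : null;
  },
};

// Usage: the intent says *what* the user wants; the solver decides *how*.
const intent: SwapIntent = {
  owner: "0xuser", sellToken: "USDC", sellAmount: 1000,
  buyToken: "DAI", minBuyAmount: 990, deadline: 1767225600,
};
console.log(naiveSolver.propose(intent)); // { venue: "amm-b", expectedBuyAmount: 995 }
```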
In the coming year, we expect more sophisticated language systems, integrated with diverse data types and neural network architectures, to push this space forward. Will agents keep using existing on-chain infrastructure to transact, or develop entirely new tools and methods of their own? Will large language models (LLMs) remain the backbone of these systems, or will something else replace them? At the interface layer, will users transact through natural language? Will the long-standing 'wallets as browsers' thesis finally play out? These are the questions worth watching. - Danny, Katie, Aadharsh, Dmitriy