Original author: Archetype
Original translation: TechFlow
1. Agent-to-Agent Interaction
Blockchain, with its native transparency and composability, is an ideal platform for seamless interaction between intelligent agents. Here, agents developed by different institutions for different purposes can collaborate to complete tasks. There have already been some exciting attempts, such as agents transferring funds to one another and jointly issuing tokens. We look forward to further expansion of agent-to-agent interaction: on the one hand, creating new application scenarios, such as new social platforms driven by agents; on the other hand, streamlining existing enterprise workflows (platform authentication, micropayments, cross-platform workflow integration, and so on), simplifying today's complex and cumbersome operational processes. - Danny, Katie, Aadharsh, Dmitriy
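As a rough illustration of what agent-to-agent micropayments could look like, here is a minimal Python sketch in which one agent pays another per service call over a shared toy ledger. The agent names, prices, and ledger API are hypothetical stand-ins for an actual on-chain balance store:

```python
from dataclasses import dataclass, field

@dataclass
class Ledger:
    """Toy shared ledger standing in for an on-chain balance store."""
    balances: dict = field(default_factory=dict)

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

@dataclass
class Agent:
    name: str
    ledger: Ledger
    price_per_call: int = 5  # micropayment charged per service call

    def request_service(self, provider: "Agent", task: str) -> str:
        # Pay first, then receive the result: the ledger is the trust layer.
        self.ledger.transfer(self.name, provider.name, provider.price_per_call)
        return provider.serve(task)

    def serve(self, task: str) -> str:
        return f"{self.name} completed: {task}"

ledger = Ledger({"alice-agent": 100, "bob-agent": 0})
alice = Agent("alice-agent", ledger)
bob = Agent("bob-agent", ledger)

result = alice.request_service(bob, "summarize governance proposal")
print(result)           # bob-agent completed: summarize governance proposal
print(ledger.balances)  # {'alice-agent': 95, 'bob-agent': 5}
```

A real deployment would replace `Ledger.transfer` with an on-chain token transfer, which is exactly what makes the interaction auditable and composable across institutions.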
Aethernet and Clanker jointly issued a token on Warpcast.
2. Decentralized Agentic Organizations
Large-scale multi-agent collaboration is another exciting research direction. How do multi-agent systems collaborate to complete tasks, solve problems, or even manage protocols and systems? In an article at the beginning of 2024 (Promises and Challenges of Crypto + AI Applications), Vitalik proposed the idea of using AI agents for prediction markets and adjudication. He believes that in large-scale applications, multi-agent systems have enormous potential for 'truth' discovery and autonomous governance. We look forward to seeing how the capabilities of these multi-agent systems are further explored and how 'collective intelligence' reveals more possibilities in experiments.
Moreover, collaboration between agents and humans is also a direction worth exploring. For example, how communities interact around agents or how agents organize humans to complete collective actions. We want to see more experiments with agents aimed at large-scale human collaboration. Of course, this requires some sort of verification mechanism, especially in cases where tasks are completed off-chain. But this exploration could yield some unexpectedly wonderful results. - Katie, Dmitriy, Ash
3. Agentic Multimedia Entertainment
The concept of digital virtual personalities has existed for many years. For example, Hatsune Miku (2007) held sold-out concerts in venues with 20,000 seats; Lil Miquela (2016) has over 2 million followers on Instagram. Recent examples include the AI virtual streamer Neuro-sama (2022), which has over 600,000 subscribers on Twitch; and the virtual Kpop boy group PLAVE (2023), which has surpassed 300 million views on YouTube in less than two years. With the advancement of AI technology and the application of blockchain in payment, value transfer, and open data platforms, these agents are expected to become more autonomous and may open a new mainstream entertainment category in 2025. - Katie, Dmitriy
From the top left, clockwise: Hatsune Miku, Virtuals' Luna, Lil Miquela, and PLAVE
4. Generative/Agentic Content Marketing
In some cases, the agent itself is the product, while in others, the agent complements the product. In the attention economy, continuously producing engaging content is key to the success of any idea, product, or company. Generative/agentic content gives teams a scalable, always-on content creation pipeline. The field has accelerated thanks to the debate over 'memecoins versus agents': even agents that are not yet fully 'agentic' have proven a powerful tool for spreading memecoins.
Another example is that the gaming industry is increasingly pursuing dynamic engagement to maintain user involvement. A classic approach is to guide user-generated content, and purely generative content (such as in-game items, NPCs, or even fully generated levels) may become the next phase of this trend. We are curious to see how the capabilities of agents in 2025 will further expand the boundaries of content distribution and user interaction. - Katie
5. Next-Gen Art Tools/Platforms
In 2024, we launched the IN CONVERSATION WITH series, an interview program that engages with crypto artists from fields such as music, visual arts, design, and curation. This year's interviews made me notice a trend: artists interested in crypto technology often have a passion for cutting-edge technology and hope that these technologies can be more deeply integrated into their creative practice, such as AR/VR objects, code-generated art, and live coding.
The combination of generative art and blockchain technology has a long history, making blockchain an ideal medium for AI art. It is very difficult to showcase and present these art forms on traditional platforms. ArtBlocks provides a preliminary exploration of how digital art can be displayed, stored, monetized, and preserved through blockchain, greatly improving the experience for artists and audiences. Additionally, AI tools also enable ordinary people to easily create their own artworks. We look forward to seeing how blockchain further enhances the capabilities of these tools in 2025. - Katie
KC: Since you feel frustrated and have some disagreements with crypto culture, what motivates you to still participate in Web3? What value does Web3 bring to your creative practice? Is it experimental exploration, economic return, or something else?
MM: For me, Web3 has a positive impact on both myself and other artists in multiple ways. Personally, platforms that support the publication of generative art are particularly important to my creative work. For example, you can upload a JavaScript file that runs in real-time when someone mints or collects a piece, generating unique artworks within the system you designed. This real-time generation process is a core part of my creative practice. Introducing randomness into the systems I write and build has profoundly influenced my thinking about art, both conceptually and technically. However, if it is not presented on a platform specifically designed for this art form or displayed in traditional galleries, it is often difficult to convey this process to the audience.
In galleries, the work may be shown as an algorithm running in real time via projection or screens, or as pieces selected from the algorithm's many outputs and transformed into physical form for exhibition. However, for audiences unfamiliar with code as an artistic medium, it can be difficult to grasp the significance of randomness in this creative process, a crucial part of any artistic practice that uses software generatively. When the final presentation of the work is merely an image posted on Instagram or a printed physical piece, I sometimes find it challenging to convey the core concept of 'code as a creative medium' to the audience.
The emergence of NFTs excites me because they not only provide a platform to showcase generative art but also help popularize the concept of 'code as an artistic medium,' allowing more people to understand the uniqueness and value of this creative approach.
Excerpt from IN CONVERSATION WITH: Maya Man
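The mint-time generation pattern described above, a script that derives a unique piece from each mint, can be sketched in Python. The token hash, trait names, and ranges below are hypothetical, but the core idea of hashing the mint transaction into a deterministic seed mirrors how platforms like ArtBlocks drive generative scripts:

```python
import hashlib
import random

def artwork_params(token_hash: str) -> dict:
    """Derive deterministic artwork parameters from a mint transaction hash.

    The same hash always yields the same piece, so the on-chain hash
    fully determines the collector's output.
    """
    seed = int(hashlib.sha256(token_hash.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    palettes = ["monochrome", "pastel", "neon", "earth"]
    return {
        "palette": rng.choice(palettes),
        "shape_count": rng.randint(3, 50),
        "symmetry": rng.random() < 0.25,  # rarer trait: ~25% of mints
    }

params = artwork_params("0xabc123...")
# Re-deriving from the same hash reproduces the piece exactly:
assert params == artwork_params("0xabc123...")
print(params)
```

The randomness the artist designs lives in the ranges and distributions; the blockchain contributes the seed, and determinism makes every output verifiable by anyone.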
6. Data Markets
Since Clive Humby coined the phrase 'data is the new oil,' companies have moved to hoard and monetize user data. Users, however, are gradually realizing that their data is the cornerstone of these giants' business, yet they have almost no control over how it is used and capture none of the upside. With the rapid development of powerful AI models, this contradiction has become increasingly acute. On one hand, the misuse of user data must be addressed; on the other, as ever-larger and higher-quality models exhaust the public internet as a data 'resource,' new data sources have become especially important.
To return control of data to users, decentralized infrastructure provides a vast design space. This requires innovative solutions across multiple areas, including data storage, privacy protection, data quality assessment, value attribution, and monetization mechanisms. At the same time, addressing the issue of data supply shortages requires us to think about how to leverage technological advantages to build competitive solutions, such as creating higher-value data products through better incentive mechanisms and filtering methods. Especially in the current context where Web2 AI still dominates, how to combine smart contracts with traditional service level agreements (SLAs) is a direction worth exploring in depth. - Danny
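As a toy example of combining payments with an SLA, the sketch below releases an escrowed payment only if a data feed meets agreed freshness and completeness thresholds. The metric names and thresholds are hypothetical, and a real system would settle this logic in a smart contract rather than off-chain Python:

```python
from dataclasses import dataclass

@dataclass
class DataSLA:
    """Toy service-level agreement for a data feed: escrowed payment is
    released only if delivery metrics meet the agreed thresholds."""
    max_staleness_s: int     # newest record may be at most this old
    min_completeness: float  # fraction of requested records delivered
    payment: int             # escrowed amount, in smallest token units

    def settle(self, observed_staleness_s: int, observed_completeness: float) -> int:
        met = (observed_staleness_s <= self.max_staleness_s
               and observed_completeness >= self.min_completeness)
        # Full payout on compliance, otherwise refund to buyer (payout 0).
        return self.payment if met else 0

sla = DataSLA(max_staleness_s=60, min_completeness=0.99, payment=1_000)
print(sla.settle(observed_staleness_s=30, observed_completeness=0.995))   # 1000
print(sla.settle(observed_staleness_s=120, observed_completeness=0.995))  # 0
```

Encoding the thresholds on-chain is what would let data sellers and buyers settle trustlessly, with partial-payout curves as an obvious refinement over this all-or-nothing version.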
7. Decentralized Compute
In the development and deployment of AI, computational power is a key element alongside data. In recent years, large data centers have dominated deep learning and AI by relying on exclusive access to sites, energy, and hardware. However, this pattern is gradually breaking down as physical resources run up against their limits and open-source technology matures.
The v1 stage of decentralized AI computation is similar to Web2's GPU cloud but does not have a significant advantage in hardware supply and demand. In the v2 stage, we see some teams starting to build a more complete tech stack, including orchestration, routing, and pricing systems for high-performance computing, while developing proprietary features to attract demand and enhance inference efficiency. Some teams focus on optimizing inference routing across hardware through compiler frameworks, while others develop distributed model training frameworks on their computing networks.
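A minimal sketch of the routing-and-pricing layer such a v2 stack needs: given a fleet of heterogeneous providers, pick the cheapest one that satisfies a latency budget. The provider names, prices, and latencies are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    price_per_1k_tokens: float  # in some unit of account
    p50_latency_ms: float
    gpu: str

def route(providers, max_latency_ms: float):
    """Return the cheapest provider within the latency budget, or None."""
    eligible = [p for p in providers if p.p50_latency_ms <= max_latency_ms]
    return min(eligible, key=lambda p: p.price_per_1k_tokens, default=None)

fleet = [
    Provider("dc-a", price_per_1k_tokens=0.40, p50_latency_ms=900, gpu="H100"),
    Provider("edge-b", price_per_1k_tokens=0.65, p50_latency_ms=150, gpu="4090"),
    Provider("dc-c", price_per_1k_tokens=0.30, p50_latency_ms=2200, gpu="A100"),
]

print(route(fleet, max_latency_ms=1000).name)  # dc-a
print(route(fleet, max_latency_ms=200).name)   # edge-b
print(route(fleet, max_latency_ms=100))        # None
```

Real orchestration layers add many more dimensions (reliability scores, quantization level, geographic constraints), but the core trade-off of price against quality-of-service is the same.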
Moreover, an emerging market called AI-Fi is forming, which converts computing power and GPUs into revenue-generating assets through innovative economic mechanisms or provides new ways to finance hardware for data centers using on-chain liquidity. However, whether decentralized computing can truly realize its potential still depends on whether the gap between ideas and real needs can be bridged. - Danny
8. Compute Accounting Standards
In decentralized high-performance computing (HPC) networks, coordinating heterogeneous computing resources is an important challenge, and the current lack of unified accounting standards complicates this issue. The output of AI models is diverse, such as model variants, quantization, and randomness adjusted through temperature and sampling hyperparameters. Additionally, different GPU architectures and CUDA versions can lead to discrepancies in hardware output results. These factors make accurately accounting for the capacity of models and computing markets in heterogeneous distributed systems an urgent problem to solve.
Due to the lack of such standards, this year we have repeatedly seen model performance and the quality and quantity of computational resources miscounted in both Web2 and Web3 compute markets. This has forced users to run their own benchmarks to validate the actual performance of AI systems, or to limit their use of compute marketplaces altogether.
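One hedged sketch of what a shared accounting primitive could look like: fingerprint a model's deterministic (temperature-0) outputs on a fixed prompt set, so two providers claiming to serve the same model and quantization can be compared. The model ID, prompts, and provider outputs below are hypothetical:

```python
import hashlib
import json

def output_fingerprint(model_id: str, prompts, outputs, decode_params) -> str:
    """Hash a model's greedy outputs on a fixed prompt set.

    Two providers claiming the same model + quantization should yield the
    same fingerprint; a mismatch flags silent model substitution, different
    quantization, or nondeterministic kernels.
    """
    payload = json.dumps(
        {"model": model_id, "params": decode_params,
         "pairs": list(zip(prompts, outputs))},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode()).hexdigest()

params = {"temperature": 0, "top_p": 1.0, "max_tokens": 64}
prompts = ["2+2=", "Capital of France:"]
provider_a = ["4", "Paris"]
provider_b = ["4", "Paris"]
provider_c = ["4", " Paris."]  # e.g. different quantization behavior

fp_a = output_fingerprint("llm-x-8b-q8", prompts, provider_a, params)
fp_b = output_fingerprint("llm-x-8b-q8", prompts, provider_b, params)
fp_c = output_fingerprint("llm-x-8b-q8", prompts, provider_c, params)
print(fp_a == fp_b)  # True
print(fp_a == fp_c)  # False
```

Published fingerprints like these, posted on-chain, are one way ordinary users could audit a compute market's claims without rerunning full benchmarks themselves.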
The crypto space has always emphasized 'verifiability,' so we hope that by 2025, the combination of crypto and AI will make system performance more transparent. Ordinary users should be able to easily compare the key output characteristics of models or compute clusters, allowing them to audit and evaluate the actual performance of the system. - Aadharsh
9. Probabilistic Privacy Primitives
Vitalik mentioned a unique contradiction in his article (Promises and Challenges of Crypto + AI Applications): 'In cryptography, open source is the only way to achieve security, but in AI, making models (and even training data) public significantly increases the risk of adversarial machine learning attacks.'
Although privacy protection is not a new research direction for blockchain, privacy-related cryptographic technologies are finding application faster as AI develops rapidly. Significant progress has been made in privacy-enhancing technologies this year, such as zero-knowledge proofs (ZK), fully homomorphic encryption (FHE), trusted execution environments (TEE), and secure multi-party computation (MPC). These technologies enable scenarios such as general-purpose computation over encrypted data with private shared state. At the same time, tech giants like Nvidia and Apple are using proprietary TEE technology for federated learning and private AI inference while maintaining consistency across hardware, firmware, and models.
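To make the MPC idea concrete, here is a minimal additive secret-sharing sketch: two parties split their private inputs into shares, and the share-holders compute the sum without any single holder seeing either input. This is a textbook toy with made-up values, not any particular production protocol:

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is mod P

def share(secret: int, n: int = 3):
    """Split `secret` into n additive shares; any n-1 of them reveal nothing."""
    parts = [random.randrange(P) for _ in range(n - 1)]
    parts.append((secret - sum(parts)) % P)
    return parts

def reconstruct(shares):
    return sum(shares) % P

# Two parties secret-share their private inputs...
a_shares = share(1_000_000)
b_shares = share(2_345)

# ...and each share-holder adds its pair of shares locally: the sum is
# computable without any single holder ever seeing either input.
sum_shares = [(a + b) % P for a, b in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 1002345
```

Addition is the easy case; multiplication, comparisons, and the full circuits needed for private inference are where the real protocol engineering (and the overhead) lives.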
In the future, we will focus on how to protect privacy in random state transitions and how these technologies facilitate the practical application of decentralized AI in heterogeneous systems, such as decentralized private inference, storage and access pipelines for encrypted data, and the construction of fully autonomous execution environments. - Aadharsh
Apple's Apple Intelligence stack and Nvidia's H100 GPU
10. Agentic Intents and Next-Gen User Trading Interfaces
An important application of AI agents is helping users autonomously complete transactions on-chain. However, over the past 12-16 months, definitions of terms like 'agent intents,' 'agent behavior,' and 'solvers' have remained vague, and the distinction from traditional 'bot' development is not sharp enough.
In the coming year, we look forward to seeing more complex language systems combined with various data types and neural network architectures, driving the development of this field. Will agents continue to use existing on-chain systems to complete transactions, or will they develop entirely new tools and methods? Will large language models (LLMs) still serve as the core of these systems, or will they be supplanted by other technologies? At the user interface level, will users interact with the system to complete transactions through natural language? Will the classic 'wallet as a browser' theory become a reality? These are all questions worth exploring. - Danny, Katie, Aadharsh, Dmitriy
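A minimal sketch of the intent pattern these questions revolve around: the user states the outcome they want (a declarative swap with a slippage bound) and competing solvers bid to fulfill it. The token symbols, solver names, and quote values are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class SwapIntent:
    """A declarative trading intent: what the user wants, not how to do it."""
    sell_token: str
    buy_token: str
    sell_amount: float
    min_buy_amount: float  # slippage bound every solver must respect

@dataclass
class Quote:
    solver: str
    buy_amount: float  # how much buy_token this solver can deliver

def settle(intent: SwapIntent, quotes):
    """Pick the best quote that satisfies the intent's constraint, else None."""
    valid = [q for q in quotes if q.buy_amount >= intent.min_buy_amount]
    return max(valid, key=lambda q: q.buy_amount, default=None)

intent = SwapIntent("USDC", "ETH", sell_amount=3500.0, min_buy_amount=1.0)
quotes = [Quote("solver-a", 0.98), Quote("solver-b", 1.02), Quote("solver-c", 1.01)]

print(settle(intent, quotes).solver)             # solver-b
print(settle(intent, [Quote("solver-a", 0.5)]))  # None
```

Whether an LLM translates natural language into a `SwapIntent`-like structure, and whether solvers use existing on-chain venues or new ones, are exactly the open questions above; the intent schema itself is the stable interface between them.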