Original: Archetype
Compiled by: Yuliya, PANews
As artificial intelligence and blockchain technology both develop rapidly, their intersection is giving rise to exciting possibilities for innovation. This article examines ten important areas to watch in 2025, from interactions between intelligent agents to decentralized computing, and from the transformation of data markets to breakthroughs in privacy technology.
1. Inter-agent Interaction
The inherent transparency and composability features of blockchain make it an ideal foundational layer for inter-agent interactions. Intelligent agents developed by different entities for different purposes can seamlessly interact on the blockchain. Some noteworthy experimental applications have already emerged, such as fund transfers between agents and joint issuance of tokens.
The future development potential of inter-agent interaction mainly lies in two aspects: first, creating entirely new application fields, such as new social scenarios driven by agent interactions; second, optimizing existing enterprise-level workflows, including traditionally cumbersome processes like platform certification and verification, micropayments, and cross-platform workflow integration.
Aethernet and Clanker implemented joint token issuance on the Warpcast platform
2. Decentralized Intelligent Agent Organizations
Large-scale multi-agent coordination is another exciting research area: how multi-agent systems collaborate to accomplish tasks, solve problems, and govern systems and protocols. In 'The Promise and Challenges of Crypto + AI Applications', published in early 2024, Vitalik discussed the potential of using AI agents in prediction markets and arbitration. He argued that, at a macro level, multi-agent systems show significant potential for 'truth' discovery and for autonomous governance systems.
The industry is continuously exploring and experimenting with the boundaries of multi-agent systems' capabilities and various forms of 'collective intelligence'. As an extension of coordination among agents, the coordination between agents and humans also constitutes an interesting design space, particularly regarding how communities interact around agents and how agents organize humans for collective action.
Researchers are particularly focused on agent experiments where the objective functions involve large-scale human coordination. Such applications require corresponding validation mechanisms, especially when human work is completed off-chain. This human-machine collaboration may give rise to some unique and interesting emergent behaviors.
3. Intelligent Agent Multimedia Entertainment
The concept of digital personas has existed for decades:
As early as 2007, Hatsune Miku was holding sold-out concerts in 20,000-seat venues;
the virtual influencer Lil Miquela, created in 2016, has over 2 million followers on Instagram;
the AI virtual host Neuro-sama, launched in 2022, has amassed over 600,000 subscribers on Twitch;
and the virtual K-pop group PLAVE, formed in 2023, has gained over 300 million views on YouTube in less than two years.
With advancements in AI infrastructure and the integration of blockchain in payment, value transfer, and open data platforms, these intelligent agents are expected to gain a higher degree of autonomy by 2025 and may pioneer a new mainstream entertainment category.
Clockwise from top left: Hatsune Miku, Luna from Virtuals, Lil Miquela, and PLAVE
4. Generative/Intelligent Agent Content Marketing
Unlike the situation where intelligent agents themselves are products as mentioned earlier, intelligent agents can also serve as complementary tools to products. In today's attention economy, continuously producing engaging content is crucial for the success of any creation, product, or company. Generative/intelligent agent content is becoming a powerful tool for teams to ensure 24/7 uninterrupted content production.
The development in this field has been accelerated by discussions on the boundaries between Meme coins and intelligent agents. Even if 'intelligence' has not yet been fully realized, intelligent agents have become a powerful means for Meme coins to gain traction.
The gaming sector provides another typical case. Modern games increasingly need to maintain dynamism to sustain user engagement. Traditionally, fostering user-generated content (UGC) has been the classic method for creating game dynamism. Purely generative content (including in-game items, NPC characters, fully generated levels, etc.) may represent the next stage of this evolution. Looking ahead to 2025, the capabilities of intelligent agents will greatly expand the boundaries of traditional distribution strategies.
5. Next-Generation Art Tools and Platforms
The 'In Conversation With' series launched in 2024 features interviews with artists working in music, visual art, design, and curation at and around the frontier of cryptocurrency. These interviews reveal an important observation: artists interested in cryptocurrency often also follow broader cutting-edge technologies and tend to incorporate them deeply into the aesthetics or core of their artistic practice, through AR/VR objects, code-based art, and live coding.
Generative art and blockchain technology have historically had a synergistic effect, making their potential as AI art infrastructure even more apparent. It is extremely difficult to appropriately display these new art media on traditional exhibition platforms. The ArtBlocks platform showcases a future scenario for digital art display, storage, monetization, and preservation using blockchain technology, significantly improving the overall experience for artists and audiences.
Beyond showcasing functionality, AI tools also expand the ability of the general public to create art. This trend of democratization is reshaping the landscape of artistic creation. Looking ahead to 2025, how blockchain technology will expand or empower these tools will be a highly attractive area of development.
Excerpted from 'In Conversation With: Maya Man'
6. Data Market
It has been nearly 20 years since Clive Humby declared that 'data is the new oil', and companies have been taking aggressive measures to hoard and monetize user data. Users have come to realize that their data is the cornerstone of these multi-billion-dollar companies, yet they have almost no control over how their data is used or over the profits generated from it. With the rapid development of powerful AI models, this contradiction is becoming even more pronounced.
The opportunities facing the data market can be viewed from two aspects: one is to solve the problem of user data being exploited, and the other is to address the shortage of data supply, as larger and better models are consuming the easily accessible 'oil field' of public internet data, necessitating new sources of data.
Data power returns to users
Leveraging decentralized infrastructure to return data power to users is a vast design space that requires innovative solutions across multiple fields. Some of the most urgent questions include:
Where data is stored, and how privacy is protected during storage, transmission, and computation;
How to objectively assess, filter, and measure data quality;
What mechanisms to use for attribution and monetization (especially tracing value back to the source after inference);
What orchestration or data-retrieval systems to use in a diverse model ecosystem.
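On the attribution question, one commonly discussed building block is committing to a dataset with a Merkle root: a contributor or marketplace publishes a single hash on-chain, and any individual record can later be proven to belong to that dataset without republishing the data. Below is a minimal, illustrative sketch of such a commitment using a plain binary Merkle tree; real systems must also address the privacy and quality-measurement questions listed above.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Compute the root of a binary Merkle tree over the leaves.
    An odd node at the end of a level is carried up unchanged
    (one common simple convention)."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level), 2):
            if i + 1 < len(level):
                nxt.append(h(level[i] + level[i + 1]))
            else:
                nxt.append(level[i])
        level = nxt
    return level[0]

# A contributor commits to four records with a single 32-byte root;
# only the root (not the records) would be stored on-chain.
records = [b"record-1", b"record-2", b"record-3", b"record-4"]
root = merkle_root(records)
```

Because changing any record changes the root, the commitment lets downstream attribution systems check, cheaply and publicly, whether a disputed record was part of the original dataset.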
Supply Constraints
In addressing supply constraints, the key is not simply to replicate the Scale AI model with tokens, but to understand where favorable technological conditions give us an edge and how to build competitive solutions, whether through scale, quality, or better incentive (and filtering) mechanisms, in order to create higher-value data products. This is especially important because most demand still comes from Web2 AI: thinking about how to combine smart-contract execution mechanisms with traditional service-level agreements (SLAs) and tooling is an important area of research.
7. Decentralized Computing
If data is a fundamental element in the development and deployment of AI, then computing power is another critical component. The traditional large data center model, with its unique advantages in location, energy, and hardware, has largely dominated the trajectory of deep learning and AI development over the past few years. However, physical constraints and the development of open-source technologies are challenging this paradigm.
The first phase (v1) of decentralized AI computing is essentially a replica of Web2 GPU cloud services, lacking a real advantage on the supply side (hardware or data centers) and with limited organic demand.
In the second phase (v2), some outstanding teams are building a complete tech stack based on heterogeneous high-performance computing (HPC) supply, demonstrating unique capabilities in scheduling, routing, and pricing, while developing proprietary features to attract demand and cope with profit compression, especially on the inference side. Teams are also beginning to differentiate in usage scenarios and market strategies, with some focusing on integrating compiler frameworks for efficient inference routing across hardware, while others are pioneering distributed model training frameworks on their developed computing networks.
The industry is even beginning to see the rise of the AI-Fi market, which has led to innovative economic primitives that convert computing power and GPUs into revenue-generating assets or provide alternative funding sources for data centers through on-chain liquidity.
The main question is to what extent decentralized AI will actually be developed and deployed on decentralized computing infrastructure, or whether, as in the storage domain, the gap between ideals and actual needs will persist, keeping the concept from ever fully realizing its potential.
8. Computational Accounting Standards
In the incentive mechanisms of decentralized high-performance computing networks, a major challenge faced in coordinating heterogeneous computing resources is the lack of unified computational accounting standards. AI models add multiple unique complexities to the output space of high-performance computing, including model variants, quantization schemes, and levels of randomness adjustable through model temperature and sampling hyperparameters. Moreover, AI hardware may produce different output results due to variations in GPU architectures and CUDA versions. These factors ultimately necessitate the establishment of standards to regulate how models and computing markets measure their computational capacity in heterogeneous distributed systems.
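The paragraph above can be made concrete with a small sketch: before two providers' outputs can be compared at all, the inference call must be pinned down (model variant including quantization, decoding parameters, seed) and serialized canonically before hashing. The sketch below illustrates that idea; all model names and fields are hypothetical, and the approach is only meaningful when decoding is deterministic (temperature 0 or a fixed seed), since sampled outputs legitimately differ.

```python
import hashlib
import json

def fingerprint_inference(model_id: str, prompt: str, output: str,
                          temperature: float, seed: int) -> str:
    """Produce a canonical fingerprint of one inference call so that
    outputs from heterogeneous providers can be compared byte-for-byte."""
    record = {
        "model": model_id,  # must include the quantization variant
        "prompt": prompt,
        "output": output,
        "temperature": temperature,
        "seed": seed,
    }
    # json.dumps with sort_keys gives a canonical byte serialization.
    canonical = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

# Two providers returning the same output under identical settings
# produce identical fingerprints...
a = fingerprint_inference("llm-x-int8", "2+2=", "4", 0.0, 42)
b = fingerprint_inference("llm-x-int8", "2+2=", "4", 0.0, 42)
# ...while a different quantization variant yields a different one.
c = fingerprint_inference("llm-x-fp16", "2+2=", "4", 0.0, 42)
```

Even this toy version shows why standards matter: without agreement on which fields belong in the record and how they are serialized, two honest providers cannot prove they ran the same computation.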
Partly due to the lack of these standards, 2024 saw multiple cases in both the Web2 and Web3 domains where models and computing markets failed to accurately account for the quality and quantity of their computation. This forced users to run their own comparative model benchmarks and to rate-limit computing markets as a crude proof of work in order to audit the real performance of these AI layers.
Looking ahead to 2025, the intersection of cryptography and AI is expected to deliver breakthroughs in verifiability, making AI systems easier to verify than their traditional counterparts. For ordinary users, being able to fairly compare every defined aspect of a model's or computing cluster's output is crucial, as it enables auditing and evaluation of system performance.
9. Probabilistic Privacy Primitives
In 'Prospects and Challenges of Cryptocurrency and AI Applications', Vitalik pointed out a unique challenge in connecting cryptocurrency and AI: 'In the field of cryptography, open source is the only way to achieve real security, but in the AI field, the openness of models (and even their training data) greatly increases the risk of adversarial machine learning attacks.'
While privacy is not a new area of blockchain research, the rapid development of AI is accelerating the research and application of the cryptographic primitives that support it. 2024 saw significant progress in privacy-enhancing technologies, including Zero-Knowledge proofs (ZK), Fully Homomorphic Encryption (FHE), Trusted Execution Environments (TEEs), and Multi-Party Computation (MPC), applied to common scenarios such as private shared state and computation over encrypted data. Meanwhile, centralized AI giants like NVIDIA and Apple are using proprietary TEEs for federated learning and private AI inference, keeping hardware, firmware, and models consistent across systems while preserving privacy.
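To make one of these primitives concrete, here is a toy illustration of additive secret sharing, a basic building block of many MPC protocols: a private value is split into random shares that individually reveal nothing, yet the parties holding them can jointly compute a sum without anyone ever seeing the inputs. This is a didactic sketch only, with no network layer and no protection against malicious parties.

```python
import secrets

P = 2**61 - 1  # a large prime modulus; all arithmetic is mod P

def share(value: int, n_parties: int) -> list[int]:
    """Split `value` into n additive shares: any n-1 shares look
    uniformly random, but all n shares sum to the value mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % P

# Two users secret-share their private inputs among three servers.
x_shares = share(100, 3)
y_shares = share(250, 3)

# Each server adds the two shares it holds -- no server sees x or y.
sum_shares = [(x + y) % P for x, y in zip(x_shares, y_shares)]

# Reconstructing the result shares yields x + y without exposing
# either input on its own.
total = reconstruct(sum_shares)
```

Addition is the easy case; making multiplication, comparisons, and full model inference work under such schemes is exactly where the research effort described above is concentrated.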
Building on these developments, the industry is closely watching progress in privacy for probabilistic state transitions, and how such technologies can accelerate the practical rollout of decentralized AI applications on heterogeneous systems, including decentralized private inference, storage and access pipelines for encrypted data, and fully sovereign execution environments.
Apple's AI tech stack and NVIDIA's H100 graphics processor
10. Agent Intent and Next-Generation User Transaction Interfaces
Over the past 12 to 16 months, definitions of concepts such as intents, agent behavior, agent intents, solvers, and agent solvers have remained ambiguous, with no clear delineation of how they differ from the traditional 'bot' development of recent years. Autonomous on-chain transactions executed by AI agents remain one of the most viable application scenarios.
In the next 12 months, the industry expects to see a combination of more complex language systems with different data types and neural network architectures, thus advancing the overall design space. This raises several key questions:
Will agents use existing on-chain transaction systems, or will they develop their own tools and methods?
Will large language models continue to serve as the backend for these agent transaction systems, or will entirely new systems emerge?
At the interface level, will users begin to use natural language for transactions?
Will the classic concept of 'a wallet is a browser' eventually be realized?
The answers to these questions will profoundly influence the future development direction of cryptocurrency trading. With advancements in AI technology, agent systems may become more intelligent and autonomous, enabling them to better understand and execute user intentions.
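Whatever the answers to the questions above, the core distinction between an intent and a traditional transaction can be sketched concretely: the user declares the outcome they want and the constraints it must satisfy, while the route, venue, and ordering are left to a competing solver. The sketch below is purely illustrative; every field and function name is invented for the example.

```python
from dataclasses import dataclass

@dataclass
class SwapIntent:
    """A declarative intent: the user states *what* they want (at
    least `min_out` of `token_out` in exchange for `amount_in` of
    `token_in`, before `deadline`) and leaves the *how* to a solver."""
    token_in: str
    token_out: str
    amount_in: int
    min_out: int
    deadline: int  # unix timestamp

def fill_is_valid(intent: SwapIntent, amount_out: int, now: int) -> bool:
    """The only check the user's side needs: does the solver's
    proposed fill satisfy the declared constraints?"""
    return amount_out >= intent.min_out and now <= intent.deadline

intent = SwapIntent("USDC", "ETH", amount_in=1_000, min_out=3,
                    deadline=1_800_000_000)
ok = fill_is_valid(intent, amount_out=3, now=1_700_000_000)
bad = fill_is_valid(intent, amount_out=2, now=1_700_000_000)
```

In this framing, a natural-language trading interface is a front end that compiles a user's request into such a constraint object, which is one way the 'wallet as browser' idea could take shape.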