Original: Archetype
Compiled by: Yuliya, PANews
In the rapidly evolving fields of artificial intelligence and blockchain technology, the intersection of the two domains is giving rise to exciting new possibilities. This article analyzes ten important areas to watch in 2025, from inter-agent interaction to decentralized computing, and from the transformation of data markets to breakthroughs in privacy technologies.
1. Inter-Agent Interaction
The inherent transparency and composability features of blockchain make it an ideal foundational layer for inter-agent interactions. Intelligent agents developed by different entities for different purposes can interact seamlessly on the blockchain. Some remarkable experimental applications have already emerged, such as fund transfers between agents and joint token issuance.
The potential for the future development of interactions between agents is primarily reflected in two aspects: first, pioneering entirely new application areas, such as novel social scenarios driven by agent interactions; second, optimizing existing enterprise-level workflows, including platform certification and verification, micropayments, and the traditionally cumbersome aspects of cross-platform workflow integration.
Aethernet and Clanker Implement Joint Token Issuance on Warpcast Platform
2. Decentralized Intelligent Agent Organizations
Large-scale multi-agent coordination is another exciting research area: how multi-agent systems collaborate to complete tasks, solve problems, and govern systems and protocols. In early 2024, Vitalik raised the possibility of using AI agents in prediction markets and arbitration in his article 'The Promise and Challenges of Crypto + AI Applications'. He believes that, from a macro perspective, multi-agent systems show significant potential for 'truth' discovery and for autonomous governance systems.
The industry is continuously exploring and experimenting with the boundaries of multi-agent systems' capabilities and various forms of 'collective intelligence'. As an extension of coordination between agents, the coordination between agents and humans also constitutes an interesting design space, especially regarding how communities interact around agents and how agents organize humans to engage in collective actions.
Researchers pay particular attention to agent experiments where the objective function involves large-scale human coordination. Such applications require corresponding verification mechanisms, especially when human work is done off-chain. This human-machine collaboration may give rise to some unique and interesting emergent behaviors.
3. Intelligent Agent Multimedia Entertainment
The concept of digital personas has existed for decades.
Hatsune Miku, launched in 2007, has sold out concerts in 20,000-seat venues;
the virtual influencer Lil Miquela, created in 2016, has over 2 million followers on Instagram;
the AI virtual streamer Neuro-sama, launched in 2022, has accumulated over 600,000 subscribers on Twitch;
and the virtual K-pop group PLAVE, debuted in 2023, has garnered over 300 million views on YouTube in under two years.
With the advancement of AI infrastructure and the integration of blockchain in payment, value transfer, and open data platforms, these intelligent agents are expected to achieve a higher degree of autonomy by 2025, potentially creating a new mainstream category of entertainment.
From the top left corner clockwise: Hatsune Miku, Luna from Virtuals, Lil Miquela, and PLAVE
4. Generative/Intelligent Agent Content Marketing
Distinct from cases where the intelligent agent itself is the product, agents can also serve as complementary tools for products. In today's attention economy, continuously producing engaging content is crucial for the success of any creative project, product, or company. Generative/agentic content is becoming a powerful tool for teams to sustain 24/7 content production.
The development of this field has been accelerated by discussions about the boundaries between meme coins and intelligent agents. Even though 'intelligence' has not been fully realized, intelligent agents have already become a powerful means for meme coins to gain traction.
The gaming sector provides another typical case. Modern games increasingly need to maintain dynamism to sustain user engagement. Traditionally, cultivating user-generated content (UGC) has been the classic method for creating game dynamism. Purely generative content (including in-game items, NPC characters, fully generated levels, etc.) may represent the next stage of this evolution. Looking ahead to 2025, the capabilities of intelligent agents will greatly expand the boundaries of traditional distribution strategies.
5. Next-Generation Art Tools and Platforms
The 'IN CONVERSATION WITH' dialogue series, launched in 2024, interviews artists active in music, visual arts, design, and curation in and around the cryptocurrency field. These interviews reveal an important observation: artists interested in cryptocurrency often follow broader frontier technologies and tend to integrate them deeply into the aesthetics or core of their practice, such as AR/VR objects, code-based art, and live-coding art.
Generative art and blockchain technology have historically had a synergistic effect, making their potential as AI art infrastructure more apparent. It is extremely difficult to properly showcase these new forms of artistic media on traditional display platforms. The ArtBlocks platform showcases a future vision of digital art display, storage, monetization, and preservation using blockchain technology, significantly improving the overall experience for artists and audiences.
In addition to showcasing capabilities, AI tools have also expanded the ability of the general public to create art. This trend towards democratization is reshaping the landscape of artistic creation. Looking ahead to 2025, how blockchain technology will expand or empower these tools will be a highly attractive development direction.
Excerpt from 'In Conversation with Maya Man'
6. Data Market
Nearly two decades have passed since Clive Humby declared that 'data is the new oil', and major companies have taken aggressive steps to hoard and monetize user data. Users have realized that their data is the cornerstone of these billion-dollar businesses, yet they have little control over how it is used and no share in the profits it generates. With the rapid development of powerful AI models, this tension has only sharpened.
The opportunities facing data markets fall into two categories: first, addressing the exploitation of user data; second, solving the data supply shortage, as ever larger and better models are exhausting the easily accessible 'oil field' of public internet data and require new sources.
Data Power Returns to Users
How to use decentralized infrastructure to return data power to users is a broad design space that demands innovative solutions across multiple fields. Some of the most pressing issues include:
Data storage locations and how to protect privacy during storage, transmission, and computation processes;
How to objectively assess, filter, and measure data quality;
What mechanisms to use for attribution and monetization (especially in tracing value back to its source after inference);
And what kind of orchestration or data retrieval systems to use in a diverse model ecosystem.
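Of the issues above, attribution is perhaps the most tractable to illustrate. As a minimal sketch (all names and addresses here are hypothetical, and real systems would anchor such records on-chain), a content-addressed record can tie a data contribution to its contributor so that value can later be traced back to its source:

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class DataRecord:
    """A content-addressed record tying a data contribution to its contributor."""
    contributor: str   # e.g. a wallet address (hypothetical)
    payload: str       # the contributed data, or a pointer to off-chain storage

    def content_hash(self) -> str:
        # Hash the canonical JSON encoding so the same contribution
        # always yields the same identifier, regardless of who stores it.
        canonical = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(canonical).hexdigest()

def attribute(records, query_hash):
    """Trace a content hash back to its contributor (simple index lookup)."""
    index = {r.content_hash(): r.contributor for r in records}
    return index.get(query_hash)

records = [
    DataRecord("0xAlice", "labelled image batch #1"),
    DataRecord("0xBob", "sensor log 2024-11"),
]
h = records[0].content_hash()
print(attribute(records, h))  # -> 0xAlice
```

Monetization and value tracing after inference remain much harder problems; the sketch only shows the identification step on which any such mechanism would rest.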
Supply Constraints
In addressing supply constraints, the key is not simply replicating Scale AI's model with tokens, but understanding where we can build advantages under favorable technological conditions and how to construct competitive solutions, whether in terms of scale, quality, or better incentives (and filtering mechanisms) to create higher-value data products. Especially as most demand still comes from Web2 AI, considering how to combine smart contract execution mechanisms with traditional service level agreements (SLA) and tools is an important research area.
7. Decentralized Computing
If data is a fundamental element in AI development and deployment, then computing power is another key component. The traditional model of large data centers has largely dominated the trajectory of deep learning and AI development over the past few years due to its unique site, energy, and hardware advantages. However, physical limitations and the development of open-source technologies are challenging this paradigm.
The first phase (v1) of decentralized AI computation is essentially a replica of Web2 GPU cloud services, with no real advantages on the supply side (hardware or data centers) and limited organic demand.
In the second phase (v2), some strong teams have built complete technology stacks on top of heterogeneous high-performance computing (HPC) supply, demonstrating unique capabilities in scheduling, routing, and pricing, and developing proprietary features to attract demand and cope with margin compression, especially on the inference side. Teams have also begun to differentiate in use cases and go-to-market strategy: some focus on integrating compiler frameworks for efficient inference routing across diverse hardware, while others pioneer distributed model-training frameworks on the compute networks they have built.
The industry is even beginning to see the rise of AI-Fi markets, with innovative economic primitives emerging that transform computing power and GPUs into income-generating assets, or utilize on-chain liquidity as alternative funding sources for data centers to acquire hardware.
The central question is to what extent decentralized AI development and deployment will actually run on decentralized computing infrastructure, or whether, as in the storage domain, a persistent gap between ideals and actual demand will keep the concept from ever fully realizing its potential.
8. Compute Accounting Standards
In the incentive mechanisms of decentralized high-performance computing networks, a major challenge in coordinating heterogeneous computing resources is the lack of unified compute accounting standards. AI models add several unique complexities to the output space of high-performance computing, including model variants, quantization schemes, and degrees of randomness adjustable via temperature and sampling hyperparameters. In addition, AI hardware can produce different outputs depending on GPU architecture and CUDA version. All of this creates the need for standards governing how models and compute markets measure their computational capabilities in heterogeneous distributed systems.
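To see why sampling hyperparameters alone frustrate output comparisons, consider a toy next-token sampler (illustrative only; real inference stacks add quantization and kernel-level nondeterminism on top of this):

```python
import math
import random

def sample_token(logits, temperature, seed=None):
    """Toy next-token sampler: temperature 0 means deterministic argmax;
    any positive temperature makes the output depend on the RNG state."""
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    rng = random.Random(seed)
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    probs = [math.exp(s - m) for s in scaled]
    total = sum(probs)
    probs = [p / total for p in probs]
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r <= cumulative:
            return i
    return len(logits) - 1

logits = [2.0, 1.0, 0.5]
# Greedy decoding is reproducible given identical arithmetic...
assert sample_token(logits, 0) == sample_token(logits, 0)
# ...but sampled decoding only matches when the seed (and everything
# else: kernels, quantization, operation ordering) is identical too.
assert sample_token(logits, 1.0, seed=42) == sample_token(logits, 1.0, seed=42)
```

Two providers running the "same" model at the same temperature can therefore legitimately return different outputs, which is exactly what a compute accounting standard would need to specify away.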
Partly because such standards are missing, 2024 saw multiple cases across Web2 and Web3 in which models and compute markets failed to accurately account for the quality and quantity of their computation. Users were left to run their own comparative model benchmarks and to rate-limit these compute markets as a form of proof of work in order to audit the real performance of these AI layers.
Looking ahead to 2025, the intersection of cryptography and AI is expected to deliver breakthroughs in verifiability, giving it an edge over traditional AI. For ordinary users, the ability to fairly compare every defined aspect of a model's or compute cluster's output is crucial for auditing and assessing system performance.
9. Probabilistic Privacy Primitives
In 'The Promise and Challenges of Crypto + AI Applications', Vitalik points out a unique challenge in connecting cryptocurrency and AI: 'In cryptography, open source is the only way to achieve true security, but in AI, the openness of a model (and even its training data) greatly increases its exposure to adversarial machine learning attacks.'
While privacy is not a new area of blockchain research, the rapid development of AI is accelerating the research and application of cryptographic primitives that support privacy. Significant progress has been made in privacy-enhancing technologies in 2024, including Zero-Knowledge Proofs (ZK), Fully Homomorphic Encryption (FHE), Trusted Execution Environments (TEEs), and Multi-Party Computation (MPC), which are used in general application scenarios such as private sharing states for encrypted data computation. At the same time, centralized AI giants like Nvidia and Apple are also using proprietary TEEs for federated learning and private AI inference, ensuring privacy while maintaining hardware, firmware, and model consistency across systems.
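Of these primitives, MPC is the easiest to illustrate without special hardware. Below is a minimal sketch of additive secret sharing, the building block behind many MPC protocols (toy parameters, not production-grade): each party holds a random-looking share, yet the parties can jointly compute a sum without any of them seeing the inputs.

```python
import random

P = 2**61 - 1  # a prime modulus for arithmetic secret sharing

def share(secret, n_parties):
    """Additively secret-share `secret` among n parties: any n-1 shares
    look uniformly random; all n shares sum to the secret mod P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Two private inputs are shared among 3 parties; each party adds its
# two shares locally, so the sum is computed without any single party
# ever learning either input.
a_shares = share(123, 3)
b_shares = share(456, 3)
sum_shares = [(x + y) % P for x, y in zip(a_shares, b_shares)]
assert reconstruct(sum_shares) == 579
```

Addition comes for free in this scheme; multiplication requires extra interaction between the parties, which is where real MPC protocols earn their complexity.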
Building on these developments, the industry is closely watching progress in privacy-preserving technologies over probabilistic state transitions, and how they can accelerate the practical deployment of decentralized AI applications on heterogeneous systems, from decentralized private inference to encrypted data storage/access pipelines to fully sovereign execution environments.
Apple's AI Technology Stack and Nvidia's H100 Graphics Processor
10. Agent Intent and Next-Generation User Transaction Interfaces
Over the past 12-16 months, definitions of concepts such as intents, agent behavior, agent intents, solvers, and agent solvers have remained blurry, as has how they differ from the conventional 'bot' development of recent years. Even so, AI agents autonomously executing on-chain transactions remain one of the applications closest to realization.
In the next 12 months, the industry expects to see more complex language systems combined with different data types and neural network architectures, advancing the overall design space. This raises several key questions:
Will agents use existing on-chain transaction systems, or develop their own tools and methods?
Will large language models continue to serve as the backend for these agent transaction systems, or will entirely new systems emerge?
At the interface level, will users begin to use natural language for transactions?
Will the classic 'wallet as browser' concept eventually be realized?
The answers to these questions will profoundly impact the future direction of cryptocurrency transactions. As AI technology advances, agent systems may become more intelligent and autonomous, better able to understand and execute user intentions.