Author: 0xJeff
Compiled by: Deep Tide TechFlow
The tokenization of [_______] has long been a compelling concept. The idea sounds simple, yet each new thing that gets tokenized tends to attract attention quickly.
Here are some tokenization trends we have observed, their development history, and possible future directions:
Tokenization of assets
Asset tokenization is one of the earliest tokenization trends to emerge.
Bitcoin pioneered the first decentralized, secure, and transparent ledger system, laying the foundation for digital representation of assets.
Subsequently, the launch of Ethereum in 2015 further propelled this trend. Ethereum introduced smart contracts, allowing assets to be programmed, enabling more efficient management and trading of real estate, artwork, or decentralized finance (DeFi) through tokenization.
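For intuition on what "programmable" means here, below is a minimal, hypothetical sketch in plain Python (not an actual smart-contract language) of the core ledger logic an ERC-20-style token encodes on-chain: who holds how many units of an asset, and the rules under which those units move. The class and example are illustrative only.

```python
# Hypothetical sketch: the core ledger logic an ERC-20-style token contract
# encodes on-chain, written in plain Python purely for illustration.

class TokenizedAsset:
    def __init__(self, name: str, total_supply: int, issuer: str):
        self.name = name
        # The entire supply starts in the issuer's balance.
        self.balances = {issuer: total_supply}

    def balance_of(self, account: str) -> int:
        return self.balances.get(account, 0)

    def transfer(self, sender: str, recipient: str, amount: int) -> None:
        # The "programmable" part: rules enforced by code, not by an intermediary.
        if amount <= 0 or self.balance_of(sender) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balance_of(recipient) + amount

# Example: fractionalizing a property deed into 1,000 shares.
deed = TokenizedAsset("PropertyShare", 1_000, issuer="alice")
deed.transfer("alice", "bob", 250)
print(deed.balance_of("bob"))  # 250
```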
As of now, the fully diluted valuation (FDV) of $ETH has soared to $470 billion, underscoring the immense impact of tokenization in the asset realm.
Tokenization of art (NFTs)
The rise of NFTs has expanded the application of tokenization into the art domain.
In 2017, projects like CryptoPunks and CryptoKitties brought NFTs into the public eye.
In 2021, NFT trading volume surged to $13 billion, and NFTs became the dominant format for digital art and collectibles.
Well-known collections such as CryptoPunks, BAYC (Bored Ape Yacht Club), and Art Blocks saw individual pieces sell for millions of dollars at the 2021 market peak.
Tokenization of yield
Yield tokenization is another significant breakthrough in the field of tokenization.
In 2021, @pendle_fi first proposed the idea of tokenizing future yield.
Through the Pendle platform, users can trade fixed and variable yield, bringing greater flexibility and liquidity to the DeFi market.
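For intuition, here is a simplified, hypothetical sketch of the splitting idea behind yield tokenization: a yield-bearing position is separated into a principal claim (the fixed side) and a yield claim (the variable side), and the two can be priced and traded independently. This is a toy model in Python, not Pendle's actual contract logic, and the rates are made up.

```python
# Toy model of yield tokenization: a yield-bearing deposit is split into a
# principal token (PT), redeemable 1:1 at maturity, and a yield token (YT),
# which collects whatever yield accrues until maturity.

def split_position(deposit: float) -> tuple[float, float]:
    pt = deposit  # principal claim, redeemable at maturity
    yt = deposit  # yield claim on the same notional
    return pt, yt

def pt_fair_value(notional: float, implied_apy: float, years_to_maturity: float) -> float:
    # Buying PT at a discount locks in a fixed rate until maturity.
    return notional / (1 + implied_apy) ** years_to_maturity

def yt_payoff(notional: float, realized_apy: float, years_to_maturity: float) -> float:
    # YT holders receive whatever variable yield is actually generated.
    return notional * ((1 + realized_apy) ** years_to_maturity - 1)

pt, yt = split_position(1_000.0)
print(pt_fair_value(pt, implied_apy=0.05, years_to_maturity=1.0))  # ~952.38, i.e. ~5% fixed
print(yt_payoff(yt, realized_apy=0.07, years_to_maturity=1.0))     # 70.0 of variable yield
```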
Pendle grew rapidly in 2023 as liquid staking tokens (LSTs) gained popularity, and again after its points market launched in early 2024.
As of now, the fully diluted valuation (FDV) of $PENDLE has reached $1.6 billion, showcasing the market potential of yield tokenization.
Tokenization of AI agents
Today, the tokenization of AI agents is becoming a new trend.
@virtuals_io launched a platform where users can create AI agents and tokenize them. This approach not only makes the development of AI agents more flexible but also effectively reduces development costs.
Tokenized AI agents took off in October 2024, when Virtuals built the first market focused on agent ownership, letting users hold and trade rights to AI agents in token form.
To date, the fully diluted valuation (FDV) of $VIRTUAL has reached $2.5 billion.
Discovering trends
Across all of these categories (assets, art, yield, AI agents), a clear pattern emerges: the pioneer in each field tends to win market acceptance quickly, and that acceptance shows up as significant price moves.
So, what will the next direction of tokenization be?
Here are some trends and ideas I am focusing on:
Tokenization of data
@withvana is actively exploring DataDAOs and Data Liquidity Pools (DLPs).
Users can contribute data to these pools while retaining ownership of it, and they earn rewards based on the quality of the data they contribute.
Essentially, this model transforms data into a highly liquid and tradable asset.
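As a rough illustration of how such a pool might turn data quality into rewards, here is a hypothetical Python sketch of a pro-rata, quality-weighted payout. The scoring and the split are assumptions made for illustration, not Vana's actual mechanism.

```python
# Hypothetical sketch of quality-weighted rewards in a data liquidity pool.
# The quality scores and reward split are illustrative assumptions.

def distribute_rewards(contributions: dict[str, float], epoch_rewards: float) -> dict[str, float]:
    """Split an epoch's reward budget pro rata to each contributor's quality score."""
    total_quality = sum(contributions.values())
    if total_quality == 0:
        return {user: 0.0 for user in contributions}
    return {
        user: epoch_rewards * quality / total_quality
        for user, quality in contributions.items()
    }

# Example: three contributors with quality scores assigned by the pool's validators.
scores = {"alice": 0.9, "bob": 0.5, "carol": 0.1}
print(distribute_rewards(scores, epoch_rewards=1_500.0))
# {'alice': 900.0, 'bob': 500.0, 'carol': 100.0}
```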
$VANA is set to launch officially on December 16 (with a Binance listing). Its fully diluted valuation (FDV) is not yet known, but the concept of tokenized data ownership could have far-reaching impact.
Tokenization of attention
@_kaitoai is attempting to tokenize attention and bring it into the world of Web3. Its platform, mindshare dashboards, and the recently launched Yap-to-Earn feature all aim to show how attention can be generated and captured (though due to Twitter's rate limiting, I have not been able to log in successfully; can anyone help resolve this?).
Their Yapper Leaderboard encourages thought leaders to 'Yap' more to earn Yap points, ultimately leading to an airdrop of $KAITO tokens.
Simply put, Yap equals attention, and attention converts to $KAITO.
This is an interesting attempt at how Web3 redefines user engagement.
Tokenization of AI applications
This trend seems to be a natural extension of the tokenization of AI agents.
With the popularity of tools like @Replit and the rapid development of the agent ecosystem, we are gradually approaching the creation of personalized software.
Tokenized AI applications allow users to participate in the initial stages of development and own a portion of the application's future revenue.
Key players in this field:
@alchemistAIapp and @myshell_ai are two leading platforms.
Both give creators the tools to build income-generating AI applications, offering practical and scalable solutions.
Myshell goes a step further, allowing investors to invest directly in these AI applications and receive a share of the income they generate. This model not only funds development but also creates a symbiotic relationship between creators and investors.
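To make the revenue-share idea concrete, here is a hypothetical Python sketch of one epoch's income being split between a creator and token holders pro rata to their holdings. The creator cut and the figures are illustrative assumptions, not Myshell's actual model.

```python
# Minimal sketch of a tokenized AI app's revenue share: income is split between
# the creator and token holders in proportion to their holdings.
# The 30% creator cut and the numbers are illustrative assumptions.

def share_revenue(revenue: float, creator_cut: float, holdings: dict[str, float]) -> dict[str, float]:
    payouts = {"creator": revenue * creator_cut}
    investor_pool = revenue - payouts["creator"]
    total_tokens = sum(holdings.values())
    for investor, tokens in holdings.items():
        payouts[investor] = investor_pool * tokens / total_tokens
    return payouts

# Example: an app earns 10,000 in an epoch; the creator keeps 30%,
# and two early investors split the rest by token weight.
print(share_revenue(10_000.0, creator_cut=0.30, holdings={"alice": 600, "bob": 400}))
# {'creator': 3000.0, 'alice': 4200.0, 'bob': 2800.0}
```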
Final thoughts
Tokenization trends consistently spark new waves of innovation and market adoption. Their appeal lies not just in the technology itself, but in how they bring people together and shift attention toward new opportunities.
What will be the next big tokenization trend? I can’t say for sure, but these ideas are undoubtedly worth paying attention to.
Disclaimer
This article is for informational and entertainment purposes only. The opinions expressed herein do not constitute investment advice or recommendations. Investors should conduct thorough due diligence before investing and make decisions based on their financial situation, investment objectives, and risk tolerance (factors not considered in this article). This article does not constitute an offer or solicitation to buy or sell any of the assets mentioned.