It is amazing that @hyperfy_io, the new framework for AI Agent 3D worlds praised by @shawmakesmagic, has been speed-run to nearly a $200M market cap. It is the opening lineup of the #ai16z family bucket. Why? The logic is actually quite simple:
1) Hyperfy is building a "MetaAIverse", an AI Agent metaverse with a grand vision. If the Metaverse that emerged after the last NFT craze was too conceptual, an AI-enabled Metaverse may leave far more room for imagination.
That is because the Metaverse of the past was hobbled by the difficulty of content production: all 3D modeling relied on third-party platforms. Development costs were high (turning a creative idea into 3D models, materials, animation, scene composition, and so on), and the resulting experience was often poor enough to sink projects.
The logic of the metaverse under the AI Agent trend is completely different: AI automatically generates 3D content and can take commands in real time to drive intelligent scene interactions, which could yield a truly personalized, free-form virtual world without preset, fixed NPCs.
From a narrative perspective, the project does have plenty of room for imagination. The pity is that the market cap was pushed this high right at the opening, which puts it beyond what technical logic alone can justify. I therefore can't give any investment advice; DYOR.
2) From the perspective of business logic, however, Hyperfy belongs to the framework-and-standards category, directly targeting $GAME in the #Virtual ecosystem. After all, Virtual attracted a large number of developers through its GAME framework. Clearly, by bringing Hyperfy into the family bucket, Shaw intends to foster a framework that rivals or even surpasses GAME and further consolidates ai16z's moat, which is indeed a strategic consideration.
Hyperfy's technical architecture is not hard to understand. It is built on Three.js, using WebGL to render 3D scenes on the GPU. AI can generate 3D scenes from text descriptions, adjust scene layouts and details in real time, and NPCs can even perceive user behavior and respond in a personalized way. Most importantly, the whole complex workflow renders in real time in the browser: a complex backend paired with a very low barrier on the front end.
To gauge the actual technical maturity, visit Hyperfy's official website and try it yourself. I don't think the current experience is all that impressive yet, but the room for imagination is enormous.
Shaw has spared no effort promoting it, aiming squarely at Virtual's GAME framework, which directly benefits the entire ai16z commercial IP. (If anchored to GAME's valuation, it is not surprising the price was quickly pushed to this height.)
3) Imagine a future in which the AI-agent-driven Metaverse matures; the application scenarios multiply. For example, users could feed multimodal prompts such as text, sketches, and voice to AI agents and get 3D models and animations back instantly; computer-vision systems could automatically recognize gestures and even facial expressions and respond accordingly...
Building on this, AI Agents could power simulated 3D educational environments, overhaul the interactive experience of 3D games, build smart exhibition halls and AI virtual conference rooms, and so on.
You see, the beautiful scenarios the market imagined during the Metaverse wave are finally being carried forward into the AI Agent era, with even greater imaginative space and feasibility.
How can it not make people excited?