ChainCatcher reports that, commenting on recent AGI-related product developments in the AI field, Ethereum founder Vitalik Buterin stated on X: "My definition of AGI (Artificial General Intelligence) is: AGI is AI powerful enough that, if one day all humans suddenly disappeared and the AI were uploaded into robot bodies, it would be able to independently continue civilization.

Clearly, this is a very hard definition to measure, but I think it captures the core of the intuitive distinction many people draw between 'the AI we are used to' and 'AGI': the shift from a tool that constantly depends on human input to a self-sufficient form of life. ASI (Artificial Superintelligence) is another matter entirely; my definition is the point at which humans no longer add value by being in the loop (as with board games, we only reached this point in the past decade).

Yes, ASI scares me; even AGI as I define it scares me, because it carries obvious risks of loss of control. I support focusing our work on building intelligence-augmentation tools for humans rather than creating superintelligent life forms."