News from December 21: Commenting on recent AGI-related product developments from OpenAI and the broader AI field, Ethereum founder Vitalik Buterin wrote on the X platform: "My definition of AGI (Artificial General Intelligence) is a sufficiently powerful artificial intelligence such that, if all humans suddenly disappeared one day and the AI were uploaded into robotic bodies, it would be able to independently continue civilization. Obviously this is a very hard definition to measure, but I think it captures the core of the intuitive distinction in many people's minds between 'the AI we are used to' and 'AGI'. It marks the transition from a tool that constantly depends on human input to a self-sufficient form of life. ASI (Artificial Superintelligence) is a different matter entirely: my definition is the point at which humans in the loop no longer add value to production (as in board games, where we only actually reached this stage within the past decade). Yes, ASI scares me, and even AGI as I have defined it scares me, because it carries an obvious risk of loss of control. I favor focusing our work on building intelligence-augmenting tools for humans rather than on creating superintelligent forms of life."