CoinVoice recently learned that, commenting on OpenAI and recent AGI-related product developments in the AI field, Ethereum founder Vitalik Buterin stated on the X platform: "My definition of AGI (Artificial General Intelligence) is: AGI is an AI powerful enough that, if one day all humans suddenly disappeared and the AI were uploaded into robot bodies, it could independently carry civilization forward.
Admittedly, this is a very difficult definition to measure, but I think it captures the core of the intuitive distinction many people draw between 'the AI we are used to' and 'AGI': a shift from a tool that continually depends on human input to a self-sustaining form of life. ASI (Artificial Superintelligence) is a different matter entirely; my definition is the point at which humans in the loop no longer add value to productivity (as in board games, where we only reached this point within the last decade).
Yes, ASI scares me; even AGI as I define it scares me, because it brings obvious risks of losing control. I support focusing our work on building intelligence-augmenting tools for humans rather than on creating superintelligent life forms." [Original link]