CoinVoice reports that, commenting on OpenAI and recent AGI-related product developments in the AI field, Ethereum founder Vitalik Buterin posted on the X platform: "My definition of AGI (Artificial General Intelligence) is: AGI is an AI powerful enough that, if all humans suddenly disappeared one day and the AI were uploaded into robotic bodies, it would be able to independently continue civilization.
Clearly, this definition is very difficult to measure, but I think it captures the core of the intuitive distinction many people draw between 'the AI we are used to' and 'AGI': the shift from a tool that continually relies on human input to a self-sustaining form of life. ASI (Artificial Superintelligence) is a completely different matter: my definition is the point at which humans in the loop no longer add value to productivity (as in board games, where we in fact reached this point only in the last decade).
Yes, ASI scares me; even AGI as I define it scares me, because it brings obvious risks of loss of control. I support focusing our efforts on building intelligence-augmenting tools for humans rather than creating superintelligent forms of life." [Original link]