BlockBeats News, December 21: Ethereum co-founder Vitalik Buterin posted on social media, "My definition of AGI (Artificial General Intelligence) is:

AGI is an AI powerful enough that, if all humans suddenly disappeared one day and the AI were uploaded into robot bodies, it could independently carry on civilization.

Admittedly, this definition is very hard to measure, but I think it captures the intuitive distinction many people draw between 'the AI we are used to' and 'AGI'. It marks the transition from a tool that continually depends on human input to a self-sufficient form of life.

ASI (Artificial Superintelligence) is something else entirely. My definition is the point at which humans in the loop no longer add value to production (as with board games, a stage we only reached within the past decade).

Yes, ASI frightens me; even AGI as I define it frightens me, because it carries an obvious risk of loss of control. I support focusing our work on building intelligence-augmentation tools for humans rather than creating superintelligent forms of life."