OpenAI co-founder Ilya Sutskever recently gave a lecture at the Neural Information Processing Systems (NeurIPS) 2024 conference in Vancouver, Canada, arguing that the age of artificial intelligence pre-training is ending and forecasting the rise of an AI superintelligence.
According to Sutskever, growth in computing power through better hardware, software, and machine-learning algorithms is outpacing the amount of data available for AI model training. The AI researcher likened data to fossil fuels, which will eventually run out. Sutskever said:
"Data is not growing because we have but one internet. You could even say that data is the fossil fuel of AI. It was created somehow, and now we use it, and we've achieved peak data, and there will be no more — we have to deal with the data that we have."
The OpenAI co-founder predicted that agentic AI, synthetic data, and inference-time computing are the next evolutions of artificial intelligence that will eventually give rise to an AI superintelligence.
Charts comparing compute power and dataset size for AI pre-training. Source: TheAIGRID, Ilya Sutskever
AI agents take the crypto world by storm
AI agents go beyond current chatbot models by making decisions without human input. They have become a popular narrative in the crypto space with the rise of AI memecoins and large language models (LLMs) such as Truth Terminal.
Truth Terminal went viral after the LLM began to promote a memecoin called Goatseus Maximus (GOAT), which eventually hit a market capitalization of $1 billion — attracting attention from retail investors and venture capitalists.
GOAT token market information. Source: CoinMarketCap
Google's DeepMind artificial intelligence laboratory unveiled Gemini 2.0, an AI model designed to power AI agents.
According to Google, agents built with the Gemini 2.0 framework will be able to assist in complex tasks such as coordinating actions across websites and performing logical reasoning.
Advancements in AI agents that can act and reason independently will lay the groundwork for AI to move past data hallucinations.
AI hallucinations stem from incorrect data sets, a problem compounded as AI pre-training increasingly relies on output from older LLMs to train newer ones, which degrades performance over time.