The cost of training artificial intelligence (AI) models is skyrocketing, and projections suggest it will climb far higher over the next few years. Dario Amodei, CEO of the AI startup Anthropic, highlighted these rising expenses during a recent episode of the “In Good Company” podcast.


Current advanced AI models, such as GPT-4, the model behind ChatGPT, cost around $100 million to train. But according to Amodei, that figure could rise to between $10 billion and $100 billion within the next three years.

Artificial general intelligence development drives costs higher

The sharp rise stems from the shift from generative AI, such as ChatGPT, toward artificial general intelligence (AGI). AGI research aims to build systems that can acquire, comprehend, and apply knowledge much as the human mind does.

“I think if we go to $10 or $100 billion, and I think that will happen in 2025, 2026, maybe 2027… then I think there is a good chance that by that time we’ll be able to get models that are better than most humans at most things,” Amodei said.

According to Tom’s Hardware, the infrastructure behind these models is already massive. Training ChatGPT, for instance, required more than 30,000 GPUs, and Nvidia’s B200 AI chips cost between $30,000 and $40,000 each. This hardware investment is one of the main drivers of the rising expense.
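For a rough sense of scale, the back-of-the-envelope calculation below multiplies those two figures together. It assumes, purely hypothetically, that every one of the roughly 30,000 GPUs were priced like a B200, which was not the chip actually used to train ChatGPT.

```python
# Back-of-the-envelope GPU cost implied by the figures above.
# Hypothetical assumption: all ~30,000 GPUs priced at B200 rates.
gpu_count = 30_000
price_low, price_high = 30_000, 40_000  # USD per chip

low_total = gpu_count * price_low / 1e9   # in billions of dollars
high_total = gpu_count * price_high / 1e9

print(f"Estimated chip spend: ${low_total:.1f}B to ${high_total:.1f}B")
```

On those assumptions, the chips alone would cost roughly $0.9 billion to $1.2 billion, before accounting for networking, data centers, or electricity.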

Computational resources drive up AI training costs

There are several reasons why the cost of training AI is rising. The main one is the immense amount of computational resources required: as deep learning models grow, they demand powerful GPUs and other specially designed hardware. Over 3.8 million GPUs were shipped to data centers in 2023, which demonstrates the scale of the infrastructure involved.


Another important factor is energy consumption. The GPUs sold last year alone would draw enough power to supply 1.3 million homes. This not only drives up costs for businesses but also raises environmental concerns. According to a recent report from Google, the company’s emissions rose by nearly 50% over four years, primarily because of the energy required for AI training.

Tech giants invest heavily in AI infrastructure

Moreover, leading technology firms are pouring money into AI infrastructure. Elon Musk, for instance, wants to buy 300,000 of Nvidia’s state-of-the-art AI chips, while Microsoft and OpenAI are reportedly working on a $100 billion data center dedicated to AI development.


Despite these rising costs, there are efforts to make AI training more efficient. Google DeepMind recently presented a technique called joint example selection (JEST), which it claims reduces the number of training iterations by a factor of 13 and the computation required by a factor of 10, cutting both the resources and the time needed.
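To give a sense of the selection principle behind data-curation methods of this kind, the sketch below shows a heavily simplified, hypothetical version of learnability-based batch selection. It is not DeepMind’s published JEST algorithm: the learner and reference losses and the top-k rule are illustrative assumptions, meant only to show the idea of preferring examples the current model still finds hard but a well-trained reference model finds easy.

```python
# Simplified, hypothetical sketch of learnability-based data selection,
# the general idea behind batch-curation methods such as JEST.
# Not DeepMind's published algorithm; losses here are randomly generated
# stand-ins for real per-example model losses.
import numpy as np

rng = np.random.default_rng(0)

# Per-example losses on a large candidate "super-batch":
# one set from the model being trained (the learner), one from a
# smaller pretrained reference model.
learner_loss = rng.uniform(0.5, 3.0, size=1024)
reference_loss = rng.uniform(0.5, 3.0, size=1024)

def select_subbatch(learner_loss, reference_loss, k=256):
    """Keep the k examples with the highest 'learnability' score:
    high loss for the learner (not yet mastered) but low loss for the
    reference model (learnable in principle)."""
    learnability = learner_loss - reference_loss
    return np.argsort(learnability)[-k:]

selected = select_subbatch(learner_loss, reference_loss)
print(f"Selected {selected.size} of {learner_loss.size} candidate examples")
```

Training only on the selected sub-batch, rather than the full candidate pool, is what allows this family of methods to reach a target quality in fewer iterations.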

However, even with these advances, the overall trend points toward higher costs as the industry pushes toward AGI. Moving from generative AI to AGI requires models that can interpret large data sets, learn from them, anticipate different situations, and solve problems that call for critical thinking.

Cryptopolitan Reporting by Brenda Kanana