Incredible new research in the field of “photonic computing” could have breakthrough implications for the development of human-level artificial intelligence.
Scientists in China recently developed an artificial intelligence training chip that uses light, rather than electricity, to perform its computations. According to the team, it's far more energy-efficient than the most popular AI chips on the market.
Meanwhile, an unrelated team of researchers out of Oxford demonstrated that similar light-based computing techniques can be performed using ordinary light sources rather than high-powered lasers.
These two breakthroughs represent a potential fork in the road for the development of artificial general intelligence (AGI), also known as “human-level AI.”
Artificial General Intelligence
AGI isn’t a scientific term. It’s a purely theoretical idea that basically means “a machine smart enough to do anything an average human could, given the same resources.”
Scientists are exploring numerous paths to AGI, with generative pre-trained transformers (GPTs) being one of the most popular. However, some researchers argue that GPTs are a dead end on the path to AGI, while still others contend that imitating the human brain will require something more powerful than classical computers.
Photonic computing
The use of light to perform computations has been around since the 1960s. The technique, often referred to as optical computing, could one day replace electronic computing, according to physicists working in the field, because moving photons through a circuit dissipates far less energy than pushing electrons through resistive wires.
Now that a team in China has developed a photonic computer chip expressly for training AI models, and a separate team in the United Kingdom has demonstrated photonic computing with ordinary light, AI researchers appear to have new options on the table.
Classical or quantum?
To approximate human thinking, AI developers continue to scale models such as GPT-4o in the hope that one day they'll be large enough to imitate the web of connections among the roughly 100 billion neurons and 1,000 trillion synapses in our brains.
But some research suggests our brains work more like quantum computers. If that's true, a classical, binary AI model would theoretically need orders of magnitude more artificial neurons and synapses than a human brain has in order to come close to approximating its complexity.
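To put that scaling gap in perspective, here's a minimal back-of-envelope sketch in Python. The brain figures are the ones cited above; the model's parameter count and the quantum "penalty" factor are purely illustrative assumptions, since frontier labs don't publish exact model sizes and nobody knows what such a penalty would actually be.

```python
# Back-of-envelope comparison: hypothetical model size vs. the human brain.
# Brain estimates match the figures cited in the article; the model size
# and the quantum penalty are illustrative assumptions, not real data.

BRAIN_NEURONS = 100e9   # ~100 billion neurons (cited above)
BRAIN_SYNAPSES = 1e15   # ~1,000 trillion synapses (cited above)

MODEL_PARAMS = 2e12     # hypothetical 2-trillion-parameter model

# Treating one parameter, very loosely, as one artificial synapse:
gap = BRAIN_SYNAPSES / MODEL_PARAMS
print(f"Synapse-to-parameter gap: {gap:,.0f}x")  # -> 500x

# If the brain's connections behave quantum-mechanically, a classical model
# might need orders of magnitude more parameters still. Assume a purely
# illustrative penalty of three orders of magnitude:
QUANTUM_PENALTY = 1e3
params_needed = BRAIN_SYNAPSES * QUANTUM_PENALTY
print(f"Parameters needed under that assumption: {params_needed:.0e}")  # -> 1e+18
```

Even under these loose assumptions, the takeaway holds: every additional order of magnitude has to be paid for in hardware and energy, which is exactly where photonic efficiency gains would matter.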
This leaves researchers with two possibilities: they can either maximize their binary compute or start over with quantum-enabled hardware and training solutions.
If photonic AI chips prove to be a viable, energy-efficient alternative to the status quo, their efficiency alone could push GPT-style models beyond what would otherwise be feasible.
And when it comes to interfacing with any future quantum AI systems, physics favors photons: light signals can propagate faster than electrical signals moving through wires.
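That latency claim can be sanity-checked with textbook numbers, with the caveat that these are rough approximations: light in free space travels at c, electrical signals in a typical copper transmission line travel at roughly 0.6–0.8c (0.65 below is an assumed midpoint), and guided light in fiber or on-chip waveguides is itself slowed to roughly two-thirds of c.

```python
# Rough propagation-delay comparison over a 10 cm link.
# Velocity figures are textbook approximations, not measurements.

C = 3.0e8        # speed of light in vacuum, m/s
DISTANCE = 0.10  # 10 cm link, m

delay_light = DISTANCE / C            # free-space optical signal
delay_copper = DISTANCE / (0.65 * C)  # assumed copper velocity factor

print(f"Light (free space): {delay_light * 1e12:.0f} ps")  # ~333 ps
print(f"Copper line:        {delay_copper * 1e12:.0f} ps") # ~513 ps
```

In guided media the gap narrows considerably, so in practice photonics' bigger wins tend to be bandwidth and energy rather than raw signal speed, but the direction of the comparison favors light.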