Running AI takes a staggering amount of electricity. Data centers powering systems like ChatGPT and Gemini are on track to consume over 1,000 terawatt-hours globally this year, roughly equal to Japan’s entire energy usage. A team of researchers at the University of Cambridge thinks they may have found a way to change that, and the answer came from the organ sitting inside your skull.
The researchers built a tiny electronic device that mimics the way synapses work in the human brain. Published in Science Advances, their work describes a memristor, a component that can store and process information in the same physical spot. The device is made from a modified form of hafnium oxide laced with strontium and titanium.
Why does that matter? In traditional computer chips, data has to travel back and forth between memory and the processor. It’s like having your kitchen in one building and your dining room in another. Every meal requires a commute. The brain doesn’t work that way. Neurons and synapses handle storage and processing together, right where the action happens, and they do it on roughly 20 watts of power. Your laptop charger draws more than that.
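The in-memory idea can be sketched in code. In a memristor array, each stored conductance doubles as a weight, and Ohm’s law does the multiplication on the spot: the current through a device is its conductance times the applied voltage, and currents sum along a wire. Here is a minimal, idealized simulation of that principle (the values and function are illustrative, not from the Cambridge paper, and real devices add noise and wire resistance):

```python
# Idealized memristor crossbar: stored conductances act as weights, so a
# matrix-vector product happens where the data lives (I = G * V), with
# no shuttling between separate memory and processor.
# All numbers below are hypothetical, for illustration only.

def crossbar_mac(conductances, voltages):
    """Column output currents: I_j = sum_i G[i][j] * V[i]."""
    n_rows = len(voltages)
    n_cols = len(conductances[0])
    return [sum(conductances[i][j] * voltages[i] for i in range(n_rows))
            for j in range(n_cols)]

# A 2x2 array: conductances in siemens, input voltages in volts
G = [[1e-9, 2e-9],
     [3e-9, 4e-9]]
V = [1.0, 0.5]
currents = crossbar_mac(G, V)  # the multiply-accumulate, done in place
```

The key point is that the "memory" (the conductance values) and the "processor" (the physics of current summation) are the same piece of hardware, which is exactly the commute the brain avoids.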
“Energy consumption is one of the key challenges in current AI hardware,” said lead author Dr. Babak Bakhit from Cambridge’s Department of Materials Science and Metallurgy. “To address that, you need devices with extremely low currents, excellent stability, outstanding uniformity across switching cycles and devices, and the ability to switch between many distinct states.”
What makes this particular memristor different from previous attempts is how it actually switches. Most memristors rely on tiny filaments that grow and break inside the material, a process that’s about as predictable as lightning striking the same tree twice. Bakhit’s device instead works through an interface mechanism, creating a junction between two layers where the energy barrier can be smoothly tuned up or down.
“Filamentary devices suffer from random behaviour,” Bakhit said. “But because our devices switch at the interface, they show outstanding uniformity from cycle to cycle and from device to device.”
The numbers back that up. The memristors endured more than 10,000 pulse-switching cycles with stable resistance states, and their binary states remained steady for over 500,000 seconds (that’s nearly six days of continuous operation). They also managed switching currents below 10 nanoamps and produced several hundred distinct conductance levels, meaning they can handle far more nuance than a simple on-off switch.
Perhaps more impressive is what the device can actually do. The Cambridge team demonstrated that their memristors reproduce several learning behaviors found in biological synapses: paired-pulse facilitation (getting stronger with repeated signals), paired-pulse depression (weakening with repeated signals), short-term plasticity, and spike-timing dependent plasticity. That last one is a mouthful, but it’s essentially how your brain decides which connections to strengthen based on timing, the fundamental mechanism behind learning and memory.
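To make the timing rule concrete, here is the textbook exponential model of spike-timing-dependent plasticity. This is the standard neuroscience formulation, not the paper’s device model, and the parameter values are invented for illustration: a connection strengthens when the presynaptic spike arrives just before the postsynaptic one, and weakens when the order is reversed.

```python
import math

# Textbook STDP rule: weight change depends on the timing gap between
# pre- and postsynaptic spikes. Parameters are illustrative only.
A_PLUS, A_MINUS, TAU = 0.1, 0.12, 20.0  # amplitudes and time constant (ms)

def stdp_delta_w(t_pre, t_post):
    """Weight change for one spike pair (spike times in ms)."""
    dt = t_post - t_pre
    if dt > 0:    # pre fires first -> potentiation (strengthen)
        return A_PLUS * math.exp(-dt / TAU)
    elif dt < 0:  # post fires first -> depression (weaken)
        return -A_MINUS * math.exp(dt / TAU)
    return 0.0

# Pre leading post by 5 ms strengthens; post leading by 5 ms weakens
dw_strengthen = stdp_delta_w(0.0, 5.0)  # positive
dw_weaken = stdp_delta_w(5.0, 0.0)      # negative
```

A device that reproduces this behavior in hardware can, in principle, adjust its own conductance based on signal timing alone, which is what makes it a candidate building block for chips that learn.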
“These are the properties you need if you want hardware that can learn and adapt, rather than just store bits,” Bakhit said.
The approach falls under a broader field called neuromorphic computing, which aims to build chips that process information the way brains do. The market for this technology is expected to reach $13.2 billion by 2028, driven largely by the fact that current AI systems are extraordinarily wasteful with electricity. A single ChatGPT query uses roughly ten times more energy than a Google search.
The Cambridge device is made using hafnium oxide, a material already common in semiconductor manufacturing, which means it wouldn’t require factories to completely retool. The material self-assembles at low temperatures, another practical advantage for potential mass production. The researchers estimate this approach could reduce energy consumption in AI hardware by as much as 70%.
There’s a catch, though. The technology still has a temperature sensitivity issue that needs solving before it can move from the lab to a factory floor. Bakhit was candid about the road ahead.
“I spent almost three years on this. There were a huge number of failures,” he said. “But at the end of November, we saw the first really good results. It’s still early days of course, but if we can solve the temperature issue, this technology could be game-changing because the energy consumption is so much lower and at the same time, the device performance is highly promising.”
The research was led by Bakhit under the supervision of Professor Judith MacManus-Driscoll at Cambridge’s Department of Materials Science and Metallurgy. The full paper, titled “HfO2-based memristive synapses with asymmetrically extended p-n heterointerfaces for highly energy-efficient neuromorphic hardware,” is available in Science Advances.
