As AI data centers consume an ever-larger share of global electricity — with some projections suggesting AI could account for 10% of global power consumption by 2030 — researchers have been racing to find fundamentally more efficient approaches to AI computation. A new breakthrough, published on April 23, 2026, offers one of the most promising answers yet: a neuromorphic chip that mimics the energy efficiency of biological neurons and reduces AI energy use by up to 70%.

The chip uses artificial neurons that fire in patterns closely resembling biological brain cells. Unlike conventional digital chips, which process information using binary logic gates that are either fully on or fully off, neuromorphic chips use analog circuits that can represent a continuous range of values — much like the graded potentials of biological neurons. This approach is inherently more energy-efficient for the kinds of pattern recognition and inference tasks that dominate AI workloads.
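The contrast between all-or-nothing logic gates and graded, spike-emitting neurons can be illustrated with a leaky integrate-and-fire (LIF) model. This is a textbook simplification, not the published chip's actual circuit: the membrane potential is a continuous value that integrates input over time, and a discrete spike fires only when a threshold is crossed.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch. The membrane
# potential is a graded, analog-like quantity; the neuron emits a
# discrete spike only when that potential crosses a threshold.

def lif_step(v, i_in, leak=0.9, threshold=1.0):
    """One timestep: apply leak, integrate input, spike if above threshold."""
    v = leak * v + i_in          # graded membrane potential
    if v >= threshold:
        return 0.0, 1            # reset potential, emit a spike
    return v, 0                  # sub-threshold: no spike this step

v, spikes = 0.0, []
for i_in in [0.3, 0.3, 0.3, 0.3, 0.0, 0.6, 0.6]:
    v, s = lif_step(v, i_in)
    spikes.append(s)
print(spikes)   # → [0, 0, 0, 1, 0, 0, 1]
```

Note how several weak inputs accumulate before a spike fires, and how the potential decays when input stops: information is carried in spike timing rather than in a binary on/off state.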

How Neuromorphic Computing Works

The fundamental insight behind neuromorphic computing is that the human brain — which can perform sophisticated reasoning on roughly 20 watts of power — is vastly more energy-efficient than any silicon chip. The brain achieves this through sparse activation, local learning, and event-driven computation. The new chip incorporates all three principles, implemented using printed organic semiconductor devices that are both cheaper and more flexible than conventional silicon fabrication.
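The event-driven principle can be sketched in a few lines. In the toy model below (illustrative only; the function name, weights, and event list are invented for the example), downstream work is performed only when an input neuron actually spikes, so compute scales with the number of events rather than with the number of neurons:

```python
# Event-driven sketch: work happens only when a spike event arrives,
# so the operation count tracks activity (sparse), not network size.

def event_driven_output(spike_events, weights):
    """Accumulate weighted input only at spike times (sparse activation)."""
    total, ops = 0.0, 0
    for neuron_id in spike_events:   # iterate over events, not all neurons
        total += weights[neuron_id]
        ops += 1
    return total, ops

weights = {0: 0.5, 1: -0.2, 2: 0.8, 3: 0.1}
# Only neurons 0 and 2 fired: 2 operations instead of 4 dense multiply-adds.
total, ops = event_driven_output([0, 2], weights)
print(total, ops)
```

A conventional dense layer would multiply every weight every cycle regardless of activity; here, silent neurons cost nothing, which is the core of the energy argument.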

[Figure: AI Chip Energy Efficiency Comparison. Relative energy consumption for equivalent AI inference workloads across the NVIDIA H100, NVIDIA B200, Google TPU v5, Intel Gaudi 3, and the new neuromorphic chip. The new neuromorphic chip achieves approximately 70% lower energy use than the NVIDIA H100 baseline.]

The 70% energy reduction figure applies to inference workloads — the process of running a trained AI model to generate predictions or responses. Training remains a challenge for neuromorphic hardware, but inference is where most of the energy in deployed AI systems is consumed, making this the more commercially significant target. The UK Parliament's Science and Technology Committee has opened an inquiry into low-energy computing approaches, citing this research as one of several promising directions.
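A back-of-envelope calculation shows why targeting inference matters. The inference share used below is an assumed illustrative figure, not a number from the article:

```python
# Illustrative arithmetic: if inference accounts for ~80% of a deployed
# system's AI energy (assumed figure) and the chip cuts inference energy
# by 70% (the reported figure), the overall energy reduction is:

inference_share = 0.80     # assumed fraction of energy spent on inference
inference_saving = 0.70    # reduction reported for inference workloads

overall_reduction = inference_share * inference_saving
print(f"{overall_reduction:.0%}")   # → 56%
```

Even with training untouched, most of a deployment's energy bill falls, which is why inference is described as the more commercially significant target.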