News Analysis

Extropic Claims 10,000x Energy Savings With New Probabilistic AI Chip

By Michelle Hawley
A new probabilistic computing approach from Extropic promises radically lower energy use for generative AI, challenging GPU-based AI infrastructure.

Key Takeaways

  • Extropic claims its new “probabilistic” hardware could run GenAI using far less energy than GPUs.
  • Simulations show ~10,000x energy savings with its new Denoising Thermodynamic Model (DTM).
  • Extropic said it plans to remove power constraints that limit AI scaling today.

AI Maxes Out the Power Grid   

The AI boom has come with a physical constraint most consumers never see: electricity.

Data centers across the world struggle to secure enough power to support AI training and inference. Three years ago, tech startup Extropic bet that energy — not chips, not data — would become the primary limit to AI scaling. In its latest announcement, the company says that bet has proven correct.

"With today’s technology, serving advanced models to everyone all the time would consume vastly more energy than humanity can produce."

- Extropic Officials

Rather than work on energy generation, which would require major infrastructure and government support, Extropic targeted the other side of the problem: making AI itself more energy-efficient.

Extropic Introduces First Scalable Probabilistic Computer

Modern AI is built on GPUs, a type of processor originally designed to render graphics. GPUs evolved into AI accelerators because they are good at matrix multiplication, the core mathematical operation behind neural networks. But GPUs are not energy efficient, and most of their power consumption goes into moving information around the chip, not the math itself.

Extropic claims to have designed an alternative: a new class of AI chip — a scalable probabilistic computer — built to sample directly from probability distributions instead of performing GPU-style matrix math.

Image: Extropic’s design for what it calls the world’s first scalable probabilistic computer

According to the company, its hardware:

  • Uses “orders of magnitude” less energy than GPUs
  • Performs AI tasks by sampling probability, not crunching large matrices
  • Was fabricated and tested in silicon
  • Runs a new kind of generative AI algorithm

This new device is called the Thermodynamic Sampling Unit (TSU).

How a TSU Works

TSUs function as probabilistic AI chips. Most AI chips today, including GPUs and TPUs, perform massive matrix multiplications to estimate probabilities and then sample from them. Extropic claims its hardware skips the matrix multiplication entirely and samples directly from complex distributions.

Key Claims About TSUs

Extropic states that TSUs:

  • Are built from large arrays of probabilistic cores
  • Sample from energy-based models (EBMs), a class of machine learning (ML) models
  • Use the Gibbs sampling algorithm to combine many simple probabilistic circuits into complex distributions
  • Minimize energy by keeping communication strictly local — circuits only interact with nearby neighbors

This last point is critical. Extropic argued that the biggest energy drain in GPUs is data movement. By designing hardware where communication is entirely local, the TSU architecture avoids expensive long-distance wiring and voltage changes within the chip.

In other words: TSUs are built to be physically, and therefore energetically, optimized for probability, not arithmetic.
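
To make the local-sampling idea concrete, here is a minimal Python sketch of Gibbs sampling over a grid of locally coupled binary units, a toy Ising-style energy-based model. It illustrates the general technique only and assumes nothing about Extropic's actual circuits; the one property it shares with the TSU description is that every update depends solely on a unit's nearest neighbors.

```python
import numpy as np

# Toy Ising-style energy-based model sampled with Gibbs updates.
# Every unit is resampled using ONLY its four nearest neighbors --
# the "strictly local communication" property described above.
# Illustrative sketch; not Extropic's hardware or algorithm.

def gibbs_sweep(spins, coupling, rng):
    """One full sweep: resample each +/-1 unit given its neighborhood."""
    n, m = spins.shape
    for i in range(n):
        for j in range(m):
            # Local field from the four nearest neighbors (wrap-around edges).
            field = coupling * (
                spins[(i - 1) % n, j] + spins[(i + 1) % n, j] +
                spins[i, (j - 1) % m] + spins[i, (j + 1) % m]
            )
            # Conditional probability that this unit is +1 given its neighbors.
            p_up = 1.0 / (1.0 + np.exp(-2.0 * field))
            spins[i, j] = 1 if rng.random() < p_up else -1
    return spins

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(32, 32))  # start from pure noise
for _ in range(100):                        # repeated sweeps move the grid
    spins = gibbs_sweep(spins, 0.5, rng)    # toward the model's distribution
```

On a TSU, by Extropic's description, each such conditional draw would be produced by a physical probabilistic circuit rather than computed arithmetically, which is where the claimed energy savings come from.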

How TSUs Compare to AI Chips

  • GPUs/TPUs: Deterministic math engines optimized for matrix multiplication
  • TSUs: Probabilistic chips that generate samples directly
  • pbits: Transistor-based probabilistic bits that fluctuate between 0 and 1
  • Goal: Deliver generative AI using far less energy than GPU-based systems

The Smallest Building Block: The pbit

At the core of the TSU is what Extropic calls a pbit.

  • A traditional digital bit is always a 1 or a 0
  • A pbit fluctuates randomly between 1 and 0
  • The probability of being in either state is programmable

This makes a pbit essentially a hardware random number generator.
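
As a rough software analogy — the real pbit is an analog transistor circuit, not code — a pbit behaves like a Bernoulli sampler with a programmable bias:

```python
import random

class PBit:
    """Toy model of a pbit: a bit that fluctuates between 0 and 1 with a
    programmable probability of landing in either state. Illustrative only;
    a real pbit is a transistor circuit, not software."""

    def __init__(self, p_one=0.5):
        self.p_one = p_one  # programmable probability of reading a 1

    def read(self):
        # Each read is an independent random draw, like sampling the
        # fluctuating hardware state at one instant.
        return 1 if random.random() < self.p_one else 0

bit = PBit(p_one=0.8)
samples = [bit.read() for _ in range(10_000)]
print(sum(samples) / len(samples))  # prints roughly 0.8, the programmed bias
```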

A single pbit is not very useful. But, as Extropic noted, neither is a single NAND gate. Combine enough of them, and you get a functioning computer. 

Extropic claims that:

  • Existing academic pbit designs were not commercially viable because they required exotic components
  • Extropic designed a pbit built entirely from transistors
  • Its pbits use orders of magnitude less energy to generate randomness
  • A hardware “proof of technology” has already validated the concept

Because pbits are small and energy-efficient, they can be packed tightly into a TSU. And because they are made from ordinary transistors, they can be integrated alongside standard computing circuitry.

A New Generative AI Model: The Denoising Thermodynamic Model

To show how their hardware can be used in real applications, Extropic also developed a new generative AI algorithm called the Denoising Thermodynamic Model (DTM).

DTMs are inspired by diffusion models, the same broad family used by image generators like Stable Diffusion. Like diffusion, a DTM starts with noise and iteratively transforms it into structured output.

However, Extropic states that DTMs are designed specifically for TSUs and are therefore far more energy-efficient.
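
As a loose illustration of that noise-to-structure loop — an assumed shape, not Extropic's published DTM algorithm — the sketch below reuses the local Gibbs sweep idea from earlier and anneals the coupling upward across stages, so a random grid gradually organizes itself:

```python
import numpy as np

# Hypothetical diffusion-style loop: begin with pure noise, then run a
# sequence of sampling stages whose coupling grows, so each stage "denoises"
# the previous stage's output a little more. Toy stand-in, not the DTM.

def gibbs_sweep(x, coupling, rng):
    n, m = x.shape
    for i in range(n):
        for j in range(m):
            field = coupling * (
                x[(i - 1) % n, j] + x[(i + 1) % n, j] +
                x[i, (j - 1) % m] + x[i, (j + 1) % m]
            )
            x[i, j] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * field)) else -1
    return x

rng = np.random.default_rng(0)
x = rng.choice([-1, 1], size=(32, 32))     # stage 0: pure noise
for coupling in np.linspace(0.1, 0.8, 8):  # later stages denoise harder
    x = gibbs_sweep(x, coupling, rng)      # structure gradually emerges
```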

Chart: According to Extropic, simulations of its first production-scale TSUs running DTMs could complete small-scale generative AI benchmarks using far less energy than GPUs.

According to Extropic:

  • Simulations of DTMs running on TSUs could be 10,000x more energy-efficient than modern algorithms running on GPUs
  • Results can be replicated using thrml, its open-source Python library

Why Extropic's Breakthrough Matters

Extropic framed the problem in simple terms: the world does not have enough power for unlimited AI.

The Bottleneck

  • Every major AI model increases compute requirements
  • Every increase in compute increases energy demand
  • Data centers are already struggling to secure power

If generative AI were served to billions of users continuously — at scale similar to email or search — today’s hardware could consume more energy than the world currently produces.

The Proposed Solution

Reduce the energy that AI computing consumes, rather than wait for new generation capacity, removing the energy ceiling preventing widespread, always-on AI.

The Real Test: From Research to Build-Out

Extropic said the “fundamental science is done” and the company has entered the build-out phase. To move from small prototypes to production-scale systems, it is hiring:

  • Mixed-signal integrated circuit designers
  • Hardware systems engineers
  • Probabilistic machine learning experts

The company’s XTR-0 development platform has already been beta-tested by early partners, though no partner names were disclosed in the announcement.

Key Takeaways for Enterprise Leaders

1. AI will hit physical energy limits sooner than compute limits.

Data centers cannot scale indefinitely, and electricity is the next bottleneck.

2. Extropic offers a new class of AI chip.

It does not replace GPUs through faster matrix math — TSUs avoid matrix math altogether by generating samples directly.

3. The company has demonstrated early, small-scale tests in hardware and simulation.

This includes a working pbit design and open-source algorithm replication.

4. The energy-efficiency claim — up to 10,000x — remains the centerpiece.

If validated at scale, it would represent a major shift in AI infrastructure economics.

What Comes Next in AI Computing 

As of today, Extropic’s achievements exist at prototype and simulation scale. The next step, building full production TSUs, will determine whether this becomes a foundational shift in AI computing or remains a promising research direction.

The company is confident in its trajectory:

"Once we succeed, energy constraints will no longer limit AI scaling."

- Extropic Officials

For now, Extropic is still in the early stages of turning its breakthrough into deployment. But the ambition is clear: rebuild computing from the ground up to match what AI actually is — probabilistic, not deterministic.

Frequently Asked Questions

Which AI workloads are TSUs designed for?

Generative AI is a primary target. Because models like diffusion systems rely on probability sampling, these workloads align naturally with the TSU’s architecture. Extropic demonstrated this through its Denoising Thermodynamic Model, which was built specifically for TSU hardware.

Is Extropic’s hardware commercially available?

No. Extropic has produced a hardware proof of technology and a development platform called XTR-0. It has been beta-tested by early partners but is not yet a commercial-scale system.

The startup is now hiring hardware and machine learning experts to scale TSUs into production-ready systems.

How can developers experiment with Extropic’s approach today?

Extropic released an open-source Python library called thrml, which simulates TSUs and allows developers to experiment with thermodynamic machine learning models today. The company also funded an independent replication of its research using thrml, and that replication is publicly available.

Will TSUs solve the AI energy crisis?

No single technology solves the AI energy crisis. Extropic positions TSUs as one half of the equation: reducing energy per AI workload. The other half, producing more power, remains the focus of companies building new energy infrastructure, such as nuclear-powered data centers. TSUs aim to make AI less dependent on massive increases in electricity supply.

About the Author
Michelle Hawley

Michelle Hawley is an experienced journalist who specializes in reporting on the impact of technology on society. As editorial director at Simpler Media Group, she oversees the day-to-day operations of VKTR, covering the world of enterprise AI and managing a network of contributing writers. She's also the host of CMSWire's CMO Circle and co-host of CMSWire's CX Decoded. With an MFA in creative writing and background in both news and marketing, she offers unique insights on the topics of tech disruption, corporate responsibility, changing AI legislation and more. She currently resides in Pennsylvania with her husband and two dogs.
