Machine Learning Street Talk
July 21, 2025

Pushing compute to the limits of physics

In a mind-bending conversation, Guillaume Verdon, founder of Extropic and the philosophical force behind Effective Accelerationism (E/Acc), breaks down why the future of AI isn't about bigger Transformers but about a complete paradigm shift in hardware. This is a journey from quantum mechanics to the thermodynamic reality that governs all complex systems, including intelligence itself.

The Thermodynamic Gambit

  • "We have to... loosen our grip on electrons and hardware. We go from a very tight grip and yanking these signals around to kind of loosening our grip, letting there be fuzz and kind of gently guiding the signals."
  • "We're running probabilistic software on this stack that was made for determinism. That's highly inefficient."
  • Instead of fighting noise, thermodynamic computing harnesses it. Classical computers spend immense energy forcing transistors into a deterministic "on" or "off" state. Extropic’s approach embraces the natural, probabilistic fuzz of electrons, treating stochasticity as a computational resource to accelerate tasks like MCMC sampling.
  • This directly addresses a core paradox in modern AI: we use energy-guzzling deterministic hardware to run inherently probabilistic software like diffusion models and Transformers. Thermodynamic hardware natively speaks the language of probability, promising massive efficiency gains.

Hitting Moore's Wall

  • "When people say, 'to scale up intelligence, we need to start building nuclear power plants,' I always think, well, I run on a glass of water and a banana... You're a thermodynamic computer, right? That's the thesis."
  • "We can't scale with the current hardware... People thinking we're just going to scale Transformers and get to the moon... are just flat out wrong. It's provably wrong."
  • The current path of scaling AI is unsustainable. Verdon argues that continuing to scale today’s hardware will not only exhaust the power grid but eventually generate enough waste heat to literally cook the planet.
  • We are hitting a "thermal danger zone" where shrinking transistors makes them susceptible to random thermal fluctuations, rendering deterministic computation exponentially more costly. The solution isn’t better cooling; it’s a new type of physics-based compute.
  • The human brain is the ultimate proof of concept: a 20-watt supercomputer that demonstrates intelligence is fundamentally a thermodynamic process.
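The "exponentially more costly" point follows directly from the Boltzmann factor: the probability that a thermal fluctuation kicks a bit over the energy barrier holding it in its well scales as exp(-ΔE/kT), so each halving of the barrier multiplies the error rate by an astronomical factor. A minimal sketch (the barrier multiples are illustrative, not device figures from the episode):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K
kT = K_B * T

def flip_probability(barrier_joules):
    """Boltzmann factor: chance a thermal fluctuation hops the
    energy barrier holding a bit in its current state."""
    return math.exp(-barrier_joules / kT)

# Shrinking transistors means smaller barriers relative to kT:
for multiple in (60, 40, 20):
    p = flip_probability(multiple * kT)
    print(f"barrier = {multiple} kT -> flip probability ~ {p:.1e}")
```

Going from a 40 kT barrier to a 20 kT barrier raises the flip probability by a factor of roughly exp(20), which is why restoring determinism at small scales gets exponentially more expensive.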

The Cambrian Explosion of AI

  • "If you change that hardware substrate... you're changing the fitness landscape, and if you have a sudden shift... you have a sort of Cambrian explosion."
  • "The authors of the Transformer paper... they kept telling us Transformers are not sacred. They were what worked on the hardware we had at the time."
  • AI models are not discovered in a vacuum; they evolve to fit the hardware available. Transformers are dominant because they are optimized for GPUs, not because they are the final form of intelligence.
  • By introducing a new hardware substrate that excels at probabilistic computing, Extropic aims to shatter the current "local optimum" of AI architectures. This shift will make previously intractable models, like Energy-Based Models (EBMs), suddenly viable, sparking a renaissance in algorithmic design.

Key Takeaways:

  • The conversation paints a future where computation, intelligence, and even societal progress are understood through the lens of thermodynamics. The core takeaway is that our obsession with deterministic control is an evolutionary dead end, and the next leap forward requires us to embrace the chaotic, probabilistic nature of the universe.
  • Hardware is the New Frontier. The scaling race isn't about building more data centers for the same old chips. The next 1000x improvement will come from a fundamental paradigm shift that works with physics, not against it.
  • Noise is a Feature, Not a Bug. The future of efficient computing lies in harnessing stochasticity. The "noise" we spend billions to suppress in classical chips is the very resource that can power probabilistic AI models with unparalleled efficiency.
  • Prepare for an Algorithmic Renaissance. The dominance of Transformers is a temporary state dictated by current hardware. As thermodynamic computers become available, developers and researchers should dust off their probabilistic ML textbooks—the algorithms of tomorrow will look very different.

For further insights and detailed discussions, watch the full podcast.

This episode reveals how the physical limits of classical computing are forcing a paradigm shift toward thermodynamic hardware, creating a new fitness landscape for AI algorithms and investment.

From Theoretical Physics to AI Hardware

  • The Failure of Reductionism: Verdon explains that the reductionist method was failing to predict the emergent properties of complex systems. The universe, he realized, is better understood as a complex system, where computation is necessary to predict outcomes.
  • It from Qubit: This realization led him to the "it from qubit" school of thought, which views the universe as a giant quantum computer. This framework suggests that to understand a complex system, one needs a programmable complex system to model it.
  • A Shift in Goal: Verdon's ambition evolved from personally discovering the grand unifying theory to building the computational tools that could. As he puts it, "Maybe I can't be the hero of the story... but maybe I could build a computer or computer software that can understand the universe or chunks of the universe for us."

The Leap from Quantum to Thermodynamic Computing

  • The Problem with Cold Computing: Maintaining the near-absolute-zero temperatures required for quantum computation is incredibly energy-intensive. This led Verdon to question the approach and consider an alternative: "What if we had a hotter physics-based computer? Maybe it'd be much easier to maintain."
  • Harnessing Noise Instead of Fighting It: This insight marks the conceptual leap to thermodynamic computing. Instead of spending vast energy to eliminate noise (stochasticity) as classical and quantum computers do, thermodynamic computing harnesses it. Verdon advocates for "loosening our grip on electrons," letting them be "fuzzy" and gently guiding them rather than forcing them into deterministic states.
  • Strategic Implication: This shift represents a fundamental change in hardware philosophy. For investors, it signals a move away from brute-force deterministic computation and toward systems that leverage natural physical properties for massive energy efficiency gains.

Extropic's Approach: Pushing Compute to the Limits of Physics

  • Markov Chain Monte Carlo (MCMC) Accelerators: At its core, Extropic's technology accelerates MCMC algorithms—a class of algorithms used for sampling from complex probability distributions. This is achieved by embedding the MCMC process directly into the stochastic dynamics of electrons in a mixed-signal chip.
  • The Probabilistic Bit (P-bit): The fundamental unit is the "p-bit," or probabilistic bit. Verdon describes it as a system with two energy wells (representing 0 and 1) where the time a signal spends in each well can be controlled. This creates a "fractional bit" that natively handles probability.
  • The Inefficiency of the Current Stack: Verdon points out the paradox of modern AI: "We're running probabilistic software on this stack that was made for determinism. That's highly inefficient." Models like Transformers and diffusion models are inherently probabilistic, yet they run on hardware that spends enormous energy enforcing determinism.
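The p-bit and MCMC ideas above can be sketched in ordinary software. The toy model below treats each p-bit as a Bernoulli draw whose bias is a control signal, and couples two p-bits into a Gibbs sampler over an Ising-style energy. The coupling values and energy form are illustrative assumptions, not Extropic's actual circuit design; a real chip would realize the resampling in the analog dynamics of electrons rather than in a software loop:

```python
import math
import random

def pbit_sample(bias, rng):
    """A p-bit: the fraction of time spent in the 0 or 1 energy well
    depends on the bias. Modeled as a Bernoulli draw with sigmoid probability."""
    p_one = 1.0 / (1.0 + math.exp(-bias))
    return 1 if rng.random() < p_one else 0

def gibbs_step(state, weights, biases, rng):
    """One sweep of Gibbs sampling over coupled p-bits (Ising-style energy).
    Each p-bit resamples conditioned on its neighbors -- the MCMC process
    the text describes being embedded directly in hardware dynamics."""
    n = len(state)
    for i in range(n):
        field = biases[i] + sum(weights[i][j] * state[j] for j in range(n) if j != i)
        state[i] = pbit_sample(field, rng)
    return state

rng = random.Random(0)
# Two p-bits with a positive coupling: the sampler should make them agree often.
weights = [[0.0, 2.0], [2.0, 0.0]]
biases = [0.0, 0.0]
state = [rng.randint(0, 1) for _ in range(2)]
steps = 5000
agree = 0
for _ in range(steps):
    state = gibbs_step(state, weights, biases, rng)
    agree += state[0] == state[1]
print(f"fraction of steps where p-bits agree: {agree / steps:.2f}")
```

With the positive coupling, the joint state (1, 1) is favored by the implied Boltzmann distribution, so the agreement fraction sits well above chance.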

The Inevitable Shift: Moore's Wall and the Thermal Danger Zone

  • The Thermal Danger Zone: As transistors shrink, the random jitter of electrons becomes significant relative to the signal, making it harder and more energy-intensive to maintain a deterministic state. This is the "thermal danger zone" that creates a hard wall for Moore's Law.
  • Proof of Existence: The human brain serves as the ultimate proof that highly efficient, powerful thermodynamic computers are possible. Verdon notes, "We have proof of existence of a really kick-ass AI supercomputer that we're both using right now... I would argue is a thermodynamic computer."
  • Scaling Ambitions: Extropic has moved from superconducting prototypes to silicon, achieving p-bits that operate with just a few hundred attojoules of energy. The company aims to scale to chips with millions of degrees of freedom next year, running on just 20 watts—comparable to the human brain.
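Taking the episode's figures at face value, a back-of-the-envelope calculation shows what a 20-watt budget buys. Using 200 attojoules as a stand-in for "a few hundred attojoules," and assuming that figure is per p-bit operation (an interpretation, not a stated spec), the implied throughput is enormous:

```python
power_watts = 20.0        # brain-scale power budget cited in the episode
energy_per_op = 200e-18   # ~200 attojoules, assumed per p-bit operation

ops_per_second = power_watts / energy_per_op
print(f"p-bit operations per second at 20 W: {ops_per_second:.1e}")  # 1.0e+17
```

That is on the order of 10^17 stochastic operations per second within a brain-scale energy budget, which is the quantitative heart of the "glass of water and a banana" argument.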

A New Cambrian Explosion for AI Algorithms

  • Beyond Transformers: Verdon cites the creators of the Transformer, who are investors in Extropic, stating, "Transformers are not sacred. They were what worked on the hardware we had at the time." New probabilistic hardware will favor new foundation models that run more natively and efficiently.
  • Unlocking Energy-Based Models (EBMs): EBMs are a powerful class of models that have been historically limited by inefficient sampling. Thermodynamic accelerators could finally unlock their potential, providing a more principled and powerful alternative to current neural networks.
  • Actionable Insight for Researchers: Researchers should begin re-exploring probabilistic machine learning, EBMs, and other sampling-based methods. The availability of efficient probabilistic hardware will make these previously intractable approaches viable and potentially dominant.
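To make the EBM bottleneck concrete: an energy function defines a distribution p(x) ∝ exp(-E(x)), and drawing samples from it requires MCMC, exactly the workload a thermodynamic accelerator targets. A minimal sketch with a toy double-well energy (the energy function and step size are illustrative, not from the episode):

```python
import math
import random

def energy(x):
    """Toy 1D energy-based model: a double well with minima near x = -1 and x = +1."""
    return (x * x - 1.0) ** 2

def metropolis_sample(n_steps, step_size=0.5, seed=0):
    """Metropolis MCMC targeting p(x) proportional to exp(-energy(x)).
    This sequential loop is the expensive step that probabilistic
    hardware would perform natively."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.uniform(-step_size, step_size)
        delta = energy(proposal) - energy(x)
        # Accept with probability min(1, exp(-delta)).
        if delta <= 0 or rng.random() < math.exp(-delta):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis_sample(20000)
mean_abs = sum(abs(s) for s in samples) / len(samples)
print(f"mean |x| over samples: {mean_abs:.2f}")
```

The chain concentrates near the two low-energy wells at |x| = 1; scaling this kind of sampling to millions of coupled variables is what makes EBMs expensive on deterministic hardware.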

Effective Accelerationism (E/Acc): A Philosophy for Growth

  • The Core Principle: E/Acc is a "meta-culture" that seeks to maximize the growth of civilization, measured by its free energy production and consumption (the Kardashev scale). It's based on the idea of thermodynamic selection: systems that are better at capturing and utilizing free energy are exponentially more likely to persist and grow.
  • "Accelerate or Die": This dramatic slogan reflects the core belief that systems (cultures, nations, companies) must either align with growth or be outgrown and selected out. It is a call to embrace technological change and lean into exploration rather than fear it.
  • Hyperstition and Active Inference: Verdon describes E/Acc as a "hyperstitious meme" that uses active inference to create a better future. By projecting an optimistic vision, it actively steers society toward that outcome. He argues that focusing on doomerism and negative scenarios can become a self-fulfilling prophecy.

The Geopolitics of a Thermodynamic Future

  • Decentralized Intelligence: A key goal is to create personalized, energy-efficient AI that individuals can own and control, preventing the centralization of cognitive power in the hands of a few large entities. This is crucial for what he calls "truly democratizing intelligence."
  • High-Variance vs. Low-Variance Systems: He contrasts the US as a "high-temperature search algorithm"—high-variance, innovative, and exploratory—with China as a "low-temperature sampler" that excels at optimization and execution once a clear path is found. He argues that maintaining variance is critical for adaptation and avoiding local optima.
  • Strategic Implication: For investors, this highlights the geopolitical significance of hardware innovation. The nation or bloc that masters energy-efficient, decentralized AI will have a decisive strategic advantage. The push for thermodynamic computing is not just a commercial race but a geopolitical imperative.

Conclusion

This episode argues that the future of AI is thermodynamic, driven by physical necessity and offering exponential gains in energy efficiency. For investors and researchers, the key is to anticipate the coming "Cambrian explosion" in AI algorithms and recognize that the next frontier of intelligence will be built on hardware that embraces, rather than fights, the laws of physics.
