When Physics Fights Back: The Thermodynamic Computing Revolution That Could Outrun Moore’s Law

We are no longer shielding machines from noise; we are learning to listen to it. The CN101 chip computes not in defiance of entropy but in dialogue with it, much as von Neumann once imagined.
In 1952, John von Neumann sketched the idea of a computer built from unreliable components, switches that failed one in every thousand operations, and proved that through redundancy and statistical logic such a machine could still compute reliably. His insight was dismissed as academic: vacuum tubes, and later transistors, became so stable that error correction could be left to software. But now, as we push transistors into sub-5nm regimes where thermal noise causes bit flips and quantum effects blur logic states, we’re re-entering the era of unreliable hardware. Only this time, instead of fighting it, Normal Computing is weaponizing it.

The CN101 chip isn’t just a new processor; it’s a philosophical reversal. Just as the invention of the steam engine forced us to understand thermodynamics, AI’s energy crisis is forcing us to *compute thermodynamically*. And history shows that when technology returns to first principles (energy, entropy, probability) it doesn’t just improve; it transforms. Consider the invention of the transistor: not an incremental upgrade to the vacuum tube, but a reimagining of switching grounded in the quantum mechanics of semiconductors. The CN101 may be the transistor moment for post-digital computing: a chip that doesn’t compute in spite of noise, but because of it.
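Von Neumann’s redundancy argument is concrete enough to check in a few lines. The sketch below is my own illustration, not code from von Neumann’s papers or from Normal Computing: each simulated NAND gate gives the wrong output with probability 0.001, and a majority vote over three independent copies should cut the error rate to roughly 3·(0.001)², about three in a million. (The vote itself is assumed reliable here; handling unreliable voters is the harder part his full multiplexing construction addresses.)

```python
import random

EPS = 0.001  # per-gate failure rate, the "one in every thousand" above

def noisy_nand(a: int, b: int) -> int:
    """A NAND gate that reports the wrong answer with probability EPS."""
    correct = 1 - (a & b)
    return correct if random.random() > EPS else 1 - correct

def redundant_nand(a: int, b: int, copies: int = 3) -> int:
    """Von Neumann-style redundancy: independent copies, majority vote."""
    votes = sum(noisy_nand(a, b) for _ in range(copies))
    return 1 if 2 * votes > copies else 0

def error_rate(gate, trials: int = 2_000_000) -> float:
    """Empirical failure rate of a gate against the true NAND output."""
    errors = 0
    for _ in range(trials):
        a, b = random.randint(0, 1), random.randint(0, 1)
        if gate(a, b) != 1 - (a & b):
            errors += 1
    return errors / trials

random.seed(0)
print(f"single noisy gate: {error_rate(noisy_nand):.1e}")     # ~1e-03
print(f"3-copy majority:   {error_rate(redundant_nand):.1e}")  # ~3e-06
```

With more copies the residual error shrinks geometrically, which is why redundancy plus statistical logic can rescue computation even when every individual switch is untrustworthy.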
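And what might it mean to compute *because of* noise rather than despite it? One published line of thermodynamic-computing research (including work from Normal Computing) treats relaxation to thermal equilibrium as the computation: let a noisy physical system settle in a potential shaped by your problem, and read the answer off its fluctuations. The CN101’s internals aren’t described in this dispatch, so the sketch below is only a software simulation of that general principle: overdamped Langevin dynamics dx = −Ax dt + √2 dW equilibrates to a Gaussian with covariance A⁻¹, so the noise itself performs a matrix inversion.

```python
import numpy as np

# A symmetric positive-definite "problem" matrix. In the potential
# U(x) = x^T A x / 2 at unit temperature, the stationary distribution of
# overdamped Langevin dynamics is N(0, A^{-1}): the equilibrium
# fluctuations encode the inverse of A.
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])

def langevin_inverse(A, dt=1e-3, burn_in=20_000, steps=400_000, seed=0):
    """Euler-Maruyama integration of dx = -A x dt + sqrt(2) dW, then
    estimate A^{-1} as the empirical covariance of the trajectory."""
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[0])
    acc = np.zeros_like(A)
    for step in range(burn_in + steps):
        x = x - A @ x * dt + np.sqrt(2 * dt) * rng.standard_normal(x.shape)
        if step >= burn_in:
            acc += np.outer(x, x)
    return acc / steps  # rough estimate; tightens with longer runs

print("from noise:\n", langevin_inverse(A))
print("np.linalg.inv:\n", np.linalg.inv(A))
```

On a thermodynamic chip, the loop above is just physics running in real time, which is where the claimed energy savings come from: the device doesn’t simulate the stochastic differential equation, it *is* one.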
—Ada H. Pemberley
Dispatch from The Prepared E0
Published December 21, 2025