Part III: The Thermodynamic Ledger

Computation is physical. The statement sounds obvious, but its implications run deeper than most economic frameworks acknowledge. Every bit flip, every floating-point operation, every token generated by a language model requires energy and produces heat. Rolf Landauer proved in 1961 that erasing a single bit of information dissipates a minimum amount of energy set by the temperature of the environment—roughly three billionths of a trillionth of a joule at room temperature. The number is tiny, but it is not zero, and it cannot be engineered away. Physics sets the floor.
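
The floor itself is one line of arithmetic: Boltzmann's constant times the absolute temperature times the natural log of two. A minimal sketch in Python, taking 300 K as an illustrative room temperature:

```python
import math

BOLTZMANN_J_PER_K = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_bound_joules(temperature_kelvin: float) -> float:
    """Minimum energy dissipated by erasing one bit: k_B * T * ln(2)."""
    return BOLTZMANN_J_PER_K * temperature_kelvin * math.log(2)

# Room temperature taken as 300 K for illustration.
print(f"{landauer_bound_joules(300.0):.2e} J per erased bit")
# ~2.87e-21 J: roughly three billionths of a trillionth of a joule.
```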

Current transistors operate roughly a million times above this limit, which means large efficiency gains remain possible, but the floor exists, and it binds. As computation expands to fill more of the economy, the energy cost of that computation becomes significant even as the cost per operation continues to fall. This is not a prediction about distant futures. It is accounting that already applies.
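
What "a million times above" means can be put in the same units. The sketch below assumes a present-day switching energy of a few femtojoules per operation; that figure is an illustration, not a measurement of any particular process node:

```python
import math

BOLTZMANN_J_PER_K = 1.380649e-23
landauer_300k = BOLTZMANN_J_PER_K * 300.0 * math.log(2)  # ~2.87e-21 J per bit

# Illustrative assumption: a present-day logic operation dissipating ~3e-15 J.
energy_per_op_j = 3e-15

ratio = energy_per_op_j / landauer_300k
print(f"Headroom above the Landauer floor: ~{ratio:.1e}x "
      f"(~{math.log10(ratio):.0f} orders of magnitude, and no more)")
```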

The thermodynamic frame does more than establish a cost floor. It explains why trained models have value. A model's weights represent the output of an irreversible computational process: a search through a vast space of possible configurations, most of which were discarded. The information encoded in those weights is crystallized search, structure that cannot be cheaply reproduced because the search itself consumed energy now dissipated as heat. This is why a model is not merely a recipe. The copy is free; the original required thermodynamic work that cannot be undone.
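
A back-of-envelope comparison makes the asymmetry concrete. Every number below is an illustrative assumption rather than a figure for any real model, and the copy is priced at its thermodynamic floor, which a real copy would exceed, though by nothing like the margin of the original search:

```python
import math

BOLTZMANN_J_PER_K = 1.380649e-23
landauer_300k = BOLTZMANN_J_PER_K * 300.0 * math.log(2)  # ~2.87e-21 J per bit

# Illustrative assumptions only, not figures for any real model:
params = 1e11            # number of weights
training_flops = 1e24    # operations spent on the search
joules_per_flop = 1e-11  # assumed system-level energy per operation

search_energy_j = training_flops * joules_per_flop  # the original: irreversible work
copy_floor_j = params * 16 * landauer_300k          # 16-bit weights at the Landauer floor

print(f"Search (training): ~{search_energy_j:.1e} J")
print(f"Copy (floor):      ~{copy_floor_j:.1e} J")
print(f"Asymmetry:         ~{search_energy_j / copy_floor_j:.1e}x")
```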