The Key Input

The machine does not isolate man from the great problems of nature but plunges him more deeply into them.

Antoine de Saint-Exupéry, Wind, Sand and Stars (1939), ch. III

If capital migrates toward whatever relaxes the binding constraint, then the question for any era is: what input, once scarce or expensive, has become abundant enough to reorganize production around it? Each major technological revolution has produced such an input, a resource or capability whose cost dropped so dramatically that it became economical to redesign entire systems around its availability.

Carlota Perez calls this the key input of a techno-economic paradigm (Perez 2002). Perez gets cited; she rarely gets read. The installation phase / deployment phase distinction is not about technology adoption curves. It is about the relationship between financial capital and productive capital. The concept is precise: not merely a useful commodity, but one whose cost structure has shifted enough to create a new design logic. The key input is what everyone must have and what, for a time, only some can provide.

Paradigm shifts arrive as cost revolutions: when the cost structure flips, old designs become uneconomic.

Consider two cases in detail: the first industrial revolution and the information age.

Arkwright's water-frame, patented in 1769, made it possible to spin cotton thread by machine rather than by hand. The machine was a discontinuity. It changed the feasible scale of spinning. A single water-powered mill could do the work of dozens of cottage spinners, and it could run day and night as long as the river flowed. Arkwright's frame mattered because it made cotton thread cheap at scale. That cost collapse, more than the device itself, reorganized the system. The thread created a market for power looms; the looms created a market for bleaching and dyeing at industrial scale; the dyes created a market for chemical synthesis. The factory system emerged because the economics of machine production required workers, machines, and power in one place, under one discipline. The organizational innovation followed the new cost structure.

Two centuries later, Intel's 4004 microprocessor made it possible to put computation on a chip small enough to embed in devices of all kinds. The discontinuity was cost per operation. Mainframes had been powerful for decades, but what had required a room-sized machine and a team of operators could now be done by a component costing a few dollars. The key input was cheap, embeddable computation. The design consequences cascaded: personal computers, then networked computers, then phones and cars and appliances with processors inside, then the internet as a coordination layer, then platforms built on the assumption that everyone could compute and communicate at near-zero marginal cost. The organizational innovations (modular production, outsourcing, just-in-time logistics, platform business models) followed the new cost structure.

The remaining paradigms fit the same pattern.

Steam and railways reorganized geography itself. Before the Liverpool and Manchester Railway opened in 1830, overland transport moved at speeds that had improved only modestly over centuries; within a generation, goods and people moved at velocities that would have seemed miraculous to their grandparents. The key input was coal and mechanical power; the organizational innovation was the hierarchical corporation capable of coordinating complex operations across hundreds of miles of track, with standardized schedules and signals.

Steel and electricity enabled a different kind of scale: vertical integration, where a single firm controlled everything from raw ore to finished product. Carnegie's Bessemer plants and Edison's power stations were the exemplars. Cheap structural metal and distributed power made possible factories, bridges, and buildings that the iron age could not have supported.

Oil and mass production reorganized space itself. The Ford Model T and the moving assembly line created the twentieth-century landscape: suburbs, highways, shopping centers, logistics networks, consumer credit, and the assumption that personal mobility was a default condition rather than a luxury.

Each transition required decades to unfold, and each produced its own financial bubbles, crashes, and institutional crises as the old common sense gave way to the new. The question is whether we are now in the early phase of another such transition, and if so, what the key input is.

The assets decisive in one paradigm become necessary but not sufficient in the next. Factories did not disappear when software became the key input; they became commoditized, insufficient on their own to confer advantage. Software and network effects remain essential in the current transition, but they have become table stakes, necessary but no longer differentiating. What binds now, and who holds the key?

The outlines of the emerging paradigm are now visible. Since 2017, and especially since 2022, the development of foundation models has produced capabilities that were not anticipated even a few years earlier. These models perform cognitive tasks that previously required human labor: drafting, summarizing, coding, analyzing, translating, reasoning through problems. They require enormous computation to train, with frontier models now costing hundreds of millions to billions of dollars, most of which is spent on hardware and electricity, and significant computation to run at inference scale.
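The "hundreds of millions to billions" figure follows from straightforward cluster arithmetic. A back-of-envelope sketch, with every input an assumption for illustration rather than a disclosed price:

```python
# Illustrative capex arithmetic for a frontier training cluster.
# All figures are assumptions for the sketch, not disclosed prices.
gpus = 25_000             # accelerators in the cluster (assumption)
price_per_gpu = 30_000    # USD per H100-class part, ballpark (assumption)
system_multiplier = 1.7   # servers, networking, storage around the GPUs (assumption)

hardware_capex = gpus * price_per_gpu * system_multiplier
print(f"~${hardware_capex / 1e9:.1f}B in hardware before the building or the power")
# ~$1.3B: consistent with "hundreds of millions to billions" once
# electricity, land, and staffing are added.
```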

The term "computation" requires precision here. A calculator computes. A spreadsheet computes. Neither participates in the dynamic that matters. What distinguishes foundation models is not computation in the generic sense but learned inference: the execution of models trained through gradient descent on empirical data, capable of generalizing to novel contexts without explicit reprogramming. A calculator executes instructions. It amplifies a choice already made. A foundation model originates judgment. The key input is not computation generically but the conversion of electricity into inference on trained representations—plus the training processes that create those representations in the first place.

The distinction reshapes infrastructure economics. Earlier software was architectural: purpose-built structures with fixed footprints. A payroll system required specific resources; doubling the servers did not double the payroll's utility. Learned inference behaves differently. It expands to fill whatever volume contains it. More compute yields better outputs along a continuous frontier. Demand is not a fixed quantity that infrastructure eventually satisfies. Demand is pressure that rises to meet supply until price constrains it. Prior computation reached saturation. This computation reaches only the walls of its container—and pushes against them.
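One hedged way to make "more compute yields better outputs" concrete: empirical scaling studies of language-model pretraining report that loss falls as a smooth power law in training compute, roughly of the form

\[
L(C) \approx \left(\frac{C_0}{C}\right)^{\alpha}, \qquad \alpha \text{ on the order of } 0.05,
\]

where \(C\) is training compute and \(C_0\) a fitted constant. The exact exponent varies by study and setup; what matters for the argument is the shape. A power law has no interior saturation point, so within measured ranges every additional unit of compute buys some improvement, which is exactly what makes demand behave like pressure rather than a fixed quantity.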

The physical specifics therefore matter in a way they did not for prior paradigms. They divide into three constraint categories.

Compute. A single training run for a frontier model may consume tens of gigawatt-hours of electricity over several months. The chips that perform this computation are supply-constrained at multiple points: TSMC's advanced packaging capacity, Nvidia's allocation of H100 and successor GPUs, the availability of high-bandwidth memory. These create queues that money alone cannot clear. Lead times for frontier-grade hardware often extend twelve to eighteen months even for well-capitalized buyers.
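The arithmetic behind "tens of gigawatt-hours" is worth pausing on before the second category. A back-of-envelope sketch, with every figure an assumption chosen to be plausible for an H100-class cluster rather than a number reported for any actual run:

```python
# Back-of-envelope energy for a frontier training run. Every figure is
# an assumption chosen to be plausible, not a number from any actual run.
gpus = 25_000          # accelerators in the cluster (assumption)
watts_per_gpu = 700    # H100-class board power, nominal TDP
overhead = 1.4         # cooling, networking, conversion losses (assumed factor)
days = 100             # wall-clock training time (assumption)

power_mw = gpus * watts_per_gpu * overhead / 1e6
energy_gwh = power_mw * days * 24 / 1_000
print(f"{power_mw:.1f} MW sustained, {energy_gwh:.0f} GWh total")
# ~24.5 MW sustained, ~59 GWh: "tens of gigawatt-hours" over ~3 months
```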

Power. The data centers that host these workloads are industrial facilities with power demands measured in hundreds of megawatts, comparable to aluminum smelters or steel mills. Interconnection queues in U.S. grid regions now exceed 2,600 GW of requested capacity nationally, with processing timelines stretching to five years from request to commercial operation (Rand et al. 2024). The constraint includes not just generation but transmission, substations, and the physical infrastructure required to move electrons from source to load.

Supply chain. Cooling systems, water permits, transformer lead times (now commonly 36 months, with maximum lead times reaching 60 months; U.S. Department of Energy 2024), and the availability of skilled labor for construction and operations all create additional bottlenecks. A hyperscaler announcing a 500 MW campus is announcing a multi-year, permit-bound construction program subject to regulatory, logistical, and procurement risk at every stage.

In this case, the key input is delivered as a service layered on physical plant: watts into chips into tokens into work.

If the pattern holds, the key input of the emerging paradigm is something specific: energy structured into computation, and computation structured into economically useful cognition. The three are inseparable. The cognition cannot exist without the computation, and the computation cannot exist without the energy. The cost of a token, the marginal unit of model output, is falling rapidly, but the cost of the infrastructure required to produce tokens at scale is rising. The constraint is shifting from software to physical plant.
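The chain from watts to tokens can be priced at its bottom layer. A sketch of the electricity cost per token, with all inputs assumed for illustration (throughput, overheads, and rates vary widely in practice):

```python
# Electricity cost per token, bottom layer of "watts into tokens into work".
# All inputs are illustrative assumptions; real deployments vary widely.
gpu_watts = 700            # accelerator board power (assumption)
system_overhead = 1.5      # host, networking, cooling share (assumption)
tokens_per_second = 1_500  # served throughput per accelerator (assumption)
usd_per_kwh = 0.08         # industrial electricity rate (assumption)

joules_per_token = gpu_watts * system_overhead / tokens_per_second
kwh_per_million_tokens = joules_per_token * 1e6 / 3.6e6
cost = kwh_per_million_tokens * usd_per_kwh
print(f"{joules_per_token:.2f} J/token, ~${cost:.3f} of electricity per 1M tokens")
# ~0.70 J/token and roughly $0.016 per million tokens: the watts are cheap;
# the plant that converts them is not.
```

The electricity floor is tiny next to prevailing token prices, which underscores the point above: the scarce thing is not the energy itself but the capital stock that converts it into inference.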

The hinge moment is 2022: the year foundation models crossed the threshold from research curiosity to deployed capability. ChatGPT's launch in November of that year was the moment the new common sense became visible to the wider economy. The underlying models had been developing for years, but within months of that launch, enterprises were experimenting with integration; within a year, inference infrastructure had become a bottleneck. The installation phase, in Perez's terms, has begun. What remains uncertain is how long the installation phase will last, how severe its characteristic financial crises will be, and which assets will prove decisive when the deployment phase arrives.

| Paradigm | Marker Event | Key Input |
|---|---|---|
| Industrial Revolution (1771) | Arkwright's mill | Cotton, water power |
| Steam & Railways (1829) | Rocket trials, Liverpool–Manchester railway | Coal, steam |
| Steel & Electricity (1875) | Carnegie Bessemer plant | Steel, electricity |
| Oil & Mass Production (1908) | Ford Model T | Oil, petrochemicals |
| Information (1971) | Intel 4004 | Chips, software |
| Factor Prime (2022–) | Foundation models deployed | Energy structured into computation |

Table II.1: Six Techno-Economic Paradigms

The table is a simplification. Each transition unfolded over decades, and the marker event is a symbol rather than a cause. But the pattern is consistent: a key input whose cost structure shifts dramatically, a cascade of applications that redesign systems around that input, and a reorganization of what counts as capital.