The Innovation Fallacy

The core concept that has to be grasped to make sense of the future looming up before us, it seems to me, is the concept of limits. Central to ecology, and indeed all the sciences, this concept has failed so far to find any wider place in the mindscape of industrial society. The recent real estate bubble is simply another example of our culture’s cult of limitlessness at work, as real estate investors insisted that housing prices were destined to keep on rising forever. Of course those claims proved to be dead wrong, as they always are, but the fact that they keep on being made – it’s been only a few years, after all, since the same rhetoric was disproven just as dramatically in the tech stock bubble of the late 1990s – shows just how allergic most modern people are to the idea that there’s an upper limit to anything.

It’s this same sort of thinking that drives the common belief that limits on industrial society’s access to energy can be overcome by technological innovations. This claim looks plausible at first glance, since the soaring curve of energy use that defines recent human history can be credited to technological innovations that allowed human societies to get at the huge reserves of fossil fuels stored inside the planet. The seemingly logical corollary is that we can just repeat the process, coming up with innovations that will give us ever-increasing supplies of energy forever.

Most current notions about the future are based on some version of this belief. The problem, and it’s not a small one, is that the belief itself is based on a logical fallacy.

One way to see how this works – or, more precisely, doesn’t work – is to trace the same process in a setting less loaded with emotions and mythic narratives than the future of industrial society. Imagine for a moment, then, that we’re discussing an experiment involving microbes in a petri dish. The culture medium in the dish contains 5% of a simple sugar that the microbes can eat, and 95% of a more complex sugar they don’t have the right enzymes to metabolize. We put a drop of fluid containing microbes into the dish, close the lid, and watch. Over the next few days, a colony of microbes spreads through the culture medium, feeding on the simple sugar.

Then a mutation happens, and one microbe starts producing an enzyme that lets it feed on the more abundant complex sugar. Drawing on this new food supply, the mutant microbe and its progeny spread rapidly, outcompeting the original strain, until finally the culture medium is full of mutant microbes. At this point, though, the growth of the microbes is within hailing distance of the limits of the supply of complex sugar. As we watch the microbes through our microscopes, we might begin to wonder whether they can produce a second mutation that will let them continue to thrive. Yet this obvious question misleads, because there is no third sugar in the culture medium for another mutation to exploit.

The point that has to be grasped here is as crucial as it is easy to miss. The mutation gave the microbes access to an existing supply of highly concentrated food; it didn’t create the food out of thin air. If the complex sugar hadn’t existed, the mutation would have yielded no benefit at all. As the complex sugar runs out, further mutations are possible – some microbes might end up living on microbial waste products; others might kill and eat other microbes; still others might develop some form of photosynthesis and start creating sugars from sunlight – but all these possibilities draw on resources much less concentrated and abundant than the complex sugar that made the first mutation succeed so spectacularly. Nothing available to the microbes will allow them to continue to flourish as they did in the heyday of the first mutation.
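For readers who like to see the arithmetic, here’s a minimal sketch of the petri dish in code. Every constant in it is invented for illustration – this is not a calibrated microbiology model – but the shape of the result doesn’t depend on the tuning: the mutant strain’s boom ends when the complex sugar it unlocked runs out.

```python
# A toy model of the petri dish experiment described above. All the rate
# constants are made up for illustration only.

simple_sugar = 5.0    # the sugar the original strain can eat
complex_sugar = 95.0  # the sugar only the mutant strain can eat
original = 0.01       # starting population of the original strain
mutant = 0.0          # the mutant lineage appears later

YIELD = 0.5  # population produced per unit of sugar consumed
RATE = 0.8   # sugar consumed per unit of population per step

for step in range(61):
    if step == 10:
        mutant = 0.001  # the mutation happens: one lineage gains the enzyme

    # Each strain eats from the pool it can metabolize, limited both by
    # its own size and by what remains of that pool.
    eaten_simple = min(simple_sugar, RATE * original)
    eaten_complex = min(complex_sugar, RATE * mutant)

    simple_sugar -= eaten_simple
    complex_sugar -= eaten_complex
    original += YIELD * eaten_simple
    mutant += YIELD * eaten_complex

    if step % 10 == 0:
        print(f"step {step:2d}: original={original:6.2f} mutant={mutant:6.2f} "
              f"complex sugar left={complex_sugar:6.2f}")

# The mutation gave access to an existing stock of food; it didn't create
# food. Once complex_sugar hits zero, no tuning of RATE or YIELD keeps the
# mutant population growing.
```

Run it and the printout shows the familiar shape: a slow start, explosive growth, then a plateau dictated by the size of the sugar stock rather than by the cleverness of the mutation.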

Does this same logic apply to human beings? A cogent example from 20th century history argues that it does. When the Second World War broke out in 1939, Germany arguably had the most innovative technology on the planet. All through the war, German technology stayed far ahead of the opposition, fielding jet aircraft, cruise missiles, ballistic missiles, guided bombs, and many other advances years before anybody else. Germany’s great vulnerability was a severe shortage of petroleum reserves, and even this area saw dramatic technological advances: Germany had developed effective methods of coal-to-liquids (CTL) fuel production before the war, and ramped them up once it became clear that the oil fields of southern Russia were permanently out of reach.

The results are instructive. Despite every effort to replace petroleum with CTL and other energy resources, the German war machine ran out of gas. By 1944 the Wehrmacht was struggling to find fuel even for essential operations. Many military historians attribute the outcome of the Battle of the Bulge in the winter of 1944-45 to the raw fact that the German forces didn’t have enough fuel to follow up the initial success of their Ardennes offensive. The most innovative technology on the planet, backed by substantial coal reserves and an almost limitless supply of slave labor, proved unable to find a replacement for cheap, abundant petroleum.

It’s worth noting that more than sixty years later, no one has done any better. Compare the current situation with the last two energy transitions – the transition from wind and water power to coal in the late 18th and early 19th centuries, and the transition from coal to petroleum at the beginning of the 20th – and a key distinction emerges. In both of the earlier cases, the new energy resource took a dominant place in the industrial world’s economies while the older ones were still very much in use. The world wasn’t in any great danger of running out of wind and water in 1750, when coal became the mainspring of the industrial revolution, and peak coal was still far in the future in 1900, when oil seized King Coal’s throne.

The new fuels took over because they were more concentrated and abundant than the competition, and those factors made them more economical than older resources. In both cases a tide of technological advance followed the expansion of energy resources, and was arguably an effect of that expansion rather than its cause. In the 1950s and 1960s many people expected nuclear power to repeat the process – those of my readers who were around then will recall the glowing images of atomic-powered cities in the future that filled the popular media in those days. Nothing of the kind happened, because nuclear power proved to be much less economical than fossil fuels. Only massive government subsidies, alongside the hidden “energy subsidy” it received from an economy powered by cheap fossil fuels, made nuclear power look viable at all.

Mind you, uranium contains a very high concentration of energy, though the complex systems needed to mine, process, use, and clean up after it probably consume more energy than the uranium itself contains. Most other resources touted as solutions to peak oil either contain much less energy per unit of weight or volume than petroleum, or occur in much lower abundance. This isn’t accidental; the laws of thermodynamics mandate that, on average, the more concentrated an energy source is, the less abundant it will be, and vice versa. They also mandate that energy flows from higher to lower concentrations, which means you can’t concentrate energy without using energy to do it. Really large amounts of concentrated energy occur on Earth only as side effects of energy cycles in the biosphere that unfold over geological time – that’s where coal, oil, and natural gas come from – and then only in specific forms and locations. It took some 500 million years to create our planet’s stockpile of fossil fuels. Once they’re gone, what’s left is mostly diffuse sources such as sunlight and wind, and trying to concentrate these enough to power industrial society is like trying to make a river flow uphill.
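The point about uranium can be restated as simple arithmetic. The sketch below uses invented figures – they are not measurements of any real fuel – to show why net energy, not gross energy content, is the quantity that matters: a source that costs nearly as much energy to exploit as it yields leaves next to nothing over for society.

```python
# Back-of-the-envelope net energy arithmetic. All figures are hypothetical,
# chosen only to illustrate the structure of the problem.

def net_energy(gross_output, *energy_costs):
    """Energy left for society after paying all the energy inputs."""
    return gross_output - sum(energy_costs)

def eroei(gross_output, *energy_costs):
    """Energy returned on energy invested."""
    return gross_output / sum(energy_costs)

# Source A: highly concentrated and cheap to extract, like oil in its heyday.
# Costs: 2 units to extract, 2 to refine, 1 to distribute.
print(net_energy(100, 2, 2, 1), eroei(100, 2, 2, 1))        # 95 net, EROEI 20.0

# Source B: an energy-dense fuel with heavy energy costs to mine, process,
# use, and clean up after it – the worry raised about uranium above.
print(net_energy(100, 40, 35, 30), eroei(100, 40, 35, 30))  # -5 net, EROEI < 1
```

An EROEI below one means the source is an energy sink, no matter how much energy the fuel itself contains.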

Thus the role of technological innovation in the rise of industrial economies is both smaller and more nuanced than it’s often claimed to be. Certain gateway technologies serve the same function as the mutations in the biological model used earlier in this post; they make it possible to draw on already existing resources that weren’t accessible to other technological suites. At the same time, it’s the concentration and abundance of the resource in question that determines how much a society will be able to accomplish with it. Improvements to the gateway technology can affect this to a limited extent, but such improvements suffer from a law of diminishing returns backed up, again, by the laws of thermodynamics.
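One way to see those diminishing returns is against the one hard ceiling thermodynamics does hand us: the Carnot limit on heat-engine efficiency. In the sketch below, the limit itself is a genuine physical bound, while the improvement schedule – each generation of engineers closes half the remaining gap – is an assumption invented purely for illustration.

```python
# Diminishing returns on improving a gateway technology, sketched with the
# Carnot limit. The limit (1 - T_cold/T_hot) is a real thermodynamic bound
# on heat-engine efficiency; the half-the-gap improvement schedule is an
# assumed, purely illustrative trajectory.

T_HOT, T_COLD = 900.0, 300.0         # reservoir temperatures in kelvin
carnot_limit = 1.0 - T_COLD / T_HOT  # about 0.667; no engine can do better

efficiency = 0.20  # assumed first-generation efficiency
for generation in range(1, 8):
    gain = 0.5 * (carnot_limit - efficiency)  # each redesign halves the gap
    efficiency += gain
    print(f"generation {generation}: efficiency={efficiency:.3f} gain={gain:.3f}")

# Each generation's gain is smaller than the last, and none of them changes
# how much coal is in the ground.
```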

Innovation is a necessary condition for the growth and survival of industrial society, in other words, but not a sufficient one. If energy resources aren’t available in sufficient quality and quantity, innovation may still make a society successful, but it won’t make or maintain an industrial one. It’s worth suggesting that the maximum possible level of economic development in a society is defined by the abundance and concentration of the energy resources available to that society. It’s equally possible, though rather more speculative, that the maximum possible technological level of an intelligent species anywhere in the universe is defined by the abundance and concentration of energy resources on the planet where that species evolves. (We’ll be talking more about this in next week’s post.)

What we’re discussing here is an application of one of the central principles of ecology. Liebig’s law – named after the 19th century German chemist Justus von Liebig, who popularized it – holds that the maximum growth of a community of organisms is limited by whatever necessary factor in the environment is in shortest supply. A simpler way of stating this law is that necessary resources aren’t interchangeable. If your garden bed needs phosphorus, adding nitrogen to it won’t help, and if it’s not getting enough sunlight, all the fertilizer in the world won’t boost growth beyond a certain point.
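Liebig’s law translates into code as a single min() – here with made-up numbers for the garden bed example:

```python
# Liebig's law of the minimum, with hypothetical numbers. Each value is the
# growth the bed could support if that factor alone were the constraint.

supportable_growth = {
    "nitrogen": 8.0,
    "phosphorus": 2.0,   # the factor in shortest supply
    "sunlight": 6.0,
}

print(min(supportable_growth.values()))  # 2.0 – phosphorus sets the ceiling

supportable_growth["nitrogen"] += 100.0  # pile on nitrogen fertilizer
print(min(supportable_growth.values()))  # still 2.0 – resources don't swap
```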

For most of human history, the resource that has been in shortest supply has arguably been energy. For the last three hundred years, and especially for the last three-fourths of a century, that’s been less true than ever before. Today, however, the highly concentrated and abundant energy resources stockpiled by the biosphere over the last half billion years or so are running low, and there are no other resources on or around Earth at the same level of concentration and abundance. Innovation is vital if we’re to deal with the consequences of that reality, but it can’t make the laws of thermodynamics run backwards and give us an endless supply of concentrated energy just because we happen to want one.