The Butlerian Future

Science fiction has a very mixed track record so far in predicting the future. It’s scored some impressive hits – E.M. Forster’s harrowing 1909 prevision of Internet cyber-culture, “The Machine Stops,” comes to mind – but plenty of its predictions remain stubbornly unfulfilled, and quite a few of the major trends of the last half century got missed entirely by science fiction’s would-be prophets. Manned landings on the Moon were a staple of science fiction from Jules Verne until Apollo 11, and yet nobody in the SF scene even guessed at the immense cultural impact that television coverage of that first actual lunar landing would have. The thought that the Apollo flights would prove to be not the beginning of a golden age of space exploration but an extravagance too costly to push further out into the solar system – as in fact they did – would have been rejected out of hand in science fiction’s own golden age between the two world wars.

Still, it’s as often as not the indirect predictions in science fiction that prove the most prescient. The E.M. Forster story mentioned earlier wasn’t an attempt to foresee the internet; Forster described it as “a counter-blast to one of the early heavens of H.G. Wells,” and used it mostly to talk about the downside of his own culture’s obsession with ideas as a substitute for lived experience. In the same way, the science fiction novel I want to discuss here – Frank Herbert’s sprawling classic Dune – doesn’t claim to talk about the near future of our own society, but several of its central themes are likely to make the transition from speculative fiction to hard reality in the decades ahead of us.

Centuries before the events of Herbert’s novel, the universe he chronicled was convulsed by the Butlerian Jihad, a massive and violent popular movement against computer technology. “Once,” Herbert has one character explain to another, “men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them.” In the aftermath of the Butlerian Jihad, the human race went down a different path. As the same character comments: “The Great Revolt took away a crutch... It forced human minds to develop. Schools were started to train human talents.” By the time Dune opens, human beings fill many of the roles now entrusted to machines. Mentats, people trained in mnemonic and analytic skills who function as living computers, handle data processing; struggles between major power blocs employ assassins and highly trained special forces rather than the massed military technologies of today’s warfare; and secret societies such as the Bene Gesserit sisterhood pursue disciplines of mind-body mastery that give them astonishing powers over themselves and others.

Under present circumstances, mind you, a Butlerian Jihad is about as likely as a resumption of the Punic Wars. Even radical neoprimitivists who think we all ought to go back to hunting and gathering rely on websites and podcasts to get their message out. Still, Herbert may turn out to be a prophet after all; there’s a real chance that we may find ourselves backing into a Butlerian future without intending anything of the kind. The reasons I have in mind have nothing to do with the romanticism of which people who question today’s technological triumphalism are so often accused. Rather, they’re a matter of cold hard economics.

The modern faith in progress has its blind spots, and one of the most pervasive is the tendency to believe that the present arrangement of society is somehow inevitable, the natural result of all those centuries of progress. It seems inevitable, for example, that every culture will end up relying on machines rather than people for tasks like data processing, simply because that’s the way we do things. Behind the grand facade of progress, though, lies a simple economic fact: in an age of abundant fossil fuels, it’s cheaper – a lot cheaper – to build and power a machine to do something than it is to train and employ a human being to do the same thing. As long as that equation holds, the only limit on how many people get replaced by machines is the sophistication of the machines themselves, and the same equation drives technological advance: because machines do things at lower cost than people, investment in new technology tends to pay for itself. The last three centuries of the western world’s history show what happens when this process goes into high gear.

The whole process depends, though, on having a cheap, abundant source of mechanical and electrical energy. For the last three centuries, fossil fuels have provided that, but the lesson of peak oil – and the wider context of resource depletion and ecosystem damage driving the rising spiral of crises that besets industrial civilization today – is that this was a temporary situation, made possible only because human beings found and exploited huge but finite reserves of cheap energy in the earth’s crust. Everything based on that fact is subject to change – including the equation that makes machine labor cheaper than human labor.

In a world where fossil fuels are expensive and scarce, in fact, the equation works the other way. Modern machines require very specialized and resource-intensive inputs of energy and materials, and if those aren’t available within tight specifications, the machines don’t work. Human beings, by contrast, can be kept happy and productive with very simple, generally available resources – food, drink, warmth, shelter, companionship, and mental stimulation – that all have wide tolerances and a great deal of room for substitution. In a society that has to operate within the energy budget provided by renewable resources, these things are much less challenging to provide than the pure, concentrated, and precisely controlled inputs needed by complex machines. This is why the steam turbine, invented in antiquity by Hero of Alexandria, remained a philosopher’s toy, and why the brilliant mechanical inventions of medieval China never caused the social and economic transformations the industrial revolution launched in the modern west. Machines existed, but without the energy resources to power them – or, more exactly, without the realization that coal can be turned into mechanical energy if you have the right machine – human labor was more economical, and so the machines languished.

Nor is the sort of exotic labor performed by Herbert’s mentats and Bene Gesserit sisters purely a matter of science fiction. Medieval European scholars and savants practiced mental disciplines such as the art of memory, which allowed a trained person to accurately store and recall prodigious amounts of data, and the Lullian art, an algebra of concepts used to process information. In Asia, practitioners of yoga and the martial arts have evolved physical disciplines that bring exceptional feats within reach of those who master them. Certain traditions of East and West, most of them affiliated with mystical religious teachings, have put together comprehensive training systems meant to develop human capacities to their highest pitch. All these are potential resources for societies of the deindustrial future.

The ideologies of the industrial age either devalued human potential in favor of the possibilities opened up by fossil-fuel-powered machines, or reacted against this sort of thinking by glorifying whatever human beings could do that the machines of any given time couldn’t. The 19th-century clash between industrial triumphalism and its Romantic opposition still defines most of the terms in which we think of machines, human beings, and their interactions today. Herbert’s imagination leapt beyond that clash to offer a glimpse of what human beings might be capable of, if we pursued human potential with as much enthusiasm as today’s engineers bring to pushing the limits of machines. In a world where energy-intensive high technologies may not be supportable for all that much longer, it’s a glimpse well worth thinking about.