The third stage of the process of collapse, following what
I’ve called the eras of pretense and impact, is the era of response. It’s easy
to misunderstand what this involves, because both of the previous eras have
their own kinds of response to whatever is driving the collapse; it’s just that
those kinds of response are, more precisely, nonresponses: attempts to make the
crisis go away without addressing any of the things that are making it happen.
If you want a first-rate example of the standard nonresponse
of the era of pretense, you’ll find one in the sunny streets of Miami, Florida
right now. As a result of global climate change, sea level has gone up and the
Gulf Stream has slowed down. One consequence is that these days, whenever Miami
gets a high tide combined with a stiff onshore wind, salt water comes boiling
up through the storm sewers of the city all over the low-lying parts of town.
The response of the Florida state government has been to issue an order to all
state employees that they’re not allowed to utter the phrase “climate change.”
That sort of thing is standard practice in an astonishing
range of subjects in America these days. Consider the role that the
essentially nonexistent recovery from the housing-bubble crash of 2008-9 has
played in political rhetoric since that time. The current inmate of the White
House has been insisting through most of two terms that happy days are here
again, and the usual reams of doctored statistics have been churned out in an
effort to convince people who know better that they’re just imagining that
something is wrong with the economy. We can expect to hear that same claim made
in increasingly loud and confident tones right up until the day the bottom
finally drops out.
With the end of the era of pretense and the arrival of the
era of impact comes a distinct shift in the standard mode of nonresponse, which
can be used quite neatly to time the transition from one era to another. Where
the nonresponses of the era of pretense insist that there’s nothing wrong and
nobody has to do anything outside the realm of business as usual, the
nonresponses of the era of impact claim just as forcefully that whatever’s gone
wrong is a temporary difficulty and everything will be fine if we all unite to
do even more of whatever activity defines business as usual. That this normally
amounts to doing more of whatever made the crisis happen in the first place,
and thus reliably makes things worse, is just one of the little ironies history
has to offer.
What unites the era of pretense with the era of impact is
the unshaken belief that in the final analysis, there’s nothing essentially
wrong with the existing order of things. Whatever little difficulties may show
up from time to time may be ignored as irrelevant or talked out of existence,
or they may have to be shoved aside by some concerted effort, but it’s
inconceivable to most people in these two eras that the existing order of
things is itself the source of society’s problems, and has to be changed in
some way that goes beyond the cosmetic dimension. When the inconceivable
becomes inescapable, in turn, the second phase gives way to the third, and the
era of response has arrived.
This doesn’t mean that everyone comes to grips with the real
issues, and buckles down to the hard work that will be needed to rebuild
society on a sounder footing. Winston Churchill once noted with his customary
wry humor that the American people can be counted on to do the right thing,
once they have exhausted every other possibility. He was of course quite
correct, but the same rule can be applied with equal validity to every other
nation this side of Utopia, too. The era of response, in practice, generally
consists of a desperate attempt to find something that will solve the crisis du
jour, other than the one thing that everyone knows will solve the crisis du
jour but nobody wants to do.
Let’s return to the two examples we’ve been following so
far, the outbreak of the Great Depression and the coming of the French
Revolution. In the aftermath of the 1929 stock market crash, once the initial
impact was over and the “sucker’s rally” of early 1930 had come and gone, the
federal government and the various power centers and pressure groups that
struggled for influence within its capacious frame were united in pursuit of a
single goal: finding a way to restore prosperity without doing either of the
things that had to be done in order to restore prosperity. That task occupied the best minds in the US elite
from the summer of 1930 straight through until April of 1933, and the mere fact
that their attempts to accomplish this impossibility proved to be a wretched
failure shouldn’t blind anyone to the Herculean efforts that were involved in
the attempt.
The first of the two things that had to be tackled in order
to restore prosperity was to do something about the drastic imbalance in the
distribution of income in the United States. As noted in previous posts, an
economy dependent on consumer expenditures can’t thrive unless consumers have
plenty of money to spend, and in the United States in the late 1920s, they
didn’t—well, except for the very modest number of those who belonged to the
narrow circles of the well-to-do. It’s not often recalled these days just how
ghastly the slums of urban America were in 1929, or how many rural Americans
lived in squalid one-room shacks of the sort you pretty much have to travel to
the Third World to see these days. Labor unions and strikes were illegal in
1920s America; concepts such as a minimum wage, sick pay, and health benefits
didn’t exist, and the legal system was slanted savagely against the poor.
You can’t build prosperity in a consumer society when a good
half of your citizenry can’t afford more than the basic necessities of life.
That’s the predicament that America found clamped to the tender parts of its
economic anatomy at the end of the 1920s. In that decade, as in our time, the
temporary solution was to inflate a vast speculative bubble, under the
endearing delusion that this would flood the economy with enough unearned cash
to make the lack of earned income moot. That worked over the short term and
then blew up spectacularly, since a speculative bubble is simply a Ponzi scheme
that the legal authorities refuse to prosecute as such, and inevitably ends the
same way.
There were, of course, effective solutions to the problem of
inadequate consumer income. They were exactly those measures that were taken
once the era of response gave way to the era of breakdown; everyone knew what
they were, and nobody with access to political or economic power was willing to
see them put into effect, because those measures would require a modest decline
in the relative wealth and political dominance of the rich as compared to
everyone else. Thus, as usually happens, they were postponed until the arrival
of the era of breakdown made it impossible to avoid them any longer.
The second thing that had to be changed in order to restore
prosperity was even more explosive, and I’m quite certain that some of my
readers will screech like banshees the moment I mention it. The United States
in 1929 had a precious metal-backed currency in the most literal sense of the
term. Paper bills in those days were quite literally receipts for a certain
quantity of gold—1.5 grams, for much of the time the US spent on the gold
standard. That sort of arrangement was standard in most of the world’s
industrial nations; it was backed by a dogmatic orthodoxy all but universal
among respectable economists; and it was strangling the US economy.
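For readers who want to check that figure, a quick back-of-the-envelope calculation using numbers the essay doesn't spell out but which are standard history: from 1879 until 1933 the official US price of gold was $20.67 per troy ounce, and a troy ounce is about 31.1 grams, so each dollar was a claim on roughly

\[ \frac{31.1\ \text{g per troy oz}}{\$20.67\ \text{per troy oz}} \approx 1.5\ \text{g of gold per dollar.} \]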
It’s fashionable among certain sects on the economic fringes
these days to look back on the era of the gold standard as a kind of economic
Utopia in which there were no booms and busts, just a warm sunny landscape of
stability and prosperity until the wicked witches of the Federal Reserve came
along and spoiled it all. That claim flies in the face of economic history.
During the entire period that the United States was on the gold standard, from
1873 to 1933, the US economy was a moonscape cratered by more than a dozen
significant depressions. There’s a reason for that, and it’s relevant to our
current situation—in a backhanded manner, admittedly.
Money, let us please remember, is not wealth. It’s a system
of arbitrary tokens that represent real wealth—that is, actual, nonfinancial
goods and services. Every society produces a certain amount of real wealth each
year, and those societies that use money thus need to have enough money in
circulation to more or less correspond to the annual supply of real wealth. That
sounds simple; in practice, though, it’s anything but. Nowadays, for example,
the amount of real wealth being produced in the United States each year is
contracting steadily as more and more of the nation’s economic output has to be
diverted into the task of keeping it supplied with fossil fuels. That’s
happening, in turn, because of the limits to growth—the awkward but inescapable
reality that you can’t extract infinite resources, or dump limitless wastes, on
a finite planet.
The gimmick currently being used to keep fossil fuel
extraction funded and cover the costs of the rising impact of environmental
disruptions, without cutting into a culture of extravagance that only cheap
abundant fossil fuel and a mostly intact biosphere can support, is to increase
the money supply ad infinitum. That’s become the bedrock of US economic policy
since the 2008-9 crash. It’s not a gimmick with a long shelf life; as the
mismatch between real wealth and the money supply balloons, distortions and
discontinuities are surging out through the crawlspaces of our economic life,
and crisis is the most likely outcome.
In the United States in the first half or so of the
twentieth century, by contrast, the amount of real wealth being produced each
year soared, largely because of the steady increases in fossil fuel energy
being applied to every sphere of life. While the nation was on the gold
standard, though, the total supply of money could only grow as fast as gold
could be mined out of the ground, which wasn’t even close to fast enough. So
you had more goods and services being produced than there was money to pay for
them; people who wanted goods and services couldn’t buy them because there
wasn’t enough money to go around; business that wanted to expand and hire
workers were unable to do so for the same reason. The result was that moonscape
of economic disasters I mentioned a moment ago.
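The mechanism in that last paragraph can be restated with the textbook equation of exchange, which the essay itself doesn't invoke but which makes the arithmetic of a money shortage explicit:

\[ M V = P Q \]

Here \(M\) is the money supply, \(V\) the velocity with which it circulates, \(P\) the price level, and \(Q\) real output. With \(M\) pinned to the stock of mined gold and \(V\) roughly constant, a \(Q\) that keeps growing on cheap fossil fuel energy forces \(P\) downward: chronic deflation, punctuated by the busts just described. That falling \(P\) is also the "stable or falling prices" the next paragraph credits the rentier class with counting on.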
The necessary response at that time was to go off the gold
standard. Nobody in power wanted to do this, partly because of the dogmatic
economic orthodoxy noted earlier, and partly because a money shortage paid
substantial benefits to those who had guaranteed access to money. The rentier
class—those people who lived off income from their investments—could count on
stable or falling prices as long as the gold standard stayed in place, and the
mere fact that the same stable or falling prices meant low wages, massive
unemployment, and widespread destitution troubled them not at all. Since the
rentier class included the vast majority of the US economic and political
elite, in turn, going off the gold standard was unthinkable until it became
unavoidable.
The period of the French Revolution from the fall of the
Bastille in 1789 to the election of the National Convention in 1792 was a
period of the same kind, though driven by different forces. Here the great
problem was how to replace the Old Regime—not just the French monarchy, but the
entire lumbering mass of political, economic, and social laws, customs, forms,
and institutions that France had inherited from the Middle Ages and never quite
gotten around to adapting to drastically changed conditions—with something that
would actually work. It’s among the more interesting features of the resulting
era of response that nearly every detail differed from the American example
just outlined, and yet the results were remarkably similar.
Thus the leaders of the National Assembly who suddenly
became the new rulers of France in the summer of 1789 had no desire whatsoever
to retain the traditional economic arrangements that gave France’s former
elites their stranglehold on an oversized share of the nation’s wealth. The
abolition of manorial rights that summer, together with the explosive rural
uprisings against feudal landlords and their chateaux in the wake of the
Bastille’s fall, gutted the feudal system and left most of its former
beneficiaries the choice between fleeing into exile and trying to find some way
to make ends meet in a society that had no particular market for used
aristocrats. The problem faced by the National Assembly wasn’t that of prying
the dead fingers of a failed system off the nation’s throat; it was that of
trying to find some other basis for national unity and effective government.
It’s a surprisingly difficult challenge. Those of my readers
who know their way around current events will already have guessed that an
attempt was made to establish a copy of whatever system was most fashionable
among liberals at the time, and that this attempt turned out to be an abject
failure. What’s more, they’ll have been quite correct. The National Assembly
moved to establish a constitutional monarchy along British lines, bring in
British economic institutions, and the like; it was all very popular among
liberal circles in France and, naturally, in Britain as well, and it flopped.
Those who recall the outcome of the attempt to turn Iraq into a nice
pseudo-American democracy in the wake of the US invasion will have a tolerably
good sense of how the project unraveled.
One of the unwelcome but reliable facts of history is that
democracy doesn’t transplant well. It thrives only where it grows up naturally,
out of the civil institutions and social habits of a people; when liberal
intellectuals try to impose it on a nation that hasn’t evolved the necessary foundations
for it, the results are pretty much always a disaster. The latter was the
situation in France at the time of the Revolution. What happened
thereafter is what almost always happens
to a failed democratic experiment: a period of chaos, followed by the rise of a
talented despot who’s smart and ruthless enough to impose order on a chaotic
situation and allow new, pragmatic institutions to emerge to replace those
destroyed by clueless democratic idealists. In many cases, though by no means
all, those pragmatic institutions have ended up providing a bridge to a future
democracy, but that’s another matter.
Here again, those of my readers who have been paying
attention to current events already know this; the collapse of the Soviet Union
was followed in classic form by a failed democracy, a period of chaos, and the
rise of a talented despot. It’s a curious detail of history that the despots in
question are often rather short. Russia has had the great good fortune to find,
as its despot du jour, a canny realist who has successfully brought it back
from the brink of collapse and reestablished it as a major power with a body
count considerably smaller than usual. France was rather less fortunate; the
despot it found, Napoleon Bonaparte, turned out to be a megalomaniac with an
Alexander the Great complex who proceeded to plunge Europe into a quarter
century of cataclysmic war. Mind you, things could have been even worse; when
Germany ended up in a similar situation, what it got was Adolf Hitler.
Charismatic strongmen are a standard endpoint for the era of
response, but they properly belong to the era that follows, the era of
breakdown, which will be discussed next week. What I want to explore here is
how an era of response might work out in the future immediately before us, as
the United States topples from its increasingly unsteady imperial perch and
industrial civilization as a whole slams face-first into the limits to growth.
The examples just cited outline the two most common patterns by which the era
of response works itself out. In the first pattern, the old elite retains its
grip on power, and fumbles around with increasing desperation for a response to
the crisis. In the second, the old elite is shoved aside, and the new holders
of power are left floundering in a political vacuum.
We could see either pattern in the United States. For what
it’s worth, I suspect the latter is the more likely option; the spreading
crisis of legitimacy that grips the country these days is exactly the sort of
thing you saw in France before the Revolution, and in any number of other
countries in the few decades just prior to revolutionary political and social
change. Every time a government tries to cope with a crisis by claiming that it
doesn’t exist, every time some member of the well-to-do tries to dismiss the
collective burdens their culture of executive kleptocracy imposes on the country
by flinging abuse at critics, every time institutions that claim to uphold the
rule of law defend the rule of entrenched privilege instead, the United States
takes another step closer to the revolutionary abyss.
I use that last word advisedly. It’s a common superstition
in every troubled age that any change must be for the better—that the overthrow
of a bad system must by definition lead to the establishment of a better one.
This simply isn’t true. The vast majority of revolutions have established
governments that were far more abusive than the ones they replaced. The
exceptions have generally been those that brought about a social upheaval without
wrecking the political system: where, for example, an election rather than a
coup d’etat or a mass rising put the revolutionaries in power, and the
political institutions of an earlier time remained in place with only such
reshaping as new necessities required.