You might not expect it from a college town where public nudity is legal and the Republican Party hasn’t had a significant presence in thirty years, but the Fourth of July is a big deal in the small Northwest town where I live. This year’s parade was grand as always, with enough marching bands and colorful floats to satisfy anyone’s taste for pageantry, and a couple of jet fighters from the local Air National Guard base added an unexpected note of realism to this celebration of a revolutionary war by carrying out mock strafing runs over the crowd. As I type these words, barbecue smoke is rising from backyards all over town, and a few blocks east of my home the technicians are putting finishing touches on the evening’s fireworks display.
There’s a certain comforting solidity in a festival that’s been celebrated in the same way since well before I was born. If this year’s celebration is shot through with worries about the future and a bitter ambivalence about martial symbolism during an unpopular war, the same things were true of the Julys of my childhood, when the undeclared war du jour was in Vietnam rather than Iraq and a different generation of demagogues mouthed the same slogans that fill the talk shows today. Still, with the ten volumes of Toynbee’s A Study of History weighing down the shelf above me like the headstones of dead civilizations, it’s hard not to remember that the apparent solidity of old customs in a changing world can turn out to be as deceptively fragile as lake ice in the springtime.
In last week’s Archdruid Report post I suggested that from the point of view of history, nations are fluid and fragile things, and plans for the future that take their stability for granted are likely to end up in history’s recycling bin sooner than most people expect. In many parts of the world this observation would not even have to be made, since borders have changed and nations have appeared and disappeared within living memory. Here in America, by contrast, it hasn’t even begun to find its way into the national conversation about the future.
The closest thing you’ll find to it is the suggestion, kept alive by the memory of our nation’s only civil war so far, and trotted out now and again for shock value, that the United States might someday split up into two or more still recognizably American nations. The possibility that the current borders of the United States might be the high-water mark of an American continental empire, one whose tide is already turning from flow to ebb, remains all but unnoticed. The possibility that a century from now the United States might be a much smaller nation with no bigger role in international affairs than, say, Italy, is practically unthinkable. History shows that this sort of change happens all the time, but it seems very hard for Americans to apply a historical perspective of this kind to their own national community.
There’s a complex history behind this notion of American exceptionalism. Before the Mayflower brought the first shipload of Puritan refugees across the Atlantic, the idea that European settlement in the New World might be exempt from the sordid details of the Old World’s history was already in the air. The first Rosicrucian manifesto, which ignited a continent-wide furor on its publication in 1614, promised cryptically that “there shall be a door opened to Europe” and that “Europe is with child and will bring forth a strong child”; the German Rosicrucian sects who settled in Pennsylvania in the late 17th and early 18th centuries were far from the only people to apply these prophecies to the newly discovered lands across the ocean.
These habits of thought were sealed into place when the revolutionaries of 1776 chose to define their struggle for self-determination against English colonialism in the radical language of the Declaration of Independence. The ideals enshrined in the document whose signing we celebrate today had already been put into circulation during the English revolution more than a century before, and had gone nowhere; the restoration of Charles II to the British throne in 1660 drew a line under Britain’s brief experiment with republican government. When those same ideals became the foundation of a lasting political settlement on American soil, the idea that America might go its own way, unburdened by the Old World’s troubles, became an article of faith for many Americans.
The problem with this comforting faith, of course, was that history failed to play along with it. It’s one thing to talk of westward expansion and manifest destiny, and quite another to come to terms with the wars of conquest and extermination that cleared the continent for America’s growth. Equally, it’s one thing to discuss the Monroe Doctrine as a matter of guaranteeing the freedom of the New World against the Old, and another to notice that in practice, too often, this amounted to a guarantee that the United States and not some other power would force its will on the nations of Latin America. Like every other country on earth, the United States faced its share of conflicts between its own ideals, on the one hand, and the demands of power, prosperity, and survival in the brutal world of international politics, on the other; and like every other country on earth, the United States made its share of wretchedly bad decisions in response.
When a gap opens up between ideals and reality, the result is what psychologists call cognitive dissonance, and America has a very bad case of it. A great deal of American political debate over the last half century or so has accordingly focused on trying to relieve the cognitive dissonance of America’s inevitable failure to live up to the high ideals on which it was founded. On the one hand, mostly but not exclusively on the political and social right, you can find loud claims that America’s moral failures either didn’t happen or don’t count, and that the ideals ought to be taken as a description of the way the United States actually behaves in the world. On the other hand, mostly but not exclusively on the political and social left, you can find equally loud claims that America’s moral failures not only cancel out anything worthwhile our nation has done, but prove that the ideals themselves are a sham.
These two claims, the Tweedledum and Tweedledee of American political rhetoric, have become so pervasive these days that very few people seem to notice either the bankrupt logic that drives them both or the disastrous disconnection from reality that they both foster. The reasoning they share is the logic of Utopia, the claim that the right political, economic, or social system can make people behave like angels, and that anything less is therefore unacceptable. Conservatives who want to say that the American system works well thus end up arguing that it’s perfect, and radicals who want to point out that it has problems thus end up denouncing it as evil incarnate. Once the logic of Utopia enters the picture, the possibility of middle ground vanishes, and with it goes the potential for the compromise and cooperation that the founders of the American political system, pragmatists that they were, saw as essential.
This is disastrous enough, but to my way of thinking, the second consequence of the flight from cognitive dissonance may prove to be much worse as America stumbles into an age in which cheap fossil fuels become a thing of the past. To insist that America is by definition the world’s best society, inevitably destined to triumph in its disinterested pursuit of democracy around the globe, is to give up citizenship in the real world and move to a country as imaginary as Oz. To insist on the precise opposite of these claims is to do exactly the same thing. Neither set of beliefs would provide anything in the way of useful guidance even if America and the world could count on relative stability over the next century or so. In a world facing a long and difficult transition to sustainability on the far side of Hubbert’s peak, trying to impose the geography of imaginary countries on the real world will most likely prove suicidal.
It probably needs to be said that worshiping America as Utopia is not the same thing as patriotism. It’s not even a useful substitute. Apply the same logic to marriage – which is, after all, simply another form of social organization – and the fallacy becomes plain. Insist that your own marriage is already the best possible marriage, that its problems either don’t exist or don’t matter, and that for that reason there’s no need to discuss making changes, and drastic marital problems are pretty much a given. Respond to social troubles in the same way and you make an explosion inevitable. Equally, though, if you insist that your own marriage is so uniquely bad that marrying anybody else at all would be an improvement, your chances of marital bliss are no better, and the same is just as true in the world of politics; those who insist that American empire is the worst of all worlds might want to consider what would have happened if the Soviet Union had won the Cold War.
What makes all this pursuit of imaginary countries embarrassing from the historical point of view is that of all the revolutionary ideologies of the last few centuries, the one that shaped America’s institutions is among the least congenial to extreme claims. You’ll find few political documents in all of history as full of caution, compromise, and wiggle room for necessary change as the American Constitution, precisely because the unlikely radicals in Philadelphia whose act we celebrate today were profoundly aware of the power of political passions and the fallibility of institutions. That very fact makes their handiwork all the more relevant in a future when caution, compromise, and wiggle room will be desperately needed. The fact that the system they designed was also crafted to fit the measured pace of transport and communications in an age before fossil fuels points to another reason why the old pragmatic rules of the founding fathers may turn out to be more relevant to the real world of tomorrow than to the imaginary worlds of today’s political rhetoric.