Producing Democracy

Last week's post here on The Archdruid Report attempted to raise a question that, as I see it, deserves much more discussion than it gets these days.  Most currently popular ways of trying to put pressure on the American political system presuppose that the politicians will pay attention if the people, or the activists who claim to speak in their name, simply make enough noise. The difficulty is that the activists, or for that matter the people, aren’t actually giving the politicians any reason to pay attention; they’re simply making noise, and the politicians have gotten increasingly confident that the noise can be ignored with impunity.

That’s what’s implied by saying that protest marches, petitions, letter-writing campaigns, and the like consume democracy.  These standard modes of activism only work if the officials who make decisions have some reason to think that the activists can follow through with some more effective action, such as a serious challenge in the next election, if they’re ignored.  It’s those other actions that produce democracy or, in less metaphoric terms, convince elected officials that ignoring the activists could put their careers at risk.

What sets democracy apart from most other systems of government is that it gives citizens a peaceful way to make good on such threats.  This is a huge advantage, for the most pragmatic of reasons.  In autocratic societies, where the populace has no way to get rid of inept officials short of revolution, a vast amount of administrative idiocy and incompetence goes unpunished. The notion that autocracies are by definition more competent than democracies is quite simply untrue; the annals of autocratic states such as ancien régime France and Communist Russia are packed with examples of the most egregious incompetence—and no, despite the slogans, Mussolini didn’t make the trains run on time, either.  It’s simply easier to cover up governmental stupidity in an autocracy, since there aren’t media independent of government control to spread the word and embarrass the powers that be.

Yet that advantage, again, depends on the ability of citizens to vote the rascals out when they deserve it. In today’s America, that ability is little more than theoretical.  I’ve discussed in a number of posts already how what was once one of the world’s liveliest and most robust democratic systems has lapsed into a sham democracy uncomfortably reminiscent of the old Eastern Bloc states, where everyone had the right to vote for a preselected list of candidates who all supported the same things. The reasons for that decay are complex, and again, I’ve talked about them in detail already. What I want to address this week is what might be done about it—and that requires a second look at the countervailing forces that were once hardwired into the grassroots level of the American system.

A thought experiment might help clarify the issues here.  Imagine, dear reader, that early next year you hear that a couple of legislators and popular media figures in your state are talking about forming a new political party that will offer a meaningful alternative to the stalemate in Washington DC.  The two major parties ignore them, but by early 2014 the new party is more or less in existence, and candidates under its banner are running for Congress and a range of state offices.  The morning after the 2014 election, Republicans and Democrats across the nation wake up to discover that they are going to have to deal with a significant third-party presence in Congress and a handful of state governments controlled lock, stock and barrel by the new party.

The two years leading up to the 2016 election pass by in a flurry of political activity as the old parties struggle to regain their joint monopoly over the political process and the new party scrambles to build a national presence.  In 2016, the new party nominates its first presidential candidate, a longtime activist and public figure.  The campaign faces an uphill fight, and loses heavily; some of the new party’s people in Congress and state governments are ousted as well. Pundits insist that it was all a flash in the pan, but they’re discomfited in the midterm elections in 2018 when the new party scores a major upset, winning a majority in the House of Representatives and nearly doubling its Senate presence.

The new party’s gains strain the existing structure of American partisan politics to the breaking point. As the 2020 election nears, the Democratic Party, outflanked and marginalized by the rising new party, disintegrates in internecine feuding and fails to field a presidential candidate at all.  The Republican Party breaks in two, with Tea Party and country-club Republicans holding competing conventions and nominating separate presidential tickets.  Yet another new party springs up, composed mostly of old guard politicians from what used to be the middle of the road, and nominates its own candidate. Under the winner-take-all rules by which nearly every state chooses its electors, whatever party gets the most votes in a state wins that state’s votes in the electoral college—and so the new party, by winning a plurality of the popular vote in just enough states to matter, sees its candidate raising his hand on January 20, 2021 to take the oath of office as the nation’s next president.

Suggest a scenario of that kind to most Americans today and they’ll dismiss it as impossible. That’s all the more curious, in that every detail of the thought experiment I’ve just sketched out is drawn from an earlier period in American history.  The years in question ran from 1854 to 1860, and the new party was the Republican Party; the Whigs were the party that imploded, the Democrats the party that split in two, the short-lived fourth party was the Constitutional Union Party and, of course, the tall figure standing up there taking the oath of office in 1861 was Abraham Lincoln.

Yet it’s true that an upset of the same kind would be much more difficult to pull off today.  Several different factors combine to make that the case, but to my mind, the most important of them is the simple and awkward fact that the skills that would be needed to make it happen are no longer to be found among activists or, for that matter, American citizens in general.  Organizing a new political party, building up a constituency on a national level, and making the machinery of democracy turn over in response all require the pragmatic application of certain learned and learnable skill sets, which most people in America today do not know and are by and large uninterested in learning.  There are, broadly speaking, three such skill sets, and we’ll take them one at a time.

The first can’t be discussed without opening an industrial-sized can of worms, one that will take the rest of this post and still leave ends wriggling in all directions, but that can’t be helped. I’d like to start the can opener going with one of the odder conversations that spun off from last week’s post.  My regular readers will remember that one of the core themes of that post was the suggestion that, though democratic systems are routinely corrupt and suffer from a galaxy of other pervasive problems, they generally provide more civil rights and more consistent access to due process to their citizens than do autocratic systems, and that this is true even in a nonindustrial setting.

One of my readers took heated exception to this claim, and insisted that preindustrial America was no better than any other country of the time. It’s the debate that followed, though, that brought out the detail I want to emphasize.  To defend his counterclaim, my reader talked about the current US prison system, the evils of intellectual property rights, and a flurry of other issues irrelevant to the point at hand, ending up with a claim that since Adolf Hitler was elected Chancellor of Germany in 1933 (appointed, in point of fact, but let that pass), Nazi Germany was a democracy and democracy was therefore bad.  None of his arguments had any bearing on whether citizens of the United States in its preindustrial days—not, please note, all its residents, much less people outside its borders; democracies abuse noncitizens more or less as often as other forms of government do, which is why I specified citizens in my distinctly lukewarm words of praise—had more civil rights and more reliable access to due process than citizens of autocratic countries during the same period.

It’s a matter of historical record that in 1800, say, when the United States was still almost wholly agrarian, an American citizen could stand up in a public meeting anywhere in the country, say that President Adams was a liar, a thief, and a scoundrel who should be hounded out of office at the first opportunity, and risk at worst a fine or a short jail term under the Sedition Act, a law so widely loathed that the voters repudiated it at the polls that very year and let it expire—and could then go on to make good on his words by doing his best to hound Adams out of office and put Tom Jefferson in his place.  It’s equally a matter of historical record that making a similar comment in the same year about First Consul Napoleon Bonaparte, Tsar Pavel I, Sultan Selim III, the Jiaqing Emperor, or Shogun Tokugawa Ienari in the respective domains of these autocrats would have resulted in a painful death, or at best a long stay in prison under ghastly conditions, and let’s not even talk about what happened to people who showed any sign of trying to replace these heads of state with some other candidate. That’s a significant difference in civil rights, and it’s what I was talking about, but my attempts to suggest to my reader that he was not addressing my point got answered by increasingly irritable comments insisting that yes, he was.

It finally dawned on me that from his perspective, he was, because the point he thought I was making was something like “democracy is good,” or more exactly that the verbal noise “democracy” ought to be linked with warm fuzzy feelings.  He was insisting in response, more or less, that the verbal noise “democracy” ought to be linked to cold prickly feelings, and his rhetorical strategy—a very common one on the internet these days, as it happens—was simply to attempt to associate various cold prickly feelings with the verbal noise in question, in the hope that enough of the cold prickly feelings would stick to the verbal noise to make his point.  The fact that I might be trying to do something other than linking a verbal noise to an emotional state seemingly did not occur to him.

It’s only fair to point out that he was far from the only person whose response to that post amounted to some equally simplistic emotional linkage. On the other side of the political spectrum, for instance, was a reader who insisted that the United States was not an empire, because empires are bad and the United States is good.  To him, the verbal noise “empire” was linked to cold prickly feelings, and those clashed unbearably with the warm fuzzy feelings he linked to the word “America.” It’s a fine example of the lumpen-Aristotelianism against which Alfred Korzybski contended in vain: A is A and therefore A cannot be not-A, even if A is a poorly chosen hypergeneralization that relates primarily to an emotional state and embraces an assortment of vaguely defined abstractions with no connection between them other than a nearly arbitrary assignment to the same verbal noise.

I don’t bring up these examples because they’re in any way unusual; they’re not. I bring them up because they quite adequately represent most of what passes for political discussion in America today. Examples abound; for one, think of the way the right uses the word “socialist” to mean exactly what the left means by the word “fascist.”  In plain English, either one translates out as “I hate you,” but both can be far more adequately represented by the snarl a baboon makes when it’s baring its teeth in threat. Now of course both words, like “democracy” and “empire,” actually mean something specific, but you won’t find that out by watching their usage these days.

For another—well, I wonder how many of my readers have had, as I have, the experience of attempting to talk about the policies and behavior of a politician when the other person in the conversation insists on reducing everything to personalities. I think of an email exchange I endured a while back, in which my correspondent was trying to convince me that I was wrong to criticize Barack Obama, since he was a nice man, a man of integrity, and so on.  Every issue got dragged back to the man’s personality—or, more precisely, to my correspondent’s impressions of his personality, garnered at third hand from the media. When I brought up the extent to which the Obama administration’s policies copied those of his predecessor, for example, I got a frosty response about how wrong it was to equate Obama and Bush, since they were such different people.  One was, after all, linked with warm fuzzy feelings in my correspondent’s mind, while the other was linked with cold prickly feelings, and A cannot equal not-A.

One way to talk about the point I’m trying to make here is that the great majority of Americans have never learned how to think.  I stress the word “learned” here; thinking is a learned skill, not an innate ability.  The sort of mental activity that’s natural to human beings is exactly the sort of linkage of verbal noises to emotional states and vague abstractions I’ve outlined above. To get beyond that—to figure out whether the verbal noises mean anything, to recognize that an emotional state is not an objective description of the thing that triggers it, and to replace the vague abstractions with clearly defined concepts that illuminate more than they obscure—takes education.

Now of course we have an educational system in the United States. More precisely, we have two of them, a public school system that reliably provides some of the worst education in the industrial world, and a higher education industry that provides little more than job training—and these days, by and large, it’s training for jobs that don’t exist.  You can quite easily pass through both systems with good grades, and never learn how to work through an argument to see if it makes sense or check the credentials of a purported fact. That’s a problem for a galaxy of reasons, but one of them bears directly on the theme of this post, for it’s a matter of historical record, again, that democratic politics work only when the people who have the right to vote—however large or small that class happens to be—also get an education in the basic skills of thinking.

That’s why the first-draft versions of Western democracy emerged in the ancient Mediterranean world, especially but not only in the city-states of Greece, at a time when the replacement of older and far more cumbersome writing systems with alphabets had made literacy a common skill among urban citizens and one of history’s great intellectual revolutions was inventing logic and formal mathematics.  It’s why democratic ideas began to spread explosively through western Europe once education stopped being a monopoly of religious institutions and refocused on the same logical and mathematical principles that sparked an equivalent shift in the ancient Mediterranean, courtesy of the Renaissance and its aftermath.  It’s why the extension of democracy to previously excluded groups in the United States followed, after varying lag times, the extension of public education to these same groups—and it’s also why the collapse of American education in recent decades has been promptly followed by the collapse of American democracy.

It’s common enough to hear claims that American voters of previous generations must have been as poorly equipped in the skills of thinking as their equivalents today.  I would encourage any of my readers who want to make such a claim, or who like to think that the inhabitants of our self-styled information society must inevitably be better at thinking than people of an earlier age, to take the time to read the Lincoln-Douglas debates in their entirety, and then compare them to this year’s presidential debates.  Lincoln and Douglas, remember, were not speaking to a roomful of Ph.D.s; they were campaigning in a hotly contested race for a US Senate seat, in front of audiences of farmers, millworkers, shopkeepers, and craftsmen, the ordinary voters of 1858 Illinois, few of whom had more than an eighth-grade education and many of whom had much less.  It does not speak well for the pretensions of today’s America that its presidential candidates this year pursued their debates on a level that a crowd of Chicago feedlot workers in 1858 would have found embarrassingly simplistic.

That’s among the many reasons why devising a framework for adult education outside the grip of the current American education industry is one of the most pressing needs of the decade or two right ahead of us.  That huge topic, though, is going to require a series of posts all to itself. What I want to stress here is that teaching the electorate to think is not the only challenge here; those of my readers who may be involved in trying to change the direction of contemporary American society on any scale, and for any reason, might find it useful to turn a cold and beady eye upon their own mental processes, and on those of the movements they happen to support.

An extraordinary amount of what passes for argument in today’s activist scene, after all, is exactly the sort of linking of verbal noises with simple emotional reactions, warm and fuzzy or cold and prickly as needed.  Some of this may be coldly cynical manipulation on the part of canny operators pushing the emotional buttons of their intended targets, to be sure, but a cold and cynical manipulator who sees his manipulations failing normally changes tack, and tries to find something that will work better.  That isn’t what we see among activists, though.  Consider the way that the climate change movement went from an apparently unstoppable juggernaut a decade ago to nearly total failure today.  The strategy chosen by the great majority of climate change activists could be adequately described as the mass production of cold pricklies; when the other side in the debate figured out how to counteract that, the activists’ sole response was to shout "Cold prickly! Cold prickly! COLD PRICKLY!!!" as loud as they could, and then wonder why people weren’t listening.

You can’t craft an effective strategy if your mental processes are limited to linking up verbal noises, simple emotional reactions, and vague abstractions.  It really is as simple as that. Until those who hope to have an influence on any level recognize this, they’re not going to have the influence they seek, and America is going to continue stumbling on autopilot toward a wretched end. Once that hurdle is past, the remaining steps are a good deal easier; we’ll get to them next week.

****************
End of the World of the Week #52

Does it make an apocalyptic prophecy more likely to come true if several different visionaries, from different traditions and times, agree on it?  Well, the last months of 1999 were targeted for the end of the world by a flurry of predictions from a diverse cast of would-be prophets.  Hal Lindsey, who, granted, predicted the end of the world every ten years or so all through the latter half of the 20th century, insisted in one of his books that the Second Coming would arrive before the year 2000; so did preacher James Gordon Lindsay (no relation); so did conspiracy theorist Texe Marrs; so did literature released by the Seventh-day Adventists and the Jehovah’s Witnesses; and so, curiously enough, did Timothy Dwight IV, the distinguished Revolutionary War-era educator and theologian.

My regular readers will already know the outcome of the story:  Jesus pulled another no-show.  All of the prophets in question except for Dwight, who had the good common sense to die in 1817, did the usual thing and found new dates on which to anchor their fantasies.

—for more failed end time prophecies, see my book Apocalypse Not