The other day, courtesy of the public library system here in Cumberland, I had the chance to curl up on the couch with a copy of Canadian journalist Jonathan Kay’s survey of American conspiracy theorists, Among the Truthers. I’m sorry to say it was a disappointing read. Kay’s an engaging writer and the book has some good moments, but on the whole it was a depressing reminder of the reasons that the word “journalistic” has become a synonym for “facile and tendentious.”
This is doubly unfortunate, because the issues Kay was trying to raise are worth discussing. Over the last couple of decades, ordinary political discourse in America has been drowned out by a torrent of hatred and rage—there is really no other way to describe it—directed by people on nearly every part of the political spectrum against their perceived opponents. These days it’s become a commonplace of American political culture to insist that the world’s problems result from the deliberate malevolence of some group of people or other, whose intentions and historical role are portrayed with a degree of moral extremism that would make a third-century Gnostic gulp in disbelief. How many times, dear reader, have you seen blog posts and newspaper articles—we don’t even need to bring in the gutterslop that sloshes through talk radio these days—that flatten out the complex ambiguities of industrial civilization’s spiraling crises into an insistence that somebody or other is wrecking the world out of pure personal wickedness? This is the sort of thing I mean.
The bipartisan rise of this sort of hate politics in America, in turn, provides an exact parallel to the rise of the conspiracy theory movement that Kay tried to examine. Insist that George W. Bush was the puppet of a cabal of fascists plotting to conquer the world, or that Barack Obama is a socialist out to reduce Americans to slavery under a global dictatorship, and then it’s hardly a leap to go on to argue that Bush’s handlers masterminded the 9/11 attacks or that Obama is a Muslim illegal immigrant with a forged birth certificate. Argue for either of these latter claims, in turn, and you can use it to bolster your case for the limitless wickedness of your bête du jour.
It would take a book considerably more substantial than Kay’s to sort out the tangled roots of this twin pandemic of hatred and paranoia. For the moment, I want to focus on just one of the many factors involved, both because it’s not usually discussed in this context and because it’s deeply relevant to the project of this blog.
Every June, across America, a couple of million high school seniors go through graduation ceremonies in our nation’s public schools and receive the diploma that, once upon a time, certified that they had completed the general course of education proper to the citizen of a democracy. Nowadays, a sizable fraction of those graduates are functionally illiterate. More than half of them have no real notion how their government works and what the rest of the world is like, and have never had more than a passing glimpse of the major works of art, literature, and music that define America’s cultural heritage. All but a tiny fraction of them have never learned how to reason from premises to a conclusion or check the credentials of a fact.
I’m not at all sure how many of my readers outside the United States have any idea just how bad the state of education has gotten here. A pervasive philosophy of education that reduces its role to that of job training, cultural squabbles that have stripped curriculums of most of their meaningful content, reforms that made the careers of teachers and the finances of districts depend on standardized test scores and thus guaranteed that teaching students how to score high on those tests would be given priority over everything else, budget cuts that always get taken out of classroom expenses rather than administrative costs—well, you can do the math yourself. There are exceptions, but on the whole the public schools in America do a miserably poor job of teaching anything but contempt for learning.
Higher education is a more complex situation but, in some ways, an even more toxic one. Where the public schools trudge implacably onwards under the iron law of bureaucracy, colleges and universities have become an industry, governed by ethics no better than those of any other American business. It’s possible to get a good education from an American university if you’re lucky, smart, and ruthless, but there are significant downsides to the experience. The most important are, first, that the university system is more or less designed to leave you a quarter million dollars or so in debt by the time you finish your degree program, without the option of bankruptcy—federal law exempts student loans from discharge, meaning that the courts can’t wipe them out—and, second, that while the academic industry presents itself as a ticket to high-paying careers, the great majority of college degree programs don’t do anything of the kind. There’s good reason to think that many of the high school seniors who enter university now will never recover financially from the economic burden of paying off their student loans.
No doubt a case could be made, and no doubt it will be made, that the exposure to learning that comes from a college education is worth a lifetime of financial impoverishment. The difficulty with such claims is that the philosophy of education as job training that helped gut America’s public schools has done much the same thing to higher education, even in fields—such as the humanities—that sometimes claim to be exempt from the trend. In most of today’s American universities, despite a certain amount of lip service, humanities programs no longer fulfill their historic role of giving students a broad introduction to humanity’s cultural and intellectual heritage. Their focus instead is on the job training needed by future professors in one or another narrow subspecialty. Departments have to justify their existence in today’s academic industry by maximizing enrollment, however, and this means that degree programs in the humanities not only admit, but actively recruit, far more students every year than are needed to meet the demand for new professors of film studies, postcolonial literature, comparative history of ideas, and the like. That’s the reason why, as the joke goes, the first thing a liberal arts major says when he or she goes to work after graduation is “Would you like fries with that?”
Now factor in the multiple economic impacts of peak oil on a sprawling, dysfunctional collection of government bureaucracies, on the one hand, and a corrupt and rapacious industry totally dependent on abundant credit and government loan guarantees, on the other. At the least, it’s a recipe for the end of American education as it’s currently practiced, and it’s not implausible that, unless something else gets patched together in a hurry, it could mean the end of American education, period.
Like the rest of America’s bureaucracies and industries, education in this country got onto its current trajectory of metastatic growth in the aftermath of the Second World War, when oceans of cheap fossil fuel energy and the considerable benefits of global hegemony made no price tag look too big. When the wave of homegrown fossil fuel production crested in the early 1970s, in turn, Americans—who even then were willing to blame almost anything for their troubles, other than the irritating unwillingness of the laws of physics to give them a limitless supply of energy—decided to double down and bluff, for all the world as though acting out the fantasy that we’d have plenty of energy and resources in the future would force the bluff to turn into reality.
The realization most Americans are frantically trying to stave off just now is that nature has called our bluff. That limitless new supply of energy most of us were waiting for hasn’t appeared, and there are good reasons, founded in the laws of physics, to think that it never will. In the meantime, our decision to double down has left us burdened with, among other things, a public school system and a collection of colleges and universities even more gargantuan and unaffordable than the ones we had before we doubled down, and a psychology of previous investment that all but guarantees that our society will keep on throwing good money after bad until there’s nothing left to throw. Politicians and ordinary people alike have taken to insisting, along these lines, that the solution to joblessness is to send people to college to get job training, on the assumption that this will somehow make jobs appear for them. To call this magical thinking is an insult to honest sorcerers, but it’s likely to be increasingly common in the years to come—at least until the bottom drops out completely.
Well before that happens, a system that’s already largely irrelevant to the needs of the present shows every sign of making itself completely irrelevant to the even more pressing challenges of the future. If anything is going to be salvaged from the wreckage, it’s going to have to be done by individuals who commit themselves to the task on their own time. To make sense of such a project, though, it’s going to be necessary to face a far more basic question: what, exactly, is the point of education?
That’s a far more complex question than it seems, because American culture has spent the last few decades at the far end of a pendulum swing between two sharply different understandings of education—and indeed of human knowledge itself. Call them abstraction and reflection. Abstraction is the view that behind the hubbub and confusion of everyday life lies a fundamental order that can be known by the human mind, and accurately expressed by means of abstract generalizations—E = mc², the law of supply and demand, the theory of evolution, or what have you. In an age dominated by abstraction, knowledge tends to be equated with these abstract generalizations, and education becomes a matter of teaching students to memorize them, apply them, and maybe add to the sum total of known generalizations.
Abstraction tends to predominate when civilizations are expanding. It’s a confident viewpoint, both in its faith that the human mind is capable of knowing the inner workings of the cosmos, and in its claims that its method for generating abstractions is applicable to all subjects and that its particular set of abstract generalizations equate to objective truth. Of course the faith and the claims run into trouble sooner or later; whatever method the civilization uses to determine truth—classical logic in ancient Greece, Christian theology in medieval Europe, experimental science in modern America—eventually ends up in paradox and self-contradiction, and the irreducible cussedness of nature turns the first elegant generalizations into clanking, overburdened theoretical machinery that eventually falls apart of its own weight. Meanwhile Utopian hopes of a society of reason and enlightenment, which partisans of abstraction always seem to cherish, run headlong into the hard realities of human nature: after Athens’ golden age, the Peloponnesian War and the self-destruction of Greek democracy; after the Gothic cathedrals and the great medieval summae, the Black Death and the Hundred Years War; after the brilliant trajectory of science from Galileo to Einstein—well, we’ll be around to see the opening stages of that.
That’s when reflection comes into play. Reflection is the view that recognizes that human ideas of the order of the cosmos are, in the final analysis, just another set of human ideas, and that the hubbub and confusion of everyday life is the only reality we can be sure of. In an age dominated by reflection, Giambattista Vico’s great maxim—“we can truly know only what we make”—takes center stage, and humanity rather than the cosmos becomes the core subject of knowledge. It’s not a knowledge that can be extracted in the form of abstract generalizations, either; it’s a personal, tacit knowledge, a knowledge woven of examples, intuitions, and things felt rather than things defined. From the standpoint of abstraction, of course, this isn’t knowledge at all, but in practical application it works surprisingly well; a sensitivity to circumstances and a memory well stocked with good examples and concrete maxims tend, if anything, to be more useful in the real world than an uncritical reliance on the constructions of current theory.
This is why Greek intellectual culture, with its focus on logic, mathematics, physics, and speculative philosophy, gave way to Roman intellectual culture, which focused instead on literature, history, jurisprudence, and ethical philosophy. It’s also why the culture of the high Middle Ages, with its soaring ambition to understand the cosmos by interpreting religious revelation in the light of reason, gave way to the humaniores litterae—literally, the more human writings—of the Renaissance, which focused attention back on humanity, not as an object under the not-yet-invented microscope, but as a subject capable of knowing and acting in a complex, unpredictable world. It’s by way of reference to those “more human writings” that we still call the characteristic interests of Renaissance culture “the humanities.”
Next week’s post will follow the most recent swing of the pendulum over to the side of abstraction, since that has to be understood in order to sort out what can be saved from contemporary science. Here, though, I want to spare a few moments for the almost completely neglected issue of the value of the humanities in an age of collapse. Modern American culture is so deeply invested in abstraction that the very suggestion that reflection, as I’ve defined it, could have pragmatic value as a way of knowledge seems ludicrous to most people. Still, given that we’ve landed ourselves in the usual trap that comes with overcommitment to abstraction—we can determine beyond a shadow of a doubt what has to be done, and prove that it has to be done, but we’ve become completely incapable of motivating people to do it—a strong case could be made that we need to pay more attention to that aspect of knowledge and culture that deals directly with human existence in its actual context of complexity and rootedness, an aspect that offers no general laws but many practical insights.
There’s another reason why it may be worthwhile to refocus on reflection rather than abstraction in the years ahead of us. As already mentioned, the partisans of abstraction have a hard time finding any value at all in reflection; Plato’s insistence that poets ought to be chucked out the gates of his Republic, John Scotus Erigena’s dismissal of core elements of the humanities because “they do not appear to have to do with the nature of things,” Descartes’ irritable condemnation of literary studies, and the fulminations of today’s scientific pundits against any dimension of human experience that can’t be measured on some kind of dial, all come from this habit of thought. Curiously, though, the reverse is rarely the case. In ages when reflection predominates, the sciences tend to be preserved and transmitted to the future along with the humanities, because the sciences are also products of human thought and culture; they can be studied as much for what they reveal about humanity as for what they reveal about nature. That shift has already been taking place; when the late Carl Sagan spun his compelling “We are star-stuff” myth for the viewers of Cosmos, for example, he was engaging in reflection rather than abstraction. His goal was not to communicate an abstract rule but to weave a narrative of meaning that provided a context within which human life could be lived.
The modern American educational system is by and large the last place on earth to try to pursue or communicate any such vision, whether undergirded by Saganism or some more traditional religion. Equally, though, as I’ve already pointed out, the modern American educational system is very poorly positioned indeed to deal with the impacts of peak oil, and the rest of the smorgasbord of crises the bad decisions of the last few decades have set out for us. The question that remains is what might replace it. What will come after the public schools is already taking shape, in the form of a lively and successful homeschooling movement that routinely turns out the sort of educated young people public schools once did; the replacement for what’s left of America’s once thriving trade schools is less well organized and defined as yet, but is emerging as craftspeople take on apprentices and people interested in a dizzying array of crafts form networks of mutual support. What we don’t yet have is a replacement for what the universities used to offer—some form of organized activity, however decentralized, informal, and inexpensive, that will see to the preservation and transmission of the intellectual heritage of our age.
What form such a thing might take is a challenging question, and one for which I don’t have any immediate answers. Still, it’s an issue that needs to be addressed. The pervasive spread of paranoiac conspiracy theories in contemporary American culture, which I mentioned toward the beginning of this post, is only one of several signs that too many people in this country have never learned how to double-check the validity of their own thinking, either against the principles of logic—a core element of the cultural heritage of abstraction—or against that attentiveness to circumstances and human motives that comes from “more human writings”—a core element of the cultural heritage of reflection. The people who chant “Drill, baby, drill,” as though it’s an incantation that will summon barrels of oil from thin air, are doing just as poor a job of reasoning about the world and reflecting on their motivations as the people who use the unprepossessing individuals teetering on the upper end of our political class as inkblots on which to project their need for scapegoats and their fantasies of absolute evil.
Working out the first rough sketch of a replacement for the American academic industry won’t stop the incantations or the scapegoating any time soon, and arguably won’t stop them at all. Many other forces, as I suggested earlier, drive the contemporary flight from the muddled complexities of civil society into a comic-book world of supervillains whose alleged malignity is so clearly a product of the believer’s need to find someone to blame. Yet the tasks facing those of us who are trying to get ready for the unraveling of industrial America, and the comparable tasks our grandchildren and our grandchildren’s grandchildren will face, will demand plenty of clear, perceptive, well-informed thinking, guided both by abstraction’s useful generalizations and by reflection’s sharpened sensitivities. Doing something to salvage learning, while there’s still a chance to do so, is one potentially crucial way to help that happen.