Originally published at SSRN on January 29, 2019
The theorizing about what has come to be known as the "technological Singularity" holds that human beings can produce a greater-than-human sentience technologically, and that this may come within the lifetimes not only of today's children, but of much or even most of today's adult population as well.1 Much of this theorizing anticipates that the Singularity will be part of a larger trend of accelerating technological change, with advances in computers matched by advances in other areas, like genetic engineering and nano-scale manipulation of matter; or that it will itself be a source of acceleration as superhuman artificial intelligences perform technological feats beyond human capability.2 In either case, as the term "singularity" suggests, the consequences become unpredictable, but a common view is that shortly after the arrival of the big moment, we will see the most fundamental aspects of human existence—birth, growth, aging, senescence, mortality; the limits of an individual consciousness' functioning in space and time; material scarcity and the conditions it sets for the physical survival of human beings—altered so radically that we would become transhumans, on the way to becoming outright posthumans. Those who describe themselves as Singularitarians expect these changes to be not merely profound, but a liberation of the species from the constraints that have cruelly oppressed it since its first appearance on this planet.
All this, of course, is mind-bending stuff. Indeed, no one alive today can really, fully wrap their mind around it, even those most steeped in the idea. Still, the difficulty of the concepts lies not only in the complete alienness of such conditions to our personal and collective experience, but also in their flying in the face of conventional expectations—not least, that even if we have grown used to technology changing constantly, the really important things in life do not, cannot change. Indeed, passive acceptance of the world as it is; a view of it as unchanged, unchanging and unchangeable; and, given that this applies to a great deal that is unquestionably bad, an ironic attitude toward the prospects for human happiness: all of this is commonly equated with "wisdom." And rejection of things as they are, a desire to alter them for the better, a belief that human beings have a right to happiness, are likewise equated not just with the opposite of wisdom, but with the cause of disaster.
This is all, of course, a terribly bleak and unattractive perspective to any but the most complacent of us—and not altogether rational, inconceivable as it is without the idea that the cosmos is somehow metaphysically rigged against human beings. Why should this morbid outlook enjoy such force? Especially in a modern world where it has been proven that life does indeed change? And, frankly, that meaningful improvement in the terms of existence of human beings is achievable?
The Tragic View
One obvious reason is the weight of a very old, long tradition that has deeply suffused just about every human culture, Western culture by no means least. In ancient times, as people began to think about the world around them, they could see that the world was not what human beings would like it to be. Life was poor, nasty, brutish, short—with hunger, illness, violence, insecurity (to say nothing of innumerable lesser hardships and frustrations) utterly saturating their existence. Childbirth was dangerous to all concerned, and few of the children born made it all the way to adulthood. A more settled existence brought somewhat greater affluence and security, but this was only relative, and purchased at a high price in toil and oppression, with daily existence defined by a more regimented routine of labor, and a more stratified society. The many worked for the enrichment of a few—and even the comparative luxury in which the few lived only exempted them from so much. Even those who ate well suffered their share of disease in an age of primitive medicine, and violence too when being part of a ruling caste meant being a warrior. And of course, even the most sheltered existence meant aging and death.
It also seemed to them that there was not much they could do about it. The faintness of their understanding of how the world about them worked, the primitiveness of the means to hand that necessarily followed from this, the dimness of their awareness that knowledge of the world could be applied to the end of enlarging economic productivity, meant that one could picture only so much improvement in the economic output that in hindsight we know to be key to deep or lasting material progress.
The crying out in anguish against all this was the birth of tragedy. One can see it in that oldest of stories, the Epic of Gilgamesh. Gilgamesh, seeing a worm fall out of the nose of his fallen friend Enkidu (the image crops up again and again in the poem), is horrified by the reality of death and sets out in quest of immortality—and all too predictably fails to achieve it, a snake gobbling up the herb that would have let him live forever while he bathes, before he can partake of it.
A very large part of higher culture has been a development of this sensibility. In the Abrahamic religious tradition we have the temptation of Adam and Eve, original sin, the Fall, expulsion from Eden, a punishment compounded by the familiar limits of the "human condition": traumatic birth, a life spent in toil, death.
So it likewise goes in the Classical tradition, where humans, whose Golden Age lies in the past, have their lives spun out, measured and cut by the Fates, and the details of those lives not decided by the Fates determined by the whims of gods intent on keeping them humble.3 (Poseidon could not keep Odysseus from ever getting home—outside his purview, that—but he did see that it was a ten-year odyssey, and did a good many worse things with a good deal less reason; while "wise" Athena was not so far above petty jealousy as to refrain from turning the human who bested her as a weaver into a spider.)
Eventually developed, too, was an element of compensation for all this. Human beings suffer in this world—but if they bow their heads and obey, they will eventually be blessed in this world, or if they don't get so blessed, find something better in another one on the other side of death. And in at least some traditions, human suffering as a whole does end, a Millennium arriving and all well with the world after that.
Still, the connection between good behavior and reward was necessarily fuzzy, and even in those traditions notes of doubt about the rightness of all this are evident. God inflicts on his exceptionally faithful servant Job one horrific suffering after another simply for the sake of a bet with the Devil, and when he has had all he can take (reduced to sitting on a dung heap, scraping his sores with a potsherd), Job cries out "Why?"
Oedipus, approaching his death (in Oedipus at Colonus, the most accomplished but least-read of Sophocles' Theban plays), wonders at the same thing. After all, killing a man who challenged him in that sort of roadside confrontation and marrying a queen were only turned from incidents in a tale of heroic adventure into cosmic crimes by the fact that the man was his father, the woman his mother, both of which details were totally unknown to him—while the whole sequence of events was triggered by his father's attempt to evade punishment for an unspeakable crime of his own by ordering his own son's infanticide. Where was the justice in that?
Of course, no satisfactory answer is forthcoming to such questions. Indeed, to modern, rational eyes, tales like those of Job and Oedipus are about the subjection of human beings through no fault of their own to horrors by the will of arbitrary, cruel gods, whose right to do such things is a simple matter of their having the power to do it and get away with it.
And as it happens, there are also doubts that things really have to be this way. The idea that humans could become like those gods, acquire the power to be like them, and even overthrow them, but that this was forbidden to them and they were slapped down when they tried, cropped up again and again. Gilgamesh, after all, may not have attained his goal but he did come very, very close, only at the very last minute losing the herb that would have made him live forever. In the Garden of Eden the sin of Adam and Eve was to eat of the fruit of the tree of Knowledge of Good and Evil, knowledge which could make them like gods. Zeus begrudged man the gift of fire, and punished man's benefactor Prometheus by chaining him to a peak in the Caucasus and having a vulture tear out and eat his liver, after which it grew back at night so that it could be torn out and eaten again the next day, and the day after that, and the day after that . . . but human beings kept the knowledge of fire nonetheless.
All the same, these admittedly not insignificant details are exceptional, contrary hints, and no more than that, in a narrative that, pretty much always, preached passivity and awe before the majesty of a design beyond our ken.
"KNOW YOUR PLACE!" it all thunders.
And by and large, it was exceedingly rare that anyone carried the thought further. There was, after all, much more emphasis on what had been and what was than on what might be—and little enough basis for thinking about that. Even among the few who had leisure to think and the education and associations to equip them with the best available tools with which to do it, mental horizons were bounded by the crudity of those tools. (Two thousand years ago, syllogisms were cutting-edge stuff.) By the narrowness of personal experience. (Communities were small, movement across even brief distances a privilege of very few, and even that difficult and dangerous, while the "Known World" of which people even heard was a very small thing indeed.) By the slightness of the means of communication. (Illiteracy was the norm; books hand-copied on papyrus and parchment and wax tablets were rare and expensive things; and the more complex and less standardized grammar and spelling, the tendency to use language decoratively rather than descriptively, the roundabout methods of argument and explanation—such as one sees in Socratic dialogue—likely made deep reading comprehension a rarer thing than we realize.) And by the very brevity of life. (Life expectancy was thirty, imposing a fairly early cut-off on how much the vast majority of people could learn, even if they had the means and opportunity.)
Moreover, the conventional ideas enjoyed not only far more powerful cultural sanction than they possess now, through the force of religious belief and custom, but were backed up by all the violence of which society was capable. This was the more so not only because the powerful found it easier to take such views (it is one thing to be philosophical about man's condemnation to hard toil when one is a slave in the fields, another when one is a wealthy patrician who has never done any, sitting on his shaded porch enjoying a cool drink), but because, within the limits of the world as they knew it, their situation was quite congenial to them.
Carroll Quigley, who wrote at some length about the conflict between democratically inclined, socially critical "progressives" and oligarchical "conservatives" in ancient Greece in The Evolution of Civilizations, observed that the latter settled on the idea "that change was evil, superficial, illusory, and fundamentally impossible" as a fundamental of their thought. This applied with particular force to the terms of social existence, like slavery, which they held to be "based on real unchanging differences and not upon accidental or conventional distinctions." Indeed, the object pursued by those who would have changed such things—a redress of material facts—was itself attacked by the associated view that "all material things" were "misleading, illusory, distracting, and not worth seeking."
In short—the world cannot be changed, trying to change it will make life even worse than it is, and anyway, you shouldn't be thinking about the material facts at all. This anti-materialism went hand in hand with a denigration of observation and experiment as ways of testing propositions about the world—an aristocrat's musings held superior to seeing for oneself how things actually stood. With the concrete facts of the world trivialized in this way, the conventional wisdom handed down from the past was that much further beyond challenge (while, of course, this outlook did not exactly further the development of technological capability). Ultimately the promotion of these ideas by the "oligarchs" (and their rejection of the ideas not to their liking), helped by the primitiveness of communication (works the rich would not pay to copy did not endure), was so effective that, as Quigley noted, "the works of the intellectual supporters of the oligarchy, such as Plato, Xenophon, and Cicero" have survived, but "the writings of the Sophists and Ionian scientists," "of Anaxagoras and Epicurus," have not.
In the wake of all this it may be said that philosophy was less about understanding the world (let alone changing it) than accommodating oneself to it—by learning to be better at being passive. One picks up Marcus Aurelius' Meditations, and finds the celebrated work by the famous emperor-philosopher to be . . . a self-help book. And rather than a philosophy concerned with nature or politics, the metaphysics of the thoroughly anti-worldly Plotinus (who taught that the ultimate good lay in one's turning away from the low world of material sense-reality to the higher one of the spirit as the path to ecstatic union with the Divine) was arguably the most influential legacy of the latter part of the Classical era.
The pessimism of the Classical outlook, particularly by way of Plotinus, did much to shape and even blend with the Abrahamic tradition as Jewish and early Christian thinkers used Greek philosophy in interpreting religious texts—while the work of the Greeks endured as the cornerstone of secular learning in its own right. Of course, Classical civilization crumbled in Western Europe. Yet in its way that strengthened rather than weakened the hold of such thinking. Of the major institutions of the Western Roman Empire, only the Church survived and flourished, playing a larger role than ever in the centuries that followed. Meanwhile, amid these "Dark Ages," there was a nostalgia for ancient times, and with it a propensity for exalting their practical achievements as unsurpassed and unsurpassable. The Greeks, the Romans, were held to have all the answers, and it was thought best to consult them rather than try to find out new things, the means for which Classical philosophy, of course, had marginalized. Along with the prevailing religiosity, this whole outlook directed philosophers' attentions away from the workaday world—to the point of being famously satirized as debates over how many angels could dance on the head of a pin. And of course, if reason said one thing and religion another, then one had to accept that religion was right—or face the Inquisition. Unsurprisingly, much energy went into attempts to reconcile the world's very real horrors with the existence of a divine plan authored by an all-good and all-powerful Supreme Being—pessimism about whether things could have been or could be better confusingly passed off as the optimism that this is the "best of all possible worlds."
Tragedy, Modernity and Reaction
So things went until the Renaissance, and the flowering of humanism with it, and the intellectual developments that succeeded it. In the seventeenth century thinkers like Francis Bacon and René Descartes not only explicitly formulated and advocated a scientific method based precisely on the value of study of the material world. They also declared for the object of uncovering all nature's secrets and applying the knowledge to the end of "relieving man's estate." Moreover, such thinking was quickly extended by others to social, economic and political life. Opposing barbarous custom and superstition, they identified and defended the rights all individual human beings enjoy specifically because they are human beings (life, liberty, property), extending to the right to choose their own government—even to rebel when an existing government failed to perform even the bare minimum of its duty (as Thomas Hobbes allowed), or became repressive (as John Locke argued).
In short, the prospect of positive, meaningful, humanly conceived and controlled change was raised explicitly and forcefully by the Scientific Revolution, by liberalism, and by the Enlightenment more generally—and raised successfully. However, that scientific inquiry, applied science and political liberalism flourished in modern times as they did not in the ancient did not mean that they were unopposed. The old ideas never ceased to have their purchase, and certainly vested interests in the modern world (as with Churchmen concerned for their influence and privileges) could not look on such talk with equanimity any more than their forebears did. Conservatives threatened by all this clung to tradition, to the religious institutions that usually sided with it, to the unchanging verities of the "ancients" over the reason and science of the "moderns." The now-obscure English theorist of divine right, Robert Filmer, insisted in Patriarcha that kings were granted by God paternal rights over their peoples, which extended to the power of life or death—and that revolutionary, democratic alternatives were doomed to short, bloody and chaotic lives ending in failure.
Filmer's arguments (which Locke eviscerated in his First Treatise of Government) were belied by the unprecedented peace and prosperity that England enjoyed after the 1688 Revolution. However, conservatives responded to the Enlightenment with the Counter-Enlightenment, identifying reason and change more generally with disaster, stressing the original sin with which Christianity held human beings tainted, and even rejecting the idea of the individual human being as a meaningful unit of social analysis.
Indeed, it became common to oppose to the universalism of the Enlightenment a politics of identity in the manner of Joseph de Maistre (who famously remarked that he had met Frenchmen, Italians, Russians, but knew nothing of "man"), with identity usually defined in terms hostile to progressive ideas. Reason, secularism, democracy were commonly characterized by such thinkers as alien, reflecting the character of another people—typically a character held to be less virtuous, less "pure," less "spiritual" than "our own." If such things worked at all, they said, it was only for those less virtuous, less pure, less spiritual people; certainly they could not work for us, which was assuredly a good thing, as our traditionalism, religiosity, monarchism, serf-lord relationships and the like expressed and sustained a greater wisdom than those Others could ever know, and which importing their ways could only corrupt.
Going hand in hand with this was much romanticizing of the rural, agrarian element of the population as a repository of those older ways—unlike those rootless city types, especially the ones with a modicum of "book learning," which seemed not an altogether good thing. Worst of all in their eyes were those "overeducated" types who, "educated above their intelligence," perhaps defectively born with too much brain and too little body, too little blood, had become alienated from their roots and their natural feelings—internal foreigners. And indeed the visions of reform to which they so often inclined showed, it was said, that while they spoke of the people they did not know, understand or respect them—and that what they needed most of all was some hardship and toil among the lower orders to teach them "the real world." (Thus does it go in Leo Tolstoy's War and Peace, where Pierre Bezukhov goes from Western-educated cosmopolitan intellectual to apostle of the peasant Karataev, who passively accepts whatever life brings until he dies, marching in the French army's prisoner train as it retreats from Russia.)
All this naturally converged with a whole host of prejudices, old and new, exemplified perhaps in Victorian-era theorists of "Aryanism," who identified the conservative, traditionalist stances (an acceptance of the unchanging nature of things, idealism over materialism, etc.) with spiritually superior Aryan cultures, and liberal/radical, modern outlooks with inferior "non-Aryans"—even when they made up their lists of who went under each heading in mutually exclusive ways. Thus one had the absurdity of German and Russian nationalists each insisting that their country was the purest bearer of the Aryan legacy—while the other nation was not Aryan at all.4
Of course, this reaction did not turn back the clock, the Old Regime never returning and those who really wished for such an outcome becoming something of a lunatic fringe, but it still had its successes. Religious, nationalistic, traditionalist, anti-intellectual and "populist" appeals on behalf of the status quo and its greatest beneficiaries helped make the spread of formal democracy a less threatening thing to the contented. Meanwhile, as conscious, explicit belief in specific religious doctrines weakened, what might be called the "religious frame of mind" remained, plainly evident in the phrases of everyday language, like the need "to have faith," the idea that "things are meant to be" or "not meant to be," and of course, that there are "things man is not meant to know." (Faith in what, exactly? If things are "meant" or "not meant," just who—or Who—is doing the "meaning"?)
And of course, as the old guard of throne, aristocracy and established church declined, the bourgeoisie that had once been revolutionary, now enjoying power and privilege, and anxious about the lower orders and the social questions their existence raised, became conservative in its turn, likewise inclining toward that idea of change as "evil, superficial, illusory, and fundamentally impossible," and reason and its prescriptions as a thing best not pushed "too far." It was less of a stretch than might be imagined—the bourgeois outlook, after all, being focused on the individual, specifically an ethic of individual success-striving and individual moral responsibility, the existing social frame so utterly taken for granted that it did not seem to exist for them at all. (Indeed, as Margaret Thatcher made clear, at times their politics has explicitly denied that it does.)
Unsurprisingly, those favoring constancy over change found more rationalistic-seeming supports for their outlook. The dark view of radical social change taken by the French Revolution's enemies, which identified that Revolution not with the Declaration of the Rights of Man or the abolition of feudal oppressions but with guillotines, Napoleon and Restoration, colored the popular imagination of the event—driving home the idea that if revolution was not a crime against God, then it was still bloody, messy and doomed to failure.
And this was not simply founded on a vague sense of society's machinery as a complex thing not easily tinkered with, or insecurity about whether the state of the art in social engineering is up to such a challenge (the position of a Karl Popper, for example), but on a whole host of newer rationales for the unchangeable nature of the world, the ineradicability of the evils in it, the obvious implications for human happiness, and the wisdom of letting everything alone. Like Malthusian scarcity, which attributed poverty and its attendant miseries not to economic or social systems, but to "the passion between the sexes." And its extension in the Social Darwinism of Herbert Spencer, in which perhaps God did not assign different people different stations, but Nature did in its uneven distribution of "fitness," and the still more uneven rewards accruing to it. Or the Nietzschean will-to-power. Or Freudian psychoanalysis, which declared the repression of basic human drives (the pursuit of sex, the aversion to work as we commonly think of it) essential to civilized life.
Or postmodernism, ostensibly secular adherents of which speak of "the problem of evil" in the mystical tones of Medieval theologians, with the subject-object separation of their epistemology an apparent stand-in for the Fall, while in their attachment to identity politics they echo de Maistre's remarks about never having met Man, all of which adds up to a hostility to "grand narratives" as ferocious as any other attack ever launched on the idea of progress—in as thoroughly obscurantist a language as any clergy ever devised. And of course, there are more popular streams of thought, not least the self-help culture, which promotes a conservative idealism scarcely distinguishable from that of the ancient Greek oligarchs. (You can't change the world! There is no world! There's just you, and how you look at it, so change yourself instead!)
All of this so thoroughly saturates our culture that there is no getting away from it—even for those most educated for the task of critical thought. The age, ubiquity and association of such an outlook with those most prestigious philosophical and literary texts that have never ceased to be the cornerstone of an education in the humanities (from Aristotle to Shakespeare, from Milton to Tolstoy) is itself enough to endow it with enormous prestige to which few intellectuals are totally immune. (Indeed, it was a nineteenth century radical of some note who observed that "the past weighs like a nightmare on the brain of the living"—and a twentieth century radical who remarked that in his youth England and its empire were ruled by "men who could quote Horace but had never heard of algebra.") That there is never a shortage of rationalists who, in disappointment or despair personal or political—or simple self-interest—repudiate their old beliefs to take up exactly opposite ones at the other end of the political spectrum also encourages the tendency. (After all, just as much as ever, intellectual, cultural and especially political life remain the preserve of the privileged, who remain inclined to conservatism, while privilege remains ready to place its prestige and wealth behind conservative thinkers and thought.)
Little wonder, then, that ultra right-wing postmodernism, passed off as daring leftishness to the confusion of nearly all, has become the conventional wisdom of the humanities, the social sciences, and much else, while popular culture serves up a diet of pessimism in one movie about science and technology going wrong after another. We may not know what will happen, exactly, but we are usually safe in assuming that something bad will come of that experiment, that invention. Safe in assuming that if there is a robot, it will rebel. And with the far, far more radical prospects opened up by the Singularity, the dread is commensurately greater.
Those who would discuss the subject rationally have to realize that they are up against all this baggage. And if they wish to persuade the public of the positive possibilities that our technology has already opened up, and might continue to open up in the coming years, they must be prepared not only to promise the realization of long-thwarted human hopes, but to challenge the colossal weight of millennia of dark, misanthropic irrationality with the still more overpowering force of reason, too little appreciated these days, but as great as it ever was.
1 The term "technological Singularity" is generally credited to Vernor Vinge. Other writers in this area include Hans Moravec and, perhaps most famously, Ray Kurzweil. See Vernor Vinge, "The Coming Technological Singularity: How to Survive in the Post-Human Era," VISION-21 Symposium, NASA Lewis Research Center and the Ohio Aerospace Institute, 30-31 Mar. 1993; Hans Moravec, Mind Children: The Future of Human and Robot Intelligence (Cambridge, MA: Harvard University Press, 1990) and Moravec, Robot (New York: Oxford University Press, 2000); Ray Kurzweil, The Age of Spiritual Machines: When Computers Exceed Human Intelligence (New York: Viking, 1999) and Kurzweil, The Singularity is Near: When Humans Transcend Biology (New York: Viking, 2005).
2 As Irving John Good put it in an early paper on the subject, "the first ultraintelligent machine is the last invention that man need ever make." Irving John Good, "Speculations Concerning the First Ultraintelligent Machine," in Franz L. Alt and Morris Rubinoff, eds., Advances in Computers 6 (New York: Academic Press, 1965), 31-88.
3 Writing of the ancient Greeks, John Crowe Ransom characterized their view of the world as something that not only "resists mastery, is more mysterious than intelligible . . . a world of appearances," but also "perhaps . . . more evil than good." John Crowe Ransom, The New Criticism (Westport, CT: Greenwood Press, 1979), 335.
4 German theorists, of course, excluded the Slavs from the Aryan category, while the Russian Slavophile Aleksey Khomyakov regarded the Slavs as Europe's true Aryans and the Germans as non-Aryan "Kushites."