Originally published at SSRN on January 29, 2019
The theorizing about what has come to be known as the "technological Singularity" holds that human beings can produce a greater-than-human sentience technologically, and that this may come within the lifetimes not only of today's children, but of much or even most of today's adult population as well.1 Much of this theorizing anticipates that the Singularity will be part of a larger trend of accelerating technological change, with advances in computers matched by advances in other areas, like genetic engineering and nano-scale manipulation of matter; or that it will be a source of acceleration itself, as superhuman artificial intelligences perform technological feats beyond human capability.2 In either case, as the term "singularity" suggests, the consequences become unpredictable, but a common view is that shortly after the arrival of the big moment, we will see the most fundamental aspects of human existence—birth, growth, aging, senescence, mortality; the limits of an individual consciousness' functioning in space and time; material scarcity and the conditions it sets for the physical survival of human beings—altered so radically that we would become transhumans, on the way to becoming outright posthumans. Those who describe themselves as Singularitarians expect these changes to be not merely profound, but a liberation of the species from the constraints that have cruelly oppressed it since its first appearance on this planet.
All this, of course, is mind-bending stuff. Indeed, no one alive today can really, fully wrap their mind around it, even those most steeped in the idea. Still, the difficulty of the concepts lies not only in the complete alienness of such conditions to our personal and collective experience, but also in their flying in the face of the conventional expectations—not least, that even if we have grown used to technology changing constantly, the really important things in life do not, cannot change. Indeed, passive acceptance of the world as it is; a view of it as unchanged, unchanging and unchangeable; and given that this applies to a great deal that is unquestionably bad, an ironic attitude toward the prospects for human happiness; are commonly equated with "wisdom." And rejection of things as they are, a desire to alter them for the better, a belief that human beings have a right to happiness, are likewise equated with not just the opposite of wisdom, but the cause of disaster.
This is all, of course, a terribly bleak and unattractive perspective to any but the most complacent of us—and not altogether rational, inconceivable as it is without the idea that the cosmos is somehow metaphysically rigged against human beings. Why should this morbid outlook enjoy such force? Especially in a modern world where it has been proven that life does indeed change? And, frankly, that meaningful improvement in the terms of existence of human beings is achievable?
The Tragic View
One obvious reason is the weight of a very old, long tradition that has deeply suffused just about every human culture, Western culture by no means least. In ancient times, as people began to think about the world around them, they could see that the world was not what human beings would like it to be. Life was poor, nasty, brutish, short—with hunger, illness, violence, insecurity (to say nothing of innumerable lesser hardships and frustrations) utterly saturating their existence. Childbirth was dangerous to all concerned, and few of the children born made it all the way to adulthood. A more settled existence brought somewhat greater affluence and security, but this was only relative, and purchased at a high price in toil and oppression, with daily existence defined by a more regimented routine of labor, and a more stratified society. The many worked for the enrichment of a few—and even the comparative luxury in which the few lived only exempted them from so much. Even those who ate well suffered their share of disease in an age of primitive medicine, and violence too when being part of a ruling caste meant being a warrior. And of course, even the most sheltered existence meant aging and death.
It also seemed to them that there was not much they could do about it. The faintness of their understanding of how the world about them worked, the primitiveness of the means to hand that necessarily followed from this, the dimness of their awareness that knowledge of the world could be applied to the end of enlarging economic productivity, meant that one could picture only so much improvement in the economic output that in hindsight we know to be key to deep or lasting material progress.
The crying out in anguish against all this was the birth of tragedy. One can see it in that oldest of stories, the Epic of Gilgamesh. Gilgamesh, seeing a worm fall out of the nose of his fallen friend Enkidu (the image crops up again and again in the poem), is horrified by the reality of death and sets out in quest of immortality—and all too predictably fails to achieve it, losing the herb that would have let him live forever to a snake that gobbles it up before he can ever make use of it.
A very large part of higher culture has been a development of this sensibility. In the Abrahamic religious tradition we have the temptation of Adam and Eve, original sin, the Fall, expulsion from Eden, a punishment compounded by the familiar limits of the "human condition": traumatic birth, a life spent in toil, death.
So does it likewise go in the Classical tradition, where humans, whose Golden Age lies in the past, have their lives spun out, measured and cut by the Fates, and the details of those lives not decided by the Fates determined by the whims of gods intent on keeping them humble.3 (Poseidon could not keep Odysseus from ever getting home—outside his purview, that—but he did see that it was a ten-year odyssey, and did a good many worse things with a good deal less reason; while "wise" Athena was not so far above petty jealousy as to refrain from turning the human who bested her as a weaver into a spider.)
Eventually developed, too, was an element of compensation for all this. Human beings suffer in this world—but if they bow their heads and obey, they will eventually be blessed in this world, or if they don't get so blessed, find something better in another one on the other side of death. And in at least some traditions, human suffering as a whole does end, a Millennium arriving and all well with the world after that.
Still, the connection between good behavior and reward was necessarily fuzzy, and even in those traditions notes of doubt about the rightness of all this are evident. When God inflicts on his exceptionally faithful servant Job one horrific suffering after another simply for the sake of a bet with the Devil, and Job has had all he can take (he is left huddled in shit to keep warm), he cries out "Why?"
Oedipus, approaching his death (in Oedipus at Colonus, the most accomplished but least-read of Sophocles' Theban plays), wonders at the same thing. After all, killing a man who challenged him in that sort of roadside confrontation and marrying a queen were only turned from incidents in a tale of heroic adventure into cosmic crimes by the fact that the man was his father, the woman his mother, both of which details were totally unknown to him—while the whole sequence of events was triggered by his father's attempt to evade punishment for an unspeakable crime of his own by ordering his own son's infanticide. Where was the justice in that?
Of course, no satisfactory answer is forthcoming to such questions. Indeed, to modern, rational eyes, tales like those of Job and Oedipus are about the subjection of human beings through no fault of their own to horrors by the will of arbitrary, cruel gods, whose right to do such things is a simple matter of their having the power to do it and get away with it.
And as it happens, there are also doubts that things really have to be this way. The idea that humans could become like those gods, acquire the power to be like them, and even overthrow them, but that this was forbidden to them and they were slapped down when they tried, cropped up again and again. Gilgamesh, after all, may not have attained his goal, but he did come very, very close, only at the very last minute losing the herb that would have made him live forever. In the Garden of Eden the sin of Adam and Eve was to eat of the fruit of the tree of Knowledge of Good and Evil, knowledge which could make them like gods. Zeus begrudged man the gift of fire, and punished man's benefactor Prometheus by chaining him to Mount Elbrus and having a vulture tear out and eat his liver, after which it grew back at night so that it could be torn out and eaten again the next day, and the day after that, and the day after that . . . but human beings kept the knowledge of fire nonetheless.
All the same, these admittedly not insignificant details are exceptional, contrary hints, and no more than that, in a narrative that, pretty much always, preached passivity and awe before the majesty of a design beyond our ken.
"KNOW YOUR PLACE!" it all thunders.
And by and large, it was exceedingly rare that anyone carried the thought further. There was, after all, much more emphasis on what had been and what was than what might be—and little enough basis for thinking about that. Even among the few who had leisure to think and the education and associations to equip them with the best available tools with which to do it, mental horizons were bounded by the crudity of those tools. (Two thousand years ago, syllogisms were cutting-edge stuff.) By the narrowness of personal experience. (Communities were small, movement across even brief distances a privilege of very few and even that difficult and dangerous, while the "Known World" of which people even heard was a very small thing indeed.) The slightness of the means of communication. (Illiteracy was the norm; books hand-copied on papyrus and parchment and wax tablets were rare and expensive things; and the more complex and less standardized grammar and spelling, the tendency to use language decoratively rather than descriptively, the roundabout methods of argument and explanation—such as one sees in Socratic dialogue—likely made deep reading comprehension a rarer thing than we realize.) And even the brevity of life. (Life expectancy was thirty, imposing a fairly early cut-off on how much the vast majority of people could learn, even if they had the means and opportunity.)
Moreover, the conventional ideas enjoyed not only far more powerful cultural sanction than they possess now, through the force of religious belief and custom, but were backed up by all the violence of which society was capable. This was the more so not only because the powerful found it easier to take such views (it is one thing to be philosophical about man's condemnation to hard toil when one is a slave in the fields, another when one is a wealthy patrician who has never done any, sitting on his shaded porch enjoying a cool drink), but because, within the limits of the world as they knew them, their situation was quite congenial to them.
Carroll Quigley, who wrote at some length about the conflict between democratically inclined, socially critical "progressives" and oligarchical "conservatives" in ancient Greece in The Evolution of Civilizations, observed that the latter settled on the idea "that change was evil, superficial, illusory, and fundamentally impossible" as a fundamental of their thought. This applied with particular force to the terms of social existence, like slavery, which they held to be "based on real unchanging differences and not upon accidental or conventional distinctions." Indeed, the object pursued by those who would have changed such things—a redress of material facts—was itself attacked by the associated view that "all material things" were "misleading, illusory, distracting, and not worth seeking."
In short—the world cannot be changed, trying to change it will make life even worse than it is, and anyway, you shouldn't be thinking about the material facts at all. This anti-materialism went hand in hand with a denigration of observation and experiment as a way of testing propositions about the world—an aristocrat's musings held superior to seeing for oneself how things actually stood. With the concrete facts of the world trivialized in this way, the conventional wisdom handed down from the past was that much further beyond challenge (while, of course, this outlook did not exactly further the development of technological capability). Ultimately the promotion of these ideas by the "oligarchs" (and their rejection of the ideas not to their liking), helped by the primitiveness of communication (works the rich would not pay to copy did not endure), was so effective that, as Quigley noted, "the works of the intellectual supporters of the oligarchy, such as Plato, Xenophon, and Cicero" have survived, but "the writings of the Sophists and Ionian scientists," "of Anaxagoras and Epicurus," have not.
In the wake of all this it may be said that philosophy was less about understanding the world (let alone changing it) than accommodating oneself to it—by learning to be better at being passive. One picks up Marcus Aurelius' Meditations, and finds the celebrated work by the famous emperor-philosopher to be . . . a self-help book. And rather than a philosophy concerned with nature or politics, the metaphysics of the thoroughly anti-worldly Plotinus (who taught that the ultimate good lay in one's turning away from the low world of material sense-reality to the higher one of the spirit as the path to ecstatic union with the Divine) was arguably the most influential legacy of the latter part of the Classical era.
The pessimism of the Classical outlook, particularly by way of Plotinus, did much to shape and even blend with the Abrahamic tradition as Jewish and early Christian thinkers used Greek philosophy in interpreting religious texts—while the work of the Greeks endured as the cornerstone of secular learning in its own right. Of course, Classical civilization crumbled in Western Europe. Yet, in its way that strengthened rather than weakened the hold of such thinking. Of the major institutions of the Western Roman Empire, only the Church survived and flourished, playing a larger role than ever in the centuries that followed. Meanwhile, amid these "Dark Ages," there was a nostalgia for ancient times, and with it a propensity for exalting its practical achievements as unsurpassed and unsurpassable. The Greeks, the Romans, were held to have all the answers, and it was thought best to consult them rather than try to find out new things, the means for which that Classical philosophy, of course, marginalized. Along with the prevailing religiosity, this whole outlook directed philosophers' attentions away from the workaday world—to the point of being famously satirized as debates over how many angels could dance on the head of a pin. And of course, if reason said one thing and religion another, then one had to accept that religion was right—or face the Inquisition. Unsurprisingly much energy went into attempts to reconcile the world's very real horrors with the existence of a divine plan by an all-good and all-powerful Supreme Being—pessimism about whether things could have been or could be better confusingly passed off as the optimism that this is the "best of all possible worlds."
Tragedy, Modernity and Reaction
So things went until the Renaissance, and the flowering of humanism with it, and the intellectual developments that succeeded it. In the seventeenth century thinkers like Francis Bacon and René Descartes not only explicitly formulated and advocated a scientific method based precisely on the value of studying the material world; they also declared for the object of uncovering all nature's secrets and applying the knowledge to the end of "relieving man's estate." Moreover, such thinking was quickly extended by others to social, economic and political life. Opposing barbarous custom and superstition, they identified and defended rights that all individual human beings enjoyed specifically because they are human beings (life, liberty, property), extending to the right to choose their own government—even to rebel when an existing government failed to perform even the bare minimum of its duty (as Thomas Hobbes allowed), or became repressive (as John Locke argued).
In short, the prospect of positive, meaningful, humanly conceived and controlled change was raised explicitly and forcefully by the Scientific Revolution, by liberalism, and by the Enlightenment more generally—and raised successfully. However, that scientific inquiry, applied science and political liberalism flourished in modern times as they did not in the ancient world did not mean that they were unopposed. The old ideas never ceased to have their purchase, and certainly vested interests in the modern world (as with Churchmen concerned for their influence and privileges) could not look on such talk with equanimity any more than their forebears did. Conservatives threatened by all this clung to tradition, to the religious institutions that usually sided with it, to the unchanging verities of the "ancients" over the reason and science of the "moderns." The now-obscure English theorist of divine right, Robert Filmer, insisted in Patriarcha that kings were granted by God paternal rights over their peoples, which extended to the power of life or death—and that revolutionary, democratic alternatives were doomed to short, bloody and chaotic lives ending in failure.
Filmer's arguments (which Locke eviscerated in his First Treatise of Government) were belied by the unprecedented peace and prosperity that England enjoyed after the 1688 Revolution. However, conservatives responded to the Enlightenment with the Counter-Enlightenment, identifying reason and change more generally with disaster, stressing the original sin with which Christianity held human beings to be tainted, and even rejecting the idea of the individual human being as a meaningful unit of social analysis.
Indeed, it became common to oppose to the universalism of the Enlightenment a politics of identity in the manner of Joseph de Maistre (who famously remarked that he had met Frenchmen, Italians, Russians, but knew nothing of "man"), with identity usually defined in terms hostile to progressive ideas. Reason, secularism, democracy, were commonly characterized by such thinkers as alien, reflecting the character of another people—typically a character held to be less virtuous, less "pure," less "spiritual" than "our own." If such things worked at all, they said, it was only for those less virtuous, less pure, less spiritual people; certainly they cannot work for us, which is assuredly a good thing as our traditionalism, religiosity, monarchism, serf-lord relationships and the like express and sustain a greater wisdom than those Others can ever know, and which importing their ways could only corrupt.
Going hand in hand with this was much romanticizing of the rural, agrarian element of the population as a repository of those older ways—unlike those rootless city types, especially the ones with a modicum of "book learning," which seemed not an altogether good thing. Worst of all in their eyes were those "overeducated" types who, "educated above their intelligence," perhaps defectively born with too much brain and too little body, too little blood, had become alienated from their roots and their natural feelings—internal foreigners. And indeed the visions of reform to which they so often inclined showed, their critics said, that while they spoke of the people they did not know, understand or respect them—and that what they needed most of all was some hardship and toil among the lower orders to teach them "the real world." (Thus does it go in Leo Tolstoy's War and Peace, where Pierre Bezukhov goes from Western-educated cosmopolitan intellectual to apostle of the peasant Karataev, who passively accepts whatever life brings until he dies, marching in the French army's prisoner train as it retreats from Russia.)
All this naturally converged with a whole host of prejudices, old and new, exemplified perhaps in Victorian-era theorists of "Aryanism," who identified the conservative, traditionalist stances (an acceptance of the unchanging nature of things, idealism over materialism, etc.) with spiritually superior Aryan cultures, and liberal/radical, modern outlooks with inferior "non-Aryans"—even when they made up their lists of who went under each heading in mutually exclusive ways. Thus one had the absurdity of German and Russian nationalists each insisting that their country was the purest bearer of the Aryan legacy—while the other nation was not Aryan at all.4
Of course, this reaction did not turn back the clock, the Old Regime never returning and those who even really wished for such an outcome becoming something of a lunatic fringe, but it still had its successes. Religious, nationalistic, traditionalist, anti-intellectual and "populist" appeals on behalf of the status quo and its greatest beneficiaries helped make the spread of formal democracy a less threatening thing to the contented. Meanwhile, as conscious, explicit belief in specific religious doctrines weakened, what might be called the "religious frame of mind" remained, plainly evident in the phrases of everyday language, like the need "to have faith," the idea that "things are meant to be" or "not meant to be," and of course, that there are "things man is not meant to know." (Faith in what, exactly? If things are "meant" or "not meant," just who—or Who—is doing the "meaning?")
And of course, as the old guard of throne, aristocracy and established church declined, the bourgeoisie that had once been revolutionary, now enjoying power and privilege, and anxious about the lower orders and the social questions their existence raised, became conservative in its turn, likewise inclining toward that idea of change as "evil, superficial, illusory, and fundamentally impossible," and reason and its prescriptions as a thing best not pushed "too far." It was less of a stretch than might be imagined—the bourgeois outlook, after all, being focused on the individual, specifically an ethic of individual success-striving and individual moral responsibility, the existing social frame so utterly taken for granted that it did not seem to exist for them at all. (Indeed, as Margaret Thatcher made clear, at times their politics has explicitly denied that it does.)
Unsurprisingly, those favoring constancy over change found more rationalistic-seeming supports for their outlook. The dark view of radical social change taken by the French Revolution's enemies, which identified the Revolution not with the Declaration of the Rights of Man or the abolition of feudal oppressions but with guillotines, Napoleon and Restoration, colored the popular imagination of the event—driving home the idea that if revolution was not a crime against God, then it was still bloody, messy and doomed to failure.
And this was not simply founded on a vague sense of society's machinery as a complex thing not easily tinkered with, or insecurity about whether the state of the art in social engineering is up to such a challenge (the position of a Karl Popper, for example), but on a whole host of newer rationales for the unchangeable nature of the world, the ineradicability of the evils in it, the obvious implications for human happiness, and the wisdom of letting everything alone. Like Malthusian scarcity, which attributed poverty and its attendant miseries not to economic or social systems, but to "the passion between the sexes." And its extension in the Social Darwinism of Herbert Spencer, in which perhaps God did not assign different people different stations, but Nature did in its uneven distribution of "fitness," and the still more uneven rewards accruing to it. Or the Nietzschean will-to-power. Or Freudian psychoanalysis, which declared the repression of basic human drives (the pursuit of sex, the aversion to work as we commonly think of it) to be essential to civilized life.
Or postmodernism, ostensibly secular adherents of which speak of "the problem of evil" in the mystical tones of Medieval theologians, with the subject-object separation of their epistemology an apparent stand-in for the Fall, while in their attachment to identity politics they echo de Maistre's remarks about never having met Man, all of which adds up to a hostility to "grand narratives" as ferocious as any other attack ever launched on the idea of progress—in as thoroughly obscurantist a language as any clergy ever devised. And of course, there are more popular streams of thought, not least the self-help culture, which promotes a conservative idealism scarcely distinguishable from that of the ancient Greek oligarchs. (You can't change the world! There is no world! There's just you, and how you look at it, so change yourself instead!)
All of this so thoroughly saturates our culture that there is no getting away from it—even for those most educated for the task of critical thought. The age, the ubiquity, and the association of such an outlook with those most prestigious philosophical and literary texts that have never ceased to be the cornerstone of an education in the humanities (from Aristotle to Shakespeare, from Milton to Tolstoy) are themselves enough to endow it with an enormous prestige to which few intellectuals are totally immune. (Indeed, it was a nineteenth century radical of some note who observed that "the past weighs like a nightmare on the brain of the living"—and a twentieth century radical who remarked that in his youth England and its empire were ruled by "men who could quote Horace but had never heard of algebra.") That there is never a shortage of rationalists who, in disappointment or despair personal or political—or simple self-interest—repudiate their old beliefs to take up exactly opposite ones at the other end of the political spectrum also encourages the tendency. (After all, just as much as ever, intellectual, cultural and especially political life remain the preserve of the privileged, who remain inclined to conservatism, while privilege remains ready to place its prestige and wealth behind conservative thinkers and thought.)
Little wonder, then, that ultra right-wing postmodernism, passed off as daring leftishness to the confusion of nearly all, has become the conventional wisdom of the humanities, the social sciences, and much else, while popular culture serves up a diet of pessimism in one movie about science and technology going wrong after another. We may not know what will happen, exactly, but we are usually safe in assuming that something bad will come of that experiment, that invention. Safe in assuming that if there is a robot, it will rebel. And with the far, far more radical prospects opened up by the Singularity, the dread is commensurately greater.
Those who would discuss the subject rationally have to realize that they are up against all this baggage. And if they wish to persuade the public of the positive possibilities that our technology has already opened up, and might continue to open up in the coming years, they must be prepared not only to promise the realization of long thwarted human hopes, but to challenge the colossal weight of millennia of dark, misanthropic irrationality with the still more overpowering force of reason, too little appreciated these days, but as great as it ever was.
1 The term "technological Singularity" is generally credited to Vernor Vinge. Other writers in this area include Hans Moravec and, perhaps most famously, Ray Kurzweil. See Vernor Vinge, "The Coming Technological Singularity: How to Survive in the Post-Human Era," VISION-21 Symposium, NASA Lewis Research Center and the Ohio Aerospace Institute, 30-31 Mar. 1993; Hans Moravec, Mind Children: The Future of Robot and Human Intelligence (Cambridge, MA: Harvard University Press, 1990) and Moravec, Robot (New York: Oxford University Press, 2000); Ray Kurzweil, The Age of Spiritual Machines: When Computers Exceed Human Intelligence (New York: Viking, 1999) and Kurzweil, The Singularity is Near: When Humans Transcend Biology (New York: Viking, 2005).
2 As Irving John Good put it in an early paper on the subject, "the first ultraintelligent machine is the last invention that man need ever make." Irving John Good, "Speculations Concerning the First Ultraintelligent Machine," in Franz L. Alt and Morris Rubinoff, eds., Advances in Computers 6 (New York: Academic Press, 1965), 31-88.
3 Writing of the ancient Greeks, John Crowe Ransom characterized their view of the world as something that not only "resists mastery, is more mysterious than intelligible . . . a world of appearances," but also "perhaps . . . more evil than good." John Crowe Ransom, The New Criticism (Westport, CT: Greenwood Press, 1979), 335.
4 German theorists, of course, excluded the Slavs from the Aryan category, while the Russian Slavophile Aleksey Khomyakov regarded the Slavs as Europe's true Aryans and the Germans as non-Aryan "Kushites."
Monday, March 2, 2020
Technological Hype and the Military Balance
Originally published through SSRN, June 3, 2018
The modern world has seen profound technological change, but it has also seen equally profound technological hype—extravagant, intensive publicity and promotion not merely of new technologies, but technologies merely thought to be near realization, which has tended to go hand in hand with the exaggeration of their maturity, their capabilities and their implications for actual life. Relative to how large—and often, how damaging—a role such hype has played in recent history, and how big a factor it is in contemporary thinking today, the subject has been little studied. Its implications for the military sphere are no exception to this rule.
This article is intended as an examination of that subject, beginning with a consideration of the factors that make today's observers susceptible to technological hype. It will then consider the implications of such hype for thinking about the distribution of power among nations—both the ways in which it affects economic and geopolitical thinking generally, and the ways in which it affects thinking about military technology more narrowly. From there the discussion proceeds to a consideration of the present—the past few years, and very likely, a good many of the years ahead—as a period of particular vulnerability to such hype.
Understanding Technological Hype
To better appreciate the actual pervasiveness and weight of technological hype, it is helpful to consider four factors from which it derives its strength: our prolonged experience of technological upheaval; the "rumors of the future" we get from futurology, science and technology reporting, and to a greater degree than is widely appreciated, science fiction; popular understandings of technological research and development; and the culture of commerce.
Prolonged Experience of Technological Upheaval
It would be impossible to exaggerate the significance of the shift from a predominantly agrarian, rural, pre-industrial civilization to an urbanized, industrial civilization founded on the expansion of human capability through machinery powered by inanimate power sources (coal-fired steam engines, internal combustion engines, electricity). That change, which in much of the world is still ongoing, and even in the most advanced countries still less than perfectly complete, has been recent enough and dramatic enough to create a general expectation of such change as a constant in modern life.1
It may well be the case that this has led to an overrating of what are arguably the more modest changes seen since that time, like the revolution in digital computing and communications, so often accorded a significance comparable to that of the Industrial Revolution—an overrating that has exaggerated the impression of the rate and depth of more recent technological change.2 Such exaggeration may be reinforced by a weak historical sense on the part of the public (especially among younger people), and perhaps the popular obsessions with entertainment, social media, consumerism and "lifestyle" generally.3 However one explains it, in the end the perception of such continuous change is there, and interacts with other factors, not least the fact that more such change is continually promised to the public in the media.
"Rumors of the Future"
The preoccupation with the ways in which the future will be different from the present, and still more, the investment of rigorous effort in working out those differences, were in the late nineteenth century still a novelty. Works like Stanley Jevons' The Coal Question, Ivan Bloch's The Future of War, and H.G. Wells' Anticipations, as well as the stories by Jules Verne and other early proponents of science fiction, all owe their place in cultural and intellectual history partly to that novelty. However, during the twentieth century professional futurology became a massive industry, and while its prestige may have fallen somewhat since its peak in the post-World War II period, its presence is perhaps more strongly felt than ever before.4 In the discussion of topics ranging from nuclear proliferation to climate change, from the rate of economic growth to the space program, even the casual consumer of news comes into contact with more or less sophisticated extrapolations not only about what has happened or is happening, but what will happen years or decades hence, and the implications of those possibilities for the present—and so frequently that they scarcely even notice the fact. With news outlets routinely devoting full sections of their papers, magazines and web sites to "Science" and "Technology" news, discussion of technologies only in development, perhaps only very early development, is often the explicit focus of a substantial portion of news coverage itself.
At the same time, the imagery of imaginary technologies has become virtually pervasive across popular culture. In recent years, any list of the highest-grossing movies at the box office tends to consist primarily of action-adventure films packed with futuristic vehicles and other technologies, like the Star Wars and Marvel Cinematic Universe films. Charles Gannon has usefully characterized such imagery as "rumors of the future," habituating us to the thought of technologies that do not yet exist.5 Indeed, that such technologies actually do exist, if only somewhere, in secret, is a staple of science fiction, which routinely presents intelligence agencies and other such organizations as secretly carrying on ultra-advanced programs to explore space and the like in the present day. (This is the view that the "future is already here—it's just not very evenly distributed."6)
Popular Conceptions of Science and Technology
Even while the products of science, hypothetical as well as real, have become ubiquitous in cultural life, the actual process that is science remains poorly understood by non-scientists on the whole. The reality is that engineering work of the kind relevant to this discussion is often a collaborative, cumulative, slow-moving activity, which may be straining against the limits not only of what has been done, but what is possible with the existing state of the art. It also tends to occur within a business, economic and political context, where imperatives besides mere feasibility are operative, and people who are not scientists or engineers—who may in fact be ill-educated in these matters—are often in control. (The corporation that cuts its R & D budget to make this quarter's earnings report look better is not unknown.7) And even in the best of circumstances technological research and development can be subject to fits and starts, dead ends and flashes-in-the-pan, often resulting in a device that exists but is virtually unworkable in any real-life conditions, or perhaps simply stillborn at the concept stage.
However, the popular image envisions such work as highly individualistic and speedy, with revolutionary breakthroughs cheap and easy. This is, again, partly a matter of fiction favoring the simple, heroic, dramatic story over a more complex reality.8 Yet, fiction is not the sole source of this thinking, which is greatly encouraged by the manner in which journalism has tended to portray real-life developments, epitomized by the conception of the computing revolution. While the history of mechanical computing goes back centuries, with key computing concepts dating very far back (Blaise Pascal, Gottfried Leibniz and Charles Babbage laying down essential groundwork), and electronic computers to the 1940s, after which they were the object of massive government R & D efforts that led to still more key innovations (ENIAC, SAGE, NLS, ARPANET), there is a tendency to imagine it all as having emerged out of garages in California without help from anyone else. This view of technological development as a quick, simple matter makes it all the easier to imagine that a technology only "in development," which on closer examination may be a very uncertain or distant prospect, is almost ready for commercial employment.
The Culture of Commerce
Interacting with and reinforcing prolonged experience of technological upheaval, "rumors of the future," and the prevalent, oversimplified view of how technologies are realized in practice, is what one may term the "culture of commerce." It is, of course, the business of advertisers to make the wares on sale appear as attractive and compelling as possible, and one obvious and commonly taken course is to present those wares as being as new and revolutionary as possible—as a practical matter, more new and revolutionary than they really are. Still, this indisputably distorts understanding of the matter. It does not help that journalism is increasingly virtually indistinguishable from advertising, a tendency epitomized by those media outlets that carry what is euphemistically referred to as "paid" or "sponsored" content. However, the mentality pervades even more conventional journalism, dependent as it is on capturing audiences with exciting headlines. Especially when this imperative is combined with a weak grasp of the process by which concepts actually become viable technologies, it easily creates an exaggerated picture of a technology's feasibility. Indeed, the fostering of expectations inflated far beyond what will ever be met is virtually built into the process by which technologies enter into use, the "Gartner Hype Cycle," based on studies of technologies going back to railroads, positing a sequence: a technological breakthrough drawing attention before the production of usable, commercial products; inflated expectations; a period of disillusionment; a second, rising trend of genuine usage; and then a "plateau" of productivity, in which the technology makes a genuine contribution, if a less spectacular one than the optimists imagined during that earlier period of inflated expectations.
The cases of single technologies apart, the pattern of such reportage has a profound shaping influence on perceptions. Any viewer of the news is subject to constant, specific claims of explosive change in one area or another. The individual claims (which tend not to stand up to scrutiny) are forgotten, but the overall impression of that rate of change remains. Moreover, when the old claims which produced that impression are critically examined, the evaluation is often generous—overly generous, the bar for "prophecy" set low, with an obvious example being the assessment of Raymond Kurzweil's forecasts.9
When despite this a prediction, or a mass of them, proves to have been clearly wrong, a typical response is to shrug off the exaggerations, and to accuse those taking the more critical view of "missing the forest for the trees." Yes, they say, the futurists were wrong that time (and that time, and perhaps that time too), but they "know" that the overall reading of the trend as one of explosive change is correct. (The ruthless mockery to which those bullish about the pace of change subject those who dare to ask "Where are the flying cars?" is exemplary of this.10)
Techno-Hype and the Balance of Power
In private economic life, technological hype often leads to the misallocation of economic resources—to questionable purchases of goods and assets that can in the most extreme cases lead to speculative bubbles that disrupt whole economies, from the wonders claimed for the South Sea Company in its heyday, to the information technology bubble that burst in 2000. However, hype about such technologies has implications well beyond the purely civilian sphere. It has often had implications for perceptions about the prosperity of particular nations, and the power they derive from it, including the military power that in the modern world is founded on economic and industrial power. There has in fact been a recurring tendency to take an exaggerated view of the implications of an extraordinary lead in one or a few technologies for the wealth and power of a particular country closely identified with that technology, imagining that a disproportionate market share here can readily translate to a wildly disproportionate prosperity or power. This has frequently been reinforced by a tendency to claim that the country in question possesses some unique comparative advantage that will make its lead in that technology enduring.
An obvious case is Britain in the post-World War II period. Despite its post-war economic situation and decolonization, expectations existed that British accomplishments in such cutting-edge fields as computers, aerospace and nuclear energy would sustain its economic and strategic position in the post-war world. Reality fell far short of such hopes in a history which saw such disappointments as the de Havilland Comet, the "Magnox" gas-cooled nuclear reactor, and the Concorde—a history which showed Britain's advantages to be less robust than imagined, and the exploitation of those advantages more complex and uncertain.11
The tendency has grown more rather than less pronounced over time, with Japan's extraordinary dominance in microchips during the late 1980s another, even more germane example. Despite accounting for less than 3 percent of the world's population, 12 percent of world GDP and 16 percent of manufacturing, the country produced over half the world's chips at the time.12 In this context Japanese politician Shintaro Ishihara created a sensation with the claim in his book The Japan That Can Say No that Japan's strength in microchips put the Cold War balance of power in its hands, and would be a pillar of its enduring stature in the post-Cold War era.13 (This was to see it draw the former Soviet sphere into its own orbit, and manage the world's affairs jointly with the United States as the "Group of Two.")
While Ishihara's predictions have on the whole fared very poorly, it is notable that the country's extraordinary lead in microchip production proved fragile and brief, in part because it rested on quite different, more complex and less enduring foundations than he imagined. Japan's position as a microchip producer was based on its position as an exporter of consumer electronics more generally—the makers of such goods buying their chips from domestic producers, underlining the reality that its exceptional market share reflected a broader industrial supremacy, though even that proved fleeting. When Japanese consumer electronics lost their market dominance, the share they represented of the chip market declined. Moreover, the highly touted virtues of Japanese management and workers were nowhere near enough to compensate for a badly flawed restructuring of the industry amid a more competitive environment in the 1990s.14
While its power has been broader-based and more enduring, the United States, too, has been susceptible to such hype, similarly based on misconceptions about computer technology, and perceptions of the United States as more fitted than others to develop and use it. A pointed example is Ralph Peters' vision of the U.S. enjoying complete, continuous, global battlespace dominance based on a space-based military system so extensive and powerful that its command of the highest ground of all would, once more, make major war seem winnable.
Ultimately, each case has been a reminder of the reality that a single technology is too narrow a base for any nation to retain a preponderant share of the world's wealth and power, and in any event, technological capabilities tend to diffuse. When the technology in question is evolving and its market growing, and when it is a consumer's rather than a producer's technology, that diffusion is especially prone to be rapid and deep. The result is that in the longer run a nation's weight tends toward what is suggested by that bigger complex of factors—from the extent and natural-resource richness of a nation's territory, to the size and education of its population, to the commitment of its government to such policies as make for industrial success or failure—rather than narrow and fleeting technological advantages.
Techno-Hype in the Military Sphere
Each of the factors operative in the civilian sphere is also operative in discussions of technologies designed specifically for military use. However, discussions of military technology have their own, distinct tendencies, including an exaggerated idea of what a particular technology could do for the party that developed it. Generally, what has occurred is a technology's imagined capability far outrunning its actuality because of the perceived desirability of that capability. Since World War I in particular there has been a preoccupation with the possibility of new technologies providing a means to quick, cheap, decisive military victory in a major interstate contest. This is exemplified by the '20s and '30s-era futurology regarding "mechanization," specifically the ability of small armored and air forces to substitute for large armies and total war efforts, with Basil Liddell Hart boldly claiming that planes and tanks would end the era of "'absolute war' and . . . its fungus growth—'the nation in arms,'" by enabling swift knockout blows by one power against another.18 Giulio Douhet was still more extravagant, claiming that a fleet of even twenty operational planes put a nation's "aero-chemical arm" in a position "to break up the whole social structure of the enemy in less than a week, no matter what his army and navy may do."19
While governments did not go as far in the development of these technologies as some critics may suggest, expectations of the revolutionary nature of mechanization did shape national policies. In the interwar period Britain's leaders invested in the largest air force and most mechanized army of the day.20 Likewise, an expansion-minded Nazi leadership relied on these technologies as the only way in which its program of European conquests could be plausible, reflected in its emphasis on the Luftwaffe, and the development of an armored corps proficient in operating in the manner prescribed by the new theorists.21
When the war actually came armor and air power did indeed revolutionize the battlefield, but they fell far short of that ideal of quick, cheap, decisive victory. The simple reality is that technologies diffuse quickly in the modern world, and with similar speed spur the development of countermeasures that neutralize them.22 Consequently Germany, helped by an implausible combination of a high-risk strategy with a profoundly mismanaged French defense, won a quick, cheap victory in the 1940 Battle of France.23 However, already at that date other armies were taking note and learning to use their mechanized forces in the same offensive manner, or apply similar techniques to the problem of defense. The result was that Germany did not see another comparable victory for the rest of the war, with the failure of German operations to achieve the desired speedy victory in the early phase of Operation Barbarossa dooming it to the long war that virtually spelled defeat.
Likewise, even with air forces vastly larger, better-equipped and deployed in far more sophisticated fashion than anything Liddell Hart or Douhet discussed, strategic bombing was an attritional affair, practiced not just massively but over the extreme long term, with results that were ultimately ambiguous—in part because of the advent of radar, and its use to direct similarly larger and more capable forces of fighter aircraft. As a result Britain withstood prolonged German bombing in 1940-1941, and Germany itself withstood prolonged bombing which, whatever else can be claimed for it, did not secure the promised speedy collapse of German willingness or ability to prosecute the war.
Ultimately it was not Liddell Hart or Douhet who was proven right, but rather the less well-known and less heeded Vladimir Triandafillov, who much more astutely realized that the next war would be both a high-tech affair and a mass affair, continuing even longer and taking a greater toll in blood and treasure than that First World War against which they reacted so strongly.24 Indeed, new technologies did not so much supplant earlier methods of war-fighting as increase the stock of weapons, systems and capabilities that a major power needed to fight its war. Armored fighting vehicles did surprisingly little to reduce the need for infantry able to fight dismounted, with the advent of man-portable anti-tank weapons actually necessitating the escort of tanks by protective infantry. At the same time air forces did not eliminate the need for armies or navies—the Luftwaffe proving no substitute for a German navy able to land a large invasion force on British soil, while when the tide turned, the Combined Bomber Offensive did not spare the Allies the need to invade Europe and ultimately take Berlin.
As it went with mechanization, so did it go with virtually every significant later development, either because the reality of the capability fell short of the promise, or because the diffusion of the technology, or the development of countermeasures to it, complicated the matter. The extent to which nuclear weaponry could substitute for other kinds of military power proved greatly exaggerated—witness the reform of the U.S. armed forces in the 1950s, and then the backing away from those changes in the period afterward, as the Soviet Union narrowed the nuclear gap, and the rigidities of a policy based on Massive Retaliation became increasingly apparent—as well as its irrelevance to the situations of insurgency that seemed to be of rising importance.25 Strategic arsenals did not eliminate the need for conventional forces of all the known types (land, air, sea), with their increasingly sophisticated and expensive equipment, while the problem of insurgency encouraged the advent of special operations forces for dealing with it as well. For cash-strapped post-war Britain, air mobility, "commando carriers" and tactical nuclear weapons also appeared a cheap way of sustaining a worldwide military presence, an illusion repeatedly shown up until the pretension was abandoned in the withdrawal from "east of Suez."26
Time and again, space-based weaponry has been an object of such hype, particularly after the expectations of extremely cheap, reliable, regular space access created by the space shuttle in the 1970s. While those expectations proved false, the Strategic Defense Initiative renewed the hype in this area in the 1980s, hype which endured even after the Cold War (for whose end many erroneously accord it the credit), and resurged yet again around the turn of the century (the technology itself, as well as buoyant expectations of an enduring New Economy boom, being key to claims like Peters')—the command of the highest ground of all, again, making major war seem winnable.27 The exaltation of the technologies of the "Revolution in Military Affairs," notably its combination of digital computing and communications with increasingly advanced electronic sensors and precision-guided munitions, has in the view of many analysts led to exaggerated ideas of the capacity of policymakers to substitute them for more expensive and politically problematic commitments of ground forces.28
Hype Resurgent?
There seem to be two particular reasons for special concern at the present moment. The first is that such hype tends to go through cycles of boom and bust, with a recent bust seemingly being followed by a new boom at the time of this writing. The second is the stresses of the international environment as it is today, which may leave anxious policymakers more than usually eager to grasp at illusory solutions to their problems.
Boom and Bust and Boom Again?
The late 1990s was a period of particular boom for technological hype, as personal computing, cellular telephony and Internet access became staples of everyday life for the advanced industrial nations' middle classes, and increasingly for other groups as well. It was also accompanied by explosive expectations in areas like artificial intelligence, robotics, virtual reality, nanotechnology and genetic engineering, and reinforced by the experience of relatively rapid economic growth in the 1995-2000 period. However, the "tech bubble" burst. The advances in computing and cell phones became increasingly incremental. At the same time other technologies proved less immediately feasible than anticipated, with advances in neural networks proving sluggish, and advances in nano-machinery more elusive than their proponents suggested. Additionally, the spiking of commodity prices, and of energy prices especially, evoked Malthusian catastrophe rather than Wellsian super-technology, and the experience of the "Great Recession" dampened expectations yet again. Rather than explosive technological change, prophecies of ecological doom resurged.
Since then progress in a number of key technological areas, particularly artificial intelligence, but also materials science and renewable energy, has revived expectations of deep, wide, near-term changes, particularly the automation of a great deal of economic activity, with the self-driving car perhaps the most emblematic at the moment. Such developments are already having significant cultural and intellectual effects, evident in the amount of time given to discussing their implications, as seen in the explosion of writing in the press regarding such subjects as appropriate governmental responses to the development of robots years or decades more advanced than anything now in existence. The prospect of mass unemployment as a result of the new technologies, for example, is already being discussed as if it were a matter of immediate urgency in pieces by the most orthodox of thinkers running in the most mainstream of publications.29 (Indeed, it can seem that future unemployment resulting from automation is treated more earnestly than the current, actual problem of unemployment or underemployment.) One might well argue that such writers are getting ahead of themselves.
An "Easy" Out?
The second factor to be considered here is that such hype may be especially seductive in a period of constrained resources. This may especially be the case when a power which has been predominant, and particularly given to reliance on high technology, finds its position challenged—as the case of Britain dealing with its own decline in the early and mid-twentieth century strongly suggests.
Today the U.S. is similarly facing a changing distribution of wealth and power. In 1992 the U.S. economy was six times the size of those of Russia and China combined. Today China's economy alone is larger than the U.S.'s when measured in purchasing power parity terms. All of this is increasingly reflected in the hard reality that military capabilities and advantages the U.S. has long held a monopoly on are increasingly within the reach of other states. At the time of this writing, not only has China put its first squadron of fifth-generation (stealth) fighter aircraft into service, but Russia has sent such fighter planes to Syria operationally.30 None of this means that the U.S. has ceased to be the leading military power, or will cease to be that any time soon. It does not even mean that the U.S. has ceased to be the pace-setter for other military powers. However, the combination of quantitative and qualitative superiority it enjoyed in the 1990s is a thing of the past, and unlikely to return. Especially given the extent to which the United States has relied on cutting-edge technology as a basis for military strength, not only in the recent past but historically, and the continued tendency to think of innovation, especially in the area of computers, as somehow uniquely American, the temptation to imagine some leap in weapons technology recovering the old edge may be great.
However, it should also be remembered that the U.S. is not the only actor susceptible to such illusions by any means, virtually every military power of note today in a comparable condition. Russia endeavors to be a world military power in a manner comparable with its Soviet-era stature, but on the basis of only a small fraction of its power base.31 China is by some measures the largest economy in the world today, but seeing its growth slow down markedly while its per capita income remains 15-30 percent that of the U.S.. The same goes even for less ambitious states. European states like Britain, France and Germany, and Japan, have become more committed to investing in military power, but from a position of slow economic growth and massive and growing indebtedness.32
Additionally, virtually all advanced nations are today anxious about populations which are aging, and perhaps also a decline of civic militarism, and hoping to substitute money and technology for manpower in this as in other areas. Moreover, none of these states has been averse to technophilia, or technological hype—as Vladimir Putin's widely publicized remarks on the prospect of global dominance going to the nation that leads the way in artificial intelligence make all too clear.33 Rather than revolutionary weaponry overturning the global balance, we should expect continued advances in weapons technology among a circle of powers determined by the possession of more traditional foundations of power—not least, scale in population, territory, natural resources, its manufacturing base and financial strength.
Conclusions
The problem of technological hype does not admit of tidy policy solutions. Rather the most that can be hoped for is a greater alertness to the problem on the part of analysts, which would be helped by a remembrance of a handful of lessons strongly suggested by many a reading of history: that hype is virtually built into the cycle of technological adoption; that the road from laboratory demonstration to practical use, let alone really effective use, is long and uncertain; and that a single technology is too narrow a basis for durable power, economic or military. Certainly the track record for new military technologies delivering swift, cheap, decisive victory in anything resembling a conflict between peers is dismal.
None of this is to deny that technology will continue to advance, that those advances will matter, or that those who fail to keep up will suffer untoward consequences. However, the experiences of the past century and above all its most calamitous war demonstrates, the price of acting on excessive expectations of what a technology can do will be just as high.
1 To cite only one of the more obvious metrics, the world's population was only 55 percent urbanized as of 2017, well below the norm for the developed countries. Central Intelligence Agency, "World," CIA World Factbook (Washington D.C.: Central Intelligence Agency, 2018). Accessed https://www.cia.gov/library/publications/the-world-factbook/geos/xx.html.
2 The most notable exponent of this line of argument is perhaps Robert J. Gordon. See Gordon, "Does the 'New Economy' Measure Up to the Great Inventions of the Past?" Journal of Economic Perspectives 14 No. 1 (Fall 2000), 49-74; The Rise and Fall of American Growth: The U.S. Standard of Living Since the Civil War (Princeton, NJ: Princeton University Press, 2016).
3 A search of the New York Times archive in March 2018 showed that where the word "lifestyle" was virtually unused before the late 1960s, its usage increased through the twentieth century, and exploded in the twenty-first. By 2007 it was appearing in the paper a thousand times a year, and has generally stayed near that figure since.
4 For a brief overview of the relevant history, see Nicholas Rescher, Predicting the Future: An Introduction to the Theory of Forecasting (Albany, NY: State University of New York, 1998), 19-33.
5 Charles E. Gannon, Rumors of War and Infernal Machines: Technomilitary Agenda-Setting in American and British Speculative Fiction (Lanham, MD: Rowman & Littlefield, 2003).
6 The quotation is commonly attributed to author William Gibson, who used it on multiple occasions, among them a 1999 appearance on U.S. National Public Radio. See "The Science in Science Fiction," Talk of the Nation, National Public Radio, 30 Nov. 1999. Accessed at https://www.npr.org/templates/story/story.php?storyId=1067220.
7 The tendency is much-noted in the small but growing literature on "short-termism." See Angela Black and Patricia Fraser, "Stock Market Short-termism—An International Perspective," Journal of Multinational Financial Management 12.2 (April 2002), 135-158. One study found that 80 percent of executives would sacrifice R & D for the sake of the "smooth earnings" that are the object of such behavior. John R. Graham, Campbell R. Harvey, and Shivaram Rajgopal, "The Economic Implications of Corporate Financial Reporting," Journal of Accounting and Economics 40 (2005), 3–73.
8 Critics of science fiction actually have a term for such a narrative—John Clute coining the term "Edisonade." See John Clute and Peter Nicholls, The Encyclopedia of Science Fiction (London, Orbit, 1993).
9 A case in point is Kurzweil's predictions for 2009 in his book The Age of Spiritual Machines, noteworthy both for containing many specific, testable forecasts, and the wide attention they have received. Alex Knapp offered a rare, critical examination of those forecasts in 2012, and this was sufficiently novel that Kurzweil himself elected to respond personally to Knapp in his own publication. Alex Knapp, "Ray Kurzweil's Predictions for 2009 Were Mostly Inaccurate," Forbes, 20 Mar. 2012. Accessed at https://www.forbes.com/sites/alexknapp/2012/03/20/ray-kurzweils-predictions-for-2009-were-mostly-inaccurate/#789884b73f9a. Ray Kurzweil, "Ray Kurzweil Defends His 2009 Predictions," Forbes, 21 Mar. 2012. Accessed at https://www.forbes.com/sites/alexknapp/2012/03/21/ray-kurzweil-defends-his-2009-predictions/#95811a94852e.
10 For an example of the tendency on the part of one well-known science fiction author more attentive to these matters than most, see Charles Stross, "Let's Put the Future Behind Us," Antipope.com, 23 Nov. 2014. Accessed at http://www.antipope.org/charlie/blog-static/2006/10/lets_put_the_future_behind_us.html.
11 Correlli Barnett, The Verdict of Peace: Britain Between Her Yesterday and the Future (New York: Macmillan, 2001); David Edgerton, Warfare State: Britain 1920-1970 (New York: University of Cambridge Press, 2006).
12 Martin Fackler, "Japan's Chip Makers Search for a Strategy," New York Times, 2 Jan. 2006. Accessed at https://www.nytimes.com/2006/01/02/technology/japans-chip-makers-search-for-a-strategy.html. GDP and manufacturing output data from United Nations, "Per Capita GDP at Constant 2010 Prices in U.S. Dollars," National Accounts Main Aggregates Database, Dec. 2017. Accessed at https://unstats.un.org/unsd/snaama/dnlList.asp.
13 Shintaro Ishihara, The Japan That Can Say No, trans. Frank Baldwin (New York: Simon & Schuster, 1991).
14 See Fackler.
15 Ralph Peters, Fighting for the Future: Will America Triumph? (Mechanicsburg, PA: Stackpole Books, 1999), 200.
16 Peters, 201.
17 Had the U.S. economy grown at its New Economy boom (1995-2000) rate, per capita GDP circa 2016 would have been 20 percent higher. Derived from UN, "Per Capita GDP."
18 Basil Liddell Hart, Paris, or The Future of War (New York: Garland Publishing, Inc., 1927), 85.
19 Giulio Douhet, The Command of the Air, trans. Dino Ferrari (New York: Coward-McCann, 1942), 142.
20 Donald Kagan and Frederick Kagan, While America Sleeps: Self-Delusion, Military Weakness and the Threat to Peace Today (New York: St. Martin's, 2000), 48-49, 56; Edgerton, Warfare State and Britain's War Machine: Weapons, Resources and Experts in the Second World War (New York: Oxford University Press, 2011).
21 For a discussion of the limitations imposed by the German economy's weaknesses on rearmament, see Adam Tooze, The Wages of Destruction: The Making and Breaking of the Nazi Economy (New York: Penguin, 2006).
22 For a discussion of the tendency of weaponry's failure to provide lasting advantage, also see Martin van Creveld, Technology and War: From 2000 B.C. to the Present (New York: the Free Press, 1989).
23 William L. Shirer, The Collapse of the Third Republic (New York: Simon & Schuster, 1969).
24 Vladimir K. Triandafillov, The Nature of the Operations of Modern Armies, trans. William A. Burhans (Portland, OR: Frank Cass, 1994).
25 Frederick W. Kagan, Finding the Target: the Transformation of American Military Policy (New York: Encounter Books, 2006).
26 See Phillip Darby, British Defence Policy East of Suez (London: Oxford University Press for the Royal Institute of International Affairs, 1973); William P. Snyder, The Politics of British Defense Policy 1945-1962 (Columbus, OH: Ohio State University Press, 1964).
27 See Nader Elhefnawy, "Space War and Futurehype," Space Review, 22 Oct. 2007. Accessed at http://www.thespacereview.com/article/984/1. Elhefnawy, "Revisiting Island One," Space Review, 27 Oct. 2008. Accessed at http://www.thespacereview.com/article/1238/1. Elhefnawy, "Space War and Futurehype Revisited," Space Review, 14 Nov. 2011. Accessed at http://www.thespacereview.com/article/1969/1. Elhefnawy, "Why We Fall for the Hype: Contextualizing Our Thought on Space Warfare," Space Review, 26 Mar. 2012. Accessed at http://www.thespacereview.com/article/2052/1.
28 Exemplary of such hopes was the course of action taken by the U.S. and its allies against Yugoslavia in 1999 and Afghanistan in 2001; and Harlan Ullman's "shock and awe" theory.
29 Lawrence H. Summers, "Larry Summers: The Robots Are Coming, Whether Trump’s Treasury Secretary Admits It or Not," Washington Post, 27 Mar. 2017. Accessed https://www.washingtonpost.com/news/wonk/wp/2017/03/27/larry-summers-mnuchins-take-on-artificial-intelligence-is-not-defensible/?noredirect=on&utm_term=.aa2a87408a98.
30 Jeffrey Lin and P.W. Singer, "China's J-20 stealth fighter jet has officially entered service," Popular Science, 18 Feb. 2018. Accessed https://www.popsci.com/chinas-j-20-stealth-fighter-officially-enters-service. Alex Lockie, "Russia thinks its new advanced fighter jet in Syria will scare off other countries — but nobody's afraid of it," Business Insider, 27 Feb. 2018. Accessed http://www.businessinsider.com/russia-su-57-syria-deter-scaring-nobody-2018-2.
31 Circa 1980 the Soviet Union had, according to one estimate, 15 percent of the world's manufacturing output. Today Russia has perhaps 1 percent, about the same as Indonesia or the Netherlands. For the 1980 figure see Bairoch, Paul, "International Industrialization Levels from 1750 to 1980," Journal of European Economic History 11 (Fall 1982), Table 9. For the more recent one, see Chris Rhodes, "Manufacturing: International Comparisons," House of Commons Library Briefing Paper, No. 05809, 5 Jan. 2018. Accessed https://researchbriefings.parliament.uk/ResearchBriefing/Summary/SN05809#fullreport.
32 Department of Finance, Canada, Fiscal Reference Tables, Sep. 2017, Table 54. Accessed at https://www.fin.gc.ca/frt-trf/2017/frt-trf-17-eng.asp.
33 David Meyer, "Vladimir Putin Says Whoever Leads in Artificial Intelligence Will Rule the World," Fortune, 4 Sep. 2017. Accessed at http://fortune.com/2017/09/04/ai-artificial-intelligence-putin-rule-world/.
Tweet
The modern world has seen profound technological change, but it has also seen equally profound technological hype—extravagant, intensive publicity and promotion not only of new technologies, but also of technologies merely thought to be near realization, which has tended to go hand in hand with the exaggeration of their maturity, their capabilities and their implications for actual life. Relative to how large—and often, how damaging—a role such hype has played in recent history, and how big a factor it remains in contemporary thinking, the subject has been little studied. Its implications for the military sphere are no exception to this rule.
This article is intended as an examination of that subject, beginning with a consideration of the factors that make today's observers susceptible to technological hype. It will then consider the implications of such hype for thinking about the distribution of power among nations—both the ways in which it affects economic and geopolitical thinking generally, and the ways in which it affects thinking about military technology more narrowly. From there the discussion proceeds to a consideration of the present—the past few years, and very likely, a good many of the years ahead—as a period of particular vulnerability to such hype.
Understanding Technological Hype
To better appreciate the actual pervasiveness and weight of technological hype, it is helpful to consider four factors from which it derives its strength: our prolonged experience of technological upheaval; the "rumors of the future" we get from futurology, science and technology reporting, and to a greater degree than is widely appreciated, science fiction; popular understandings of technological research and development; and the culture of commerce.
Prolonged Experience of Technological Upheaval
It would be impossible to exaggerate the significance of the shift from a predominantly agrarian, rural, pre-industrial civilization to an urbanized, industrial civilization founded on the expansion of human capability through machinery powered by inanimate power sources (coal-fired steam engines, internal combustion engines, electricity). That change, which in much of the world is still ongoing, and even in the most advanced countries still less than perfectly complete, has been recent enough and dramatic enough to create a general expectation of such change as a constant in modern life.1
This experience may well have led to the overrating of the arguably more modest changes seen since that time, such as the revolution in digital computing and communications, which is often accorded a significance comparable to that of the Industrial Revolution itself, exaggerating the impression of the rate and depth of recent technological change.2 Such exaggeration may be reinforced by a weak historical sense on the part of the public (especially among younger people), and perhaps by the popular obsessions with entertainment, social media, consumerism and "lifestyle" generally.3 However one explains it, the perception of continuous change is there, and it interacts with other factors, not least the fact that still more such change is continually promised to the public in the media.
"Rumors of the Future"
The preoccupation with the ways in which the future will be different from the present, and still more the investment of rigorous effort in working out those differences, were in the late nineteenth century still novelties. Works like Stanley Jevons' The Coal Question, Ivan Bloch's The Future of War and H.G. Wells' Anticipations, as well as the stories of Jules Verne and other early proponents of science fiction, all owe their place in cultural and intellectual history partly to that novelty. However, during the twentieth century professional futurology became a massive industry, and while its prestige may have fallen somewhat since its peak in the post-World War II period, its presence is perhaps more strongly felt than ever before.4 In the discussion of topics ranging from nuclear proliferation to climate change, from the rate of economic growth to the space program, even the casual consumer of news comes into contact with more or less sophisticated extrapolations not only about what has happened or is happening, but about what will happen years or decades hence, and the implications of those possibilities for the present—so frequently that they scarcely even notice the fact. With news outlets routinely devoting full sections of their papers, magazines and web sites to "Science" and "Technology" news, discussions of technologies still in development, perhaps only very early development, are often the explicit focus of a substantial portion of news coverage itself.
At the same time, the imagery of imagined technologies has become virtually pervasive across popular culture. In recent years any list of the highest-grossing movies at the box office tends to consist primarily of action-adventure films packed with futuristic vehicles and other technologies, like the Star Wars and Marvel Cinematic Universe films. Charles Gannon has usefully characterized such imagery as "rumors of the future," habituating us to the thought of technologies that do not yet exist.5 Indeed, that such technologies actually do exist, if only somewhere, in secret, is a staple of science fiction, which routinely presents intelligence agencies and other such organizations as secretly carrying on ultra-advanced programs to explore space and the like in the present day. (This is the view that the "future is already here—it's just not very evenly distributed."6)
Popular Conceptions of Science and Technology
Even as the products of science, hypothetical as well as real, have become ubiquitous in cultural life, the actual process that is science remains poorly understood by non-scientists on the whole. The reality is that engineering work of the kind relevant to this discussion is often a collaborative, cumulative, slow-moving activity, which may be straining against the limits not only of what has been done, but of what is possible with the existing state of the art. It also tends to occur within a business, economic and political context, where imperatives besides mere feasibility are operative, and people who are not scientists or engineers—who may in fact be ill-educated in these matters—are often in control. (The corporation that cuts its R & D budget to make this quarter's earnings report look better is not unknown.7) And even in the best of circumstances technological research and development can be subject to fits and starts, dead ends and flashes in the pan, often resulting in a device that exists but is virtually unworkable in any real-life conditions, or perhaps simply stillborn at the concept stage.
However, the popular image of such work envisions it as highly individualistic and speedy, with revolutionary breakthroughs cheap and easy. This is, again, partly a matter of fiction favoring the simple, heroic, dramatic story over a more complex reality.8 Yet fiction is not the sole source of this thinking, which is greatly encouraged by the manner in which journalism has tended to portray real-life developments, epitomized by the conception of the computing revolution. The history of mechanical computing goes back centuries, with key concepts dating very far back indeed (Blaise Pascal, Gottfried Leibniz and Charles Babbage laying down essential groundwork), and electronic computers to the 1940s, after which they were the object of massive government R & D efforts that led to still more key innovations (ENIAC, SAGE, NLS, ARPANET). Nonetheless, there is a tendency to imagine it all as having emerged out of garages in California without help from anyone else. This view of technological development as a quick, simple matter makes it all the easier to imagine that a technology only "in development," which on closer examination may be a very uncertain or distant prospect, is almost ready for commercial employment.
The Culture of Commerce
Interacting with and reinforcing the prolonged experience of technological upheaval, the "rumors of the future," and the prevalent, oversimplified view of how technologies are realized in practice, is what one may term the "culture of commerce." It is, of course, the business of advertisers to make the wares on sale appear as attractive and compelling as possible, and one obvious and commonly taken course is to present those wares as being as new and revolutionary as possible—as a practical matter, more new and revolutionary than they really are. Still, this indisputably distorts understanding of the matter. It does not help that journalism is increasingly difficult to distinguish from advertising, a tendency epitomized by those media outlets that carry what is euphemistically referred to as "paid" or "sponsored" content. However, the mentality pervades even more conventional journalism, dependent as it is on capturing audiences with exciting headlines. Especially when this imperative is combined with a weak grasp of the process by which concepts actually become viable technologies, it easily creates an exaggerated picture of a technology's feasibility. Indeed, the fostering of expectations inflated far beyond what will ever be met is virtually built into the process by which technologies enter into use. The "Gartner Hype Cycle," based on studies of technologies going back to railroads, posits a sequence: a technological breakthrough draws attention before usable, commercial products appear; expectations inflate; a period of disillusionment follows; a second, rising trend of genuine usage takes hold; and the technology finally reaches a "plateau" of productivity, in which it makes a genuine contribution, if a less spectacular one than the optimists imagined during the earlier period of inflated expectations.
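For readers who find a schematic helpful, the sequence just described can be summarized in a short, purely illustrative sketch in Python. The phase labels follow the commonly cited names for the cycle, while the one-line glosses simply paraphrase the characterization above; this is an illustration of the sequence only, not Gartner's methodology or data, and the example technology is named only because the studies in question are said to reach back that far.

# A minimal, illustrative sketch of the hype-cycle sequence described above.
# The glosses paraphrase the characterization in the text, not Gartner's own material.
HYPE_CYCLE_PHASES = [
    ("technology trigger", "a breakthrough draws attention before usable, commercial products exist"),
    ("peak of inflated expectations", "publicity far outruns demonstrated capability"),
    ("trough of disillusionment", "early products disappoint and interest collapses"),
    ("slope of enlightenment", "a second, quieter rise in genuine usage"),
    ("plateau of productivity", "a real but less spectacular contribution than first imagined"),
]

def describe_cycle(technology: str) -> None:
    """Print the stylized sequence a hyped technology is said to pass through."""
    for step, (phase, gloss) in enumerate(HYPE_CYCLE_PHASES, start=1):
        print(f"{technology}, stage {step}: {phase} -- {gloss}")

describe_cycle("railroads")  # an early case; the retrospective studies cited reach back this far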
The cases of single technologies apart, the pattern of such reportage has a profound shaping influence on perceptions. Any viewer of the news is subject to constant, specific claims of explosive change in one area or another. The individual claims (which tend not to stand up to scrutiny) are forgotten, but the overall impression of that rate of change remains. Moreover, when the old claims which produced that impression are critically examined, the evaluation is often generous—overly generous, the bar for "prophecy" set low, an obvious example being the assessment of Raymond Kurzweil's forecasts.9
When despite this a prediction, or a mass of them, proves to have been clearly wrong, a typical response is to shrug off the exaggerations, and to accuse those taking the more critical view of "missing the forest for the trees." Yes, they say, the futurists were wrong that time (and that time, and perhaps that time too), but they "know" that the overall reading of the trend as one of explosive change is correct. (The ruthless mockery to which those bullish about the pace of change subject those who dare to ask "Where are the flying cars?" is exemplary of this.10)
Techno-Hype and the Balance of Power
In private economic life, technological hype often leads to the misallocation of economic resources—to questionable purchases of goods and assets that can in the most extreme cases produce speculative bubbles that disrupt whole economies, from the wonders claimed for the South Sea Company in its heyday to the information technology bubble that burst in 2000. However, hype about such technologies has implications well beyond the purely civilian sphere. It has often had implications for perceptions of the prosperity of particular nations, and the power they derive from it, including the military power that in the modern world is founded on economic and industrial power. There has in fact been a recurring tendency to take an exaggerated view of the implications of an extraordinary lead in one or a few technologies for the wealth and power of a particular country closely identified with that technology, imagining that a disproportionate share of one market can readily translate into wildly disproportionate prosperity or power. This has frequently been reinforced by a tendency to claim that the country in question possesses some unique comparative advantage that will make its lead in that technology enduring.
An obvious case is Britain in the post-World War II period. Despite its post-war economic situation and decolonization, expectations existed that British accomplishments in such cutting-edge fields as computers, aerospace and nuclear energy would sustain its economic and strategic position in the post-war world. Reality fell far short of such hopes, in a history which saw such disappointments as the de Havilland Comet, the "Magnox" gas-cooled nuclear reactor and the Concorde, and which showed Britain's advantages to be less robust than imagined, and the exploitation of those advantages more complex and uncertain.11
The tendency has grown more rather than less pronounced over time, with Japan's extraordinary dominance in microchips during the late 1980s another, even more germane example. Despite accounting for less than 3 percent of the world's population, 12 percent of world GDP and 16 percent of manufacturing, the country produced over half the world's chips at the time.12 In this context Japanese politician Shintaro Ishihara created a sensation with the claim in his book The Japan That Can Say No that Japan's strength in microchips put the Cold War balance of power in its hands, and would be a pillar of its enduring stature in the post-Cold War era.13 (This was to see it draw the former Soviet sphere into its own orbit, and manage the world's affairs jointly with the United States as the "Group of Two.")
While Ishihara's predictions have on the whole fared very poorly, it is notable that the country's extraordinary lead in microchip production proved fragile and brief, in part because it rested on quite different, more complex and less enduring foundations than he imagined. Japan's position as a microchip producer was based on its position as an exporter of consumer electronics more generally—the makers of such goods buying their chips from domestic producers, underlining the reality that its exceptional market share reflected a broader industrial supremacy, though even that proved fleeting. When Japanese consumer electronics lost their market dominance, the share they represented of the chip market declined. Moreover, the highly touted virtues of Japanese management and workers were nowhere near enough to compensate for a badly flawed restructuring of the industry amid a more competitive environment in the 1990s.14
While its power has been broader-based and more enduring, the United States, too, has been susceptible to such hype, similarly based on misconceptions about computer technology, and perceptions of the United States as more fitted than others to develop and use it. A pointed example is Ralph Peters' vision of the U.S. enjoying complete, continuous, global battlespace dominance based on a space-based military system so extensive and powerful its
attack systems deployed in near space . . . can, on order, destroy the vehicles, aircraft, missiles, and ships of any aggressor anywhere on the planet the moment a hostile actor violates or even threatens the territory of another state or entity—or uses military means to disrupt the internal rule of law in his own state.15

The means of paying for this system were to be the "revolution of technological and informational wealth creation," if the marketplace, and a crackdown on corruption, succeeded in taming the excesses of defense contractors.16 Such explosive expectations of "New Economy" technological and economic growth, and at least as much, the idea that they would so thoroughly conduce to the power of the United States relative to other nations, have long since become passé.17
Ultimately, each case has been a reminder of the reality that a single technology is too narrow a base for any nation to retain a preponderant share of the world's wealth and power, and that in any event technological capabilities tend to diffuse. When the technology in question is still evolving and its market still growing, and when it is a consumer's rather than a producer's technology, that diffusion is especially prone to be rapid and deep. The result is that in the longer run a nation's weight tends toward what is suggested by that bigger complex of factors, from the extent and natural-resource wealth of its territory, to the size and education of its population, to the commitment of its government to such policies as make for industrial success or failure, rather than by narrow and fleeting technological advantages.
Techno-Hype in the Military Sphere
Each of the factors operative in the civilian sphere is also operative in discussions of technologies designed specifically for military use. However, discussions of military technology have their own distinct tendencies, including an exaggerated idea of what a particular technology could do for the party that developed it. Generally, the idea of a technology's capability has far outrun its actuality because of the perceived desirability of that capability. Since World War I in particular there has been a preoccupation with the possibility of new technologies providing a means to quick, cheap, decisive military victory in a major interstate contest. This is exemplified by the '20s and '30s-era futurology regarding "mechanization," specifically the ability of small armored and air forces to substitute for large armies and total war efforts, with Basil Liddell Hart boldly claiming that planes and tanks would end the era of "'absolute war' and . . . its fungus growth—'the nation in arms,'" by enabling swift knockout blows by one power against another.18 Giulio Douhet was still more extravagant, claiming that a fleet of even twenty operational planes put a nation's "aero-chemical arm" in a position "to break up the whole social structure of the enemy in less than a week, no matter what his army and navy may do."19
While states did not go as far in the development of these technologies as some critics have suggested, expectations of the revolutionary nature of mechanization did shape national policies. In the interwar period Britain's leaders invested in the largest air force and most mechanized army of the day.20 Likewise, an expansion-minded Nazi leadership relied on these technologies as the only way in which its program of European conquests could be plausible, reflected in the emphasis on the Luftwaffe, and in the development of an armored corps proficient in operating in the manner prescribed by the new theorists.21
When the war actually came, armor and air power did indeed revolutionize the battlefield, but they fell far short of that ideal of quick, cheap, decisive victory. The simple reality is that technologies diffuse quickly in the modern world, and with similar speed spur the development of countermeasures that neutralize them.22 To be sure, Germany, helped by an implausible combination of a high-risk strategy with a profoundly mismanaged French defense, won a quick, cheap victory in the 1940 Battle of France.23 However, already at that date other armies were taking note, learning to use their mechanized forces in the same offensive manner, or to apply similar techniques to the problem of defense. The result was that Germany did not see another comparable victory for the rest of the war, with the failure of German operations to achieve the desired speedy victory in the early phase of Operation Barbarossa dooming it to the long war that virtually guaranteed its defeat.
Likewise, even with air forces vastly larger, better-equipped and deployed in far more sophisticated fashion than anything Liddell Hart or Douhet discussed, strategic bombing was an attritional affair, practiced not just massively but over the very long term, with results that were ultimately ambiguous—in part because of the advent of radar, and its use to direct similarly larger and more capable forces of fighter aircraft. As a result Britain withstood prolonged German bombing in 1940-1941, and Germany itself withstood prolonged bombing which, whatever else can be claimed for it, did not secure the promised speedy collapse of German willingness or ability to prosecute the war.
Ultimately it was not Liddell Hart or Douhet who was proven right, but rather the less well-known and less heeded Vladimir Triandafillov, who much more astutely realized that the next war would be both a high-tech affair and a mass affair, continuing even longer and taking a greater toll in blood and treasure than the First World War against which they reacted so strongly.24 Indeed, new technologies did not so much supplant earlier methods of war-fighting as increase the stock of weapons, systems and capabilities that a major power needed to fight its war. Armored fighting vehicles did surprisingly little to reduce the need for infantry able to fight dismounted, with the advent of man-portable anti-tank weapons actually necessitating their escort by protective infantry. At the same time air forces did not eliminate the need for armies or navies—the Luftwaffe proved no substitute for a German navy able to land a large invasion force on British soil, while, when the tide turned, the Combined Bomber Offensive did not spare the Allies the need to invade Europe and ultimately take Berlin.
As it went with mechanization, so it went with virtually every significant later development, because either the reality of the capability fell short of the promise, or the diffusion of the technology, or the development of countermeasures to it, complicated the matter. The extent to which nuclear weaponry could substitute for other kinds of military power proved greatly exaggerated, as witnessed by the reform of the U.S. armed forces in the 1950s, and then the backing away from those changes in the period afterward, as the Soviet Union narrowed the nuclear gap and the rigidities of a policy based on Massive Retaliation became increasingly apparent—as did its irrelevance to the situations of insurgency that seemed to be of rising importance.25 Strategic arsenals did not eliminate the need for conventional forces of all the known types (land, air, sea), with their increasingly sophisticated and expensive equipment, while the problem of insurgency encouraged the advent of special operations forces for dealing with such conflicts as well. For cash-strapped post-war Britain, air mobility, "commando carriers" and tactical nuclear weapons also appeared a cheap way of sustaining a worldwide military presence, an illusion repeatedly shown up until the pretension was abandoned in the withdrawal from "east of Suez."26
Time and again, space-based weaponry has been an object of such hype, particularly after the expectations of extremely cheap, reliable, regular space access created by the space shuttle in the 1970s. While those expectations proved false, the Strategic Defense Initiative renewed the hype in this area in the 1980s, and it endured even after the end of the Cold War (for which many erroneously accord the program credit), resurging yet again around the turn of the century (the technology itself, as well as buoyant expectations of an enduring New Economy boom, being key to claims like Peters')—the command of the highest ground of all, again, making major war seem winnable.27 The exaltation of the technologies of the "Revolution in Military Affairs," notably its combination of digital computing and communications with increasingly advanced electronic sensors and precision-guided munitions, has in the view of many analysts led to exaggerated ideas of the capacity of policymakers to substitute them for more expensive and politically problematic commitments of ground forces.28
Hype Resurgent?
There seem to be two particular reasons for special concern at the present moment. The first is that such hype tends to go through cycles of boom and bust, with a recent bust seemingly being followed by a new boom at the time of this writing. The second is the stresses of the international environment as it is today, which may leave anxious policymakers more than usually eager to grasp at illusory solutions to their problems.
Boom and Bust and Boom Again?
The late 1990s were a period of particular boom for technological hype, as personal computing, cellular telephony and Internet access became staples of everyday life for the advanced industrial nations' middle classes, and increasingly for other groups as well. The period was also marked by explosive expectations in areas like artificial intelligence, robotics, virtual reality, nanotechnology and genetic engineering, reinforced by the experience of relatively rapid economic growth in the 1995-2000 period. However, the "tech bubble" burst. The advances in computing and cell phones became increasingly incremental. At the same time other technologies proved less immediately feasible than anticipated, with advances in neural networks proving sluggish, and advances in nano-machinery more elusive than their proponents suggested. Additionally, the spiking of commodity and especially energy prices evoked Malthusian catastrophe rather than Wellsian super-technology, and the experience of the "Great Recession" dampened expectations yet again. Rather than visions of explosive technological change, it was prophecies of ecological doom that resurged.
Since then, progress in a number of key technological areas, particularly artificial intelligence, but also materials science and renewable energy, has revived expectations of deep, wide, near-term changes, particularly the automation of a great deal of economic activity, with the self-driving car perhaps the most emblematic at the moment. Such developments are already having significant cultural and intellectual effects, evident in the amount of attention given to discussing their implications, as seen in the explosion of writing in the press regarding such subjects as appropriate governmental responses to the development of robots years or decades more advanced than anything now in existence. The prospect of mass unemployment as a result of the new technologies, for example, is being discussed as if it were already a matter of immediate urgency in pieces by the most orthodox of thinkers running in the most mainstream of publications.29 (Indeed, it can seem that future unemployment resulting from automation is treated more earnestly than the current, actual problem of unemployment or underemployment.) One might well argue that such writers are getting ahead of themselves.
An "Easy" Out?
The second factor to be considered here is that such hype may be especially seductive in a period of constrained resources. This is all the more likely to be the case when a power which has been predominant, and particularly given to reliance on high technology, finds its position challenged—as the case of Britain dealing with its own decline in the early and mid-twentieth century strongly suggests.
Today the U.S. is similarly facing a changing distribution of wealth and power. In 1992 the U.S. economy was six times the size of those of Russia and China combined. Today China's economy alone is larger than the U.S.'s when measured in Purchasing Power Parity terms. All of this is increasingly reflected in the hard reality that military capabilities and advantages on which the U.S. has long held a monopoly are increasingly within the reach of other states. At the time of this writing, not only has China put its first squadron of fifth-generation (stealth) fighter aircraft into service, but Russia has deployed such fighters to Syria operationally.30 None of this means that the U.S. has ceased to be the leading military power, or will cease to be that any time soon. It does not even mean that the U.S. has ceased to be the pace-setter for other military powers. However, the combination of quantitative and qualitative superiority it enjoyed in the 1990s is a thing of the past, and unlikely to return. Especially given the extent to which the United States has relied on cutting-edge technology as a basis for military strength, not only in the recent past but historically, and the continued tendency to think of innovation, especially in the area of computers, as somehow uniquely American, the temptation to imagine some leap in weapons technology recovering the old edge may be great.
However, it should also be remembered that the U.S. is by no means the only actor susceptible to such illusions, with virtually every military power of note today in a comparable condition. Russia endeavors to be a world military power in a manner comparable with its Soviet-era stature, but on the basis of only a small fraction of that era's power base.31 China is by some measures the largest economy in the world today, but it is seeing its growth slow markedly while its per capita income remains 15-30 percent of that of the U.S. The same goes even for less ambitious states. European states like Britain, France and Germany, as well as Japan, have become more committed to investing in military power, but from a position of slow economic growth and massive and growing indebtedness.32
Additionally, virtually all advanced nations are today anxious about aging populations, and perhaps also about a decline of civic militarism, and hope to substitute money and technology for manpower in this as in other areas. Moreover, none of these states has been averse to technophilia, or to technological hype—as Vladimir Putin's widely publicized remarks on the prospect of global dominance going to the nation that leads the way in artificial intelligence make all too clear.33 Rather than revolutionary weaponry overturning the global balance, we should expect continued advances in weapons technology among a circle of powers determined by the possession of more traditional foundations of power—not least scale in population, territory, natural resources, manufacturing base and financial strength.
Conclusions
The problem of technological hype does not admit of tidy policy solutions. Rather the most that can be hoped for is a greater alertness to the problem on the part of analysts, which would be helped by a remembrance of a handful of lessons strongly suggested by many a reading of history: that hype is virtually built into the cycle of technological adoption; that the road from laboratory demonstration to practical use, let alone really effective use, is long and uncertain; and that a single technology is too narrow a basis for durable power, economic or military. Certainly the track record for new military technologies delivering swift, cheap, decisive victory in anything resembling a conflict between peers is dismal.
None of this is to deny that technology will continue to advance, that those advances will matter, or that those who fail to keep up will suffer untoward consequences. However, as the experiences of the past century, and above all its most calamitous war, demonstrate, the price of acting on excessive expectations of what a technology can do can be just as high.
1 To cite only one of the more obvious metrics, the world's population was only 55 percent urbanized as of 2017, well below the norm for the developed countries. Central Intelligence Agency, "World," CIA World Factbook (Washington D.C.: Central Intelligence Agency, 2018). Accessed at https://www.cia.gov/library/publications/the-world-factbook/geos/xx.html.
2 The most notable exponent of this line of argument is perhaps Robert J. Gordon. See Gordon, "Does the 'New Economy' Measure Up to the Great Inventions of the Past?" Journal of Economic Perspectives 14 No. 1 (Fall 2000), 49-74; The Rise and Fall of American Growth: The U.S. Standard of Living Since the Civil War (Princeton, NJ: Princeton University Press, 2016).
3 A search of the New York Times archive in March 2018 showed that where the word "lifestyle" was virtually unused before the late 1960s, its usage increased through the twentieth century, and exploded in the twenty-first. By 2007 it was appearing in the paper a thousand times a year, and has generally stayed near that figure since.
4 For a brief overview of the relevant history, see Nicholas Rescher, Predicting the Future: An Introduction to the Theory of Forecasting (Albany, NY: State University of New York, 1998), 19-33.
5 Charles E. Gannon, Rumors of War and Infernal Machines: Technomilitary Agenda-Setting in American and British Speculative Fiction (Lanham, MD: Rowman & Littlefield, 2003).
6 The quotation is commonly attributed to author William Gibson, who used it on multiple occasions, among them a 1999 appearance on U.S. National Public Radio. See "The Science in Science Fiction," Talk of the Nation, National Public Radio, 30 Nov. 1999. Accessed at https://www.npr.org/templates/story/story.php?storyId=1067220.
7 The tendency is much-noted in the small but growing literature on "short-termism." See Angela Black and Patricia Fraser, "Stock Market Short-termism—An International Perspective," Journal of Multinational Financial Management 12.2 (April 2002), 135-158. One study found that 80 percent of executives would sacrifice R & D for the sake of the "smooth earnings" that are the object of such behavior. John R. Graham, Campbell R. Harvey, and Shivaram Rajgopal, "The Economic Implications of Corporate Financial Reporting," Journal of Accounting and Economics 40 (2005), 3–73.
8 Critics of science fiction actually have a term for such a narrative—John Clute coining the term "Edisonade." See John Clute and Peter Nicholls, The Encyclopedia of Science Fiction (London: Orbit, 1993).
9 A case in point is Kurzweil's predictions for 2009 in his book The Age of Spiritual Machines, noteworthy both for containing many specific, testable forecasts, and the wide attention they have received. Alex Knapp offered a rare, critical examination of those forecasts in 2012, and this was sufficiently novel that Kurzweil himself elected to respond personally to Knapp in his own publication. Alex Knapp, "Ray Kurzweil's Predictions for 2009 Were Mostly Inaccurate," Forbes, 20 Mar. 2012. Accessed at https://www.forbes.com/sites/alexknapp/2012/03/20/ray-kurzweils-predictions-for-2009-were-mostly-inaccurate/#789884b73f9a. Ray Kurzweil, "Ray Kurzweil Defends His 2009 Predictions," Forbes, 21 Mar. 2012. Accessed at https://www.forbes.com/sites/alexknapp/2012/03/21/ray-kurzweil-defends-his-2009-predictions/#95811a94852e.
10 For an example of the tendency on the part of one well-known science fiction author more attentive to these matters than most, see Charles Stross, "Let's Put the Future Behind Us," Antipope.com, 23 Nov. 2014. Accessed at http://www.antipope.org/charlie/blog-static/2006/10/lets_put_the_future_behind_us.html.
11 Correlli Barnett, The Verdict of Peace: Britain Between Her Yesterday and the Future (New York: Macmillan, 2001); David Edgerton, Warfare State: Britain 1920-1970 (New York: Cambridge University Press, 2006).
12 Martin Fackler, "Japan's Chip Makers Search for a Strategy," New York Times, 2 Jan. 2006. Accessed at https://www.nytimes.com/2006/01/02/technology/japans-chip-makers-search-for-a-strategy.html. GDP and manufacturing output data from United Nations, "Per Capita GDP at Constant 2010 Prices in U.S. Dollars," National Accounts Main Aggregates Database, Dec. 2017. Accessed at https://unstats.un.org/unsd/snaama/dnlList.asp.
13 Shintaro Ishihara, The Japan That Can Say No, trans. Frank Baldwin (New York: Simon & Schuster, 1991).
14 See Fackler.
15 Ralph Peters, Fighting for the Future: Will America Triumph? (Mechanicsburg, PA: Stackpole Books, 1999), 200.
16 Peters, 201.
17 Had the U.S. economy grown at its New Economy boom (1995-2000) rate, per capita GDP circa 2016 would have been 20 percent higher. Derived from UN, "Per Capita GDP."
18 Basil Liddell Hart, Paris, or The Future of War (New York: Garland Publishing, Inc., 1927), 85.
19 Giulio Douhet, The Command of the Air, trans. Dino Ferrari (New York: Coward-McCann, 1942), 142.
20 Donald Kagan and Frederick Kagan, While America Sleeps: Self-Delusion, Military Weakness and the Threat to Peace Today (New York: St. Martin's, 2000), 48-49, 56; Edgerton, Warfare State and Britain's War Machine: Weapons, Resources and Experts in the Second World War (New York: Oxford University Press, 2011).
21 For a discussion of the limitations imposed by the German economy's weaknesses on rearmament, see Adam Tooze, The Wages of Destruction: The Making and Breaking of the Nazi Economy (New York: Penguin, 2006).
22 For a discussion of the tendency of weaponry's failure to provide lasting advantage, also see Martin van Creveld, Technology and War: From 2000 B.C. to the Present (New York: the Free Press, 1989).
23 William L. Shirer, The Collapse of the Third Republic (New York: Simon & Schuster, 1969).
24 Vladimir K. Triandafillov, The Nature of the Operations of Modern Armies, trans. William A. Burhans (Portland, OR: Frank Cass, 1994).
25 Frederick W. Kagan, Finding the Target: the Transformation of American Military Policy (New York: Encounter Books, 2006).
26 See Phillip Darby, British Defence Policy East of Suez (London: Oxford University Press for the Royal Institute of International Affairs, 1973); William P. Snyder, The Politics of British Defense Policy 1945-1962 (Columbus, OH: Ohio State University Press, 1964).
27 See Nader Elhefnawy, "Space War and Futurehype," Space Review, 22 Oct. 2007. Accessed at http://www.thespacereview.com/article/984/1. Elhefnawy, "Revisiting Island One," Space Review, 27 Oct. 2008. Accessed at http://www.thespacereview.com/article/1238/1. Elhefnawy, "Space War and Futurehype Revisited," Space Review, 14 Nov. 2011. Accessed at http://www.thespacereview.com/article/1969/1. Elhefnawy, "Why We Fall for the Hype: Contextualizing Our Thought on Space Warfare," Space Review, 26 Mar. 2012. Accessed at http://www.thespacereview.com/article/2052/1.
28 Exemplary of such hopes was the course of action taken by the U.S. and its allies against Yugoslavia in 1999 and Afghanistan in 2001; and Harlan Ullman's "shock and awe" theory.
29 Lawrence H. Summers, "Larry Summers: The Robots Are Coming, Whether Trump’s Treasury Secretary Admits It or Not," Washington Post, 27 Mar. 2017. Accessed at https://www.washingtonpost.com/news/wonk/wp/2017/03/27/larry-summers-mnuchins-take-on-artificial-intelligence-is-not-defensible/?noredirect=on&utm_term=.aa2a87408a98.
30 Jeffrey Lin and P.W. Singer, "China's J-20 Stealth Fighter Jet Has Officially Entered Service," Popular Science, 18 Feb. 2018. Accessed at https://www.popsci.com/chinas-j-20-stealth-fighter-officially-enters-service. Alex Lockie, "Russia Thinks Its New Advanced Fighter Jet in Syria Will Scare Off Other Countries — But Nobody's Afraid of It," Business Insider, 27 Feb. 2018. Accessed at http://www.businessinsider.com/russia-su-57-syria-deter-scaring-nobody-2018-2.
31 Circa 1980 the Soviet Union had, according to one estimate, 15 percent of the world's manufacturing output. Today Russia has perhaps 1 percent, about the same as Indonesia or the Netherlands. For the 1980 figure see Paul Bairoch, "International Industrialization Levels from 1750 to 1980," Journal of European Economic History 11 (Fall 1982), Table 9. For the more recent one, see Chris Rhodes, "Manufacturing: International Comparisons," House of Commons Library Briefing Paper, No. 05809, 5 Jan. 2018. Accessed at https://researchbriefings.parliament.uk/ResearchBriefing/Summary/SN05809#fullreport.
32 Department of Finance, Canada, Fiscal Reference Tables, Sep. 2017, Table 54. Accessed at https://www.fin.gc.ca/frt-trf/2017/frt-trf-17-eng.asp.
33 David Meyer, "Vladimir Putin Says Whoever Leads in Artificial Intelligence Will Rule the World," Fortune, 4 Sep. 2017. Accessed at http://fortune.com/2017/09/04/ai-artificial-intelligence-putin-rule-world/.