Wednesday, August 19, 2020

Yes, Tony Blair Was a Neoliberal

Recently surveying Tony Blair's record as party leader and prime minister I saw that the pretense of Blair not being a neoliberal is just as risible as Bill Clinton's not being a neoliberal, given his not only acquiescing in the profound changes wrought in English economic and social life by his predecessors (privatization, union-breaking, financialization, etc.), but his particular brand of budgetary austerity with its tax breaks and deregulation for corporations and stringency with and hardness toward the poor, his backdoor privatization of basic services such as health and education (with college tuition running up from zero into the thousands of pounds a year on his watch), his hostility to government regulation of business, his inane New Economy vision of Cool Britannia (groan), and the rest. (Indeed, examining his record, and reexamining that of his predecessors, I was staggered by how much of it I had seen before reviewing the comparable history in the United States.)

That said, even considering the ways in which Blair offended and disappointed many on the left, his tenure can seem halcyon in comparison with what has been seen since: the economic disasters and brutal austerity that followed his departure from office, which really do seem to spell the final doom of the post-war welfare state--the shift to an American-style regime with regard to higher education, a slower but still advancing shift in the same direction for the country's health care system, the raising of the regressive Value Added Tax yet again, to 20 percent, the renewed assault on the social safety net by yet another Welfare Reform Act (2012) that delivered Universal Credit and the "bedroom tax," the hundreds of thousands of "excess deaths" in recent years traceable to cuts in care facilities, the plans to raise the retirement age (perhaps all the way to 75, effectively abolishing retirement for most)--and all that before the current public health/economic crisis.

I admit that next to that Blair's tenure does not look quite so bad--until one remembers the extent to which his policies did so much to pave the way for all of it, in carrying forward what apologists for New Labour tend to think of as Conservative projects, and his general lowering of the bar for what constitutes tolerable government. That led to this. All of which lends the question "Was Tony Blair's Prime Ministership neoliberal?" an additional, very contemporary, significance, the more so with the Labour Party, for the time being, still walking the Blairite road.

Tuesday, August 18, 2020

Was Tony Blair a Neoliberal?

In recent years figures like Jonathan Chait have made it fashionable to deny the existence or salience of neoliberalism as a concept--and this has especially been the case in regard to the term's use as a descriptor for the (nominally) left of center parties of the United States and Britain.

My personal experience of discussing the matter with those who espouse this view showed differences between the cases made in the two countries. Those I encountered on social media who denied that Bill Clinton was a neoliberal were never equipped with any facts, only a bullying abusiveness that gave the impression they were professional trolls intent on silencing anyone who publicly espoused such an opinion. That only underlined how little they had to say on behalf of a position that even slight familiarity with Clinton's actual policy record makes appear risible--a line of thought that soon had me finding a scarcity of comprehensive, systematic and thoroughly grounded assessments of that record that would make this clear.

The thought of, if only in a small way, redressing that deficiency led to my paper, "Was the Clinton Administration Neoliberal?" and after that a book examining the U.S. policy record from the 1970s on in more comprehensive fashion (The Neoliberal Age in America: From Carter to Trump), both of which endeavor to offer an explicit, testable definition of neoliberalism, and then systematically consider the record of the administrations in question against it.

Those who contested Blair's labeling as neoliberal, however, assumed a different tone--in part, I suppose, because they did have something to say for themselves. They would point in particular to his establishment of a minimum wage and other rights for British workers that, certainly by American standards, appear very generous; and his funding of social services, which, again by American standards, also appeared very generous at the time. It did, at least, compel me to think about what they said, the more so as I was less familiar with the finer points of Blair's policy record than I was with Clinton's, or for that matter, Margaret Thatcher's, or Harold Wilson's, or Clement Attlee's.

In that I do not think I was alone. My impression is that Blair's domestic record has been overshadowed to a considerable degree by his foreign policy record--above all his supporting the U.S. invasion of Iraq in 2003 and bringing Britain's forces into the invasion along with it even as longtime NATO allies France and Germany (to say nothing of other powers like Russia and China) forcefully and publicly opposed the move. Moreover, critical examination of Blair's ministership would seem to have been inhibited by, on top of the generally lousy job with these things done by public intellectuals these days, the extreme resistance of the neoliberals in the Labour Party, whose hostility to any change of course was made all too plain in the pathetic lows to which they descended in their campaign against Jeremy Corbyn.

Still, examine Blair's record I did. And in doing so I saw that the pretense of Blair not being a neoliberal is just as risible as Clinton's not being a neoliberal, given his not only acquiescing in the profound changes wrought in English economic and social life by his predecessors (privatization, union-breaking, financialization, etc.), but his particular brand of budgetary austerity with its tax breaks for corporations and stringency for the poor, his backdoor privatization of basic services, his hostility to government regulation of business, his embrace of flaky New Economy thinking, and the rest. (Indeed, examining his record, and reexamining that of his predecessors, I was staggered by how much of it I had seen before reviewing the comparable history in the United States.)

You can check out my examination of Blair's record--which also includes an equally detailed examination of Margaret Thatcher's record--here at the web site of the Social Sciences Research Network.

Thursday, July 30, 2020

Gripen vs. Viggen and the Rising Cost of Fighter Aircraft

Recently writing about the Gripen I found myself thinking again about the lengthy, rapid rise in the cost of fighter aircraft, and how it constrained Sweden's ambitions in this area from the start.

As I observed in the prior post, the Swedish program for a fourth-generation fighter aimed only for a light fighter, and was content to produce an aircraft delivered only fairly late in that cycle (the Swedish air force taking its Gripens in the '90s, when the U.S. was already flight-testing the Raptor, and the Eurofighter Typhoon was similarly being tested).

The country had been more ambitious when procuring the earlier, third-generation Viggens. They went for a medium fighter, not a light fighter, one reflection of which is that the later jet actually had a lighter maximum payload than the earlier plane (5,300 kg to the Viggen's 7,000 kg). It might be acknowledged that, with its first deliveries made only in 1971, the jet can look like a relative latecomer compared with the F-4 Phantom (1960), but it still came into service just behind the MiG-23 (1970) and a little ahead of the better-known Mirage F1 (1973). Moreover, even if there were earlier third generation-type jets, the Viggen was still in many ways a cutting-edge fighter, incorporating many relatively novel features: the terrain-following radar and integrated circuit-based airborne computer just starting to appear in tactical aircraft at the time; a then ground-breaking canard design and thrust reverser; and, in its afterburning turbofan engine, look-down/shoot-down capability and multi-function displays, technologies we associate with fourth-generation jets. (In fact, it does not seem unreasonable to think of the Viggen as a generation 3+ or 3.5 plane rather than just a gen-3.)

None of that, of course, detracts from the quality of the Gripen, which was a well-regarded aircraft at the time of its introduction, and has notably been upgraded in a number of respects, the latest "E" version having a supercruise-capable engine and an AESA (active electronically scanned array) radar, turning a generation 4 jet into a generation 4+, while some impressive claims have also been made for its electronic warfare systems (with the most bullish arguing for them as an acceptable substitute for full-blown stealth capability). Still, the shift in strategy does reflect the way even affluent, highly industrialized nations with good access to the world market in the required inputs have been pinched by the mounting cost of this kind of program--which has already seen the biggest air powers in the world, with fifth-generation jets in service, buying upgraded fourth-generation jets to fill out the ranks--while raising additional question marks about just how "sixth generation" the coming sixth generation of jets will actually be.

Thursday, July 23, 2020

How Could Sweden Afford the Saab-39 Gripen Fighter Program? A Postscript

In the end the answer to the question, "How Could Sweden Afford the Saab-39 Gripen Fighter Program?"--how a small (if rich and industrially advanced) country could afford its own fourth-generation fighter--is that there is a significant extent to which it did not afford it. The country's government ultimately counted on others to develop most of the requisite technology, which it accessed via licensing and outsourcing; and then where the final product was concerned, on others to share the cost by buying their own copies. Additionally, even that required a willingness to settle for an aircraft that, while very good, did not represent the outer limit of its generation's capability or the cutting edge of fighter design when it appeared (generation 5 was just beginning to come online when the first deliveries were made), while the country committed a disproportionate share of its defense resources to the program, as it could only do because of its specific geopolitical situation. (Had Sweden been obliged to fund a bigger navy, the competition for resources might have been too much.)

That there was a considerable gamble ought not to be overlooked, with the planes a very long-term investment that could easily have suffered were technological change more aggressive (even now the plan is to have them flying into the 2040s), or if the export market were less open. (It is worth remembering that the Cold War was heating up during the program's early days--that the preceding Saab-37 Viggen completely failed to line up foreign orders--and that by the '90s the export market was very uncertain.) Still, in the end it seems to have been a success.

Saturday, July 18, 2020

How Could Sweden Afford the Saab-39 Gripen Fighter Program?

The exploding cost of fighter aircraft has made programs to build an up-to-date fighter domestically decreasingly affordable for even the largest and richest countries, with even G-7 states increasingly forgoing that course. They find that, given the resources at their disposal and the diseconomies of scale of producing an aircraft they alone might end up using, it just does not pay to go it alone.

Naturally I have found myself wondering how Sweden--a nation which, however affluent and technologically advanced, was still home to a mere 10 million people, and did not commit a drastic proportion of its national income to defense spending during the relevant time period--managed to produce a well-regarded fourth-generation fighter, the Saab-39 "Gripen," and to do so in apparently quite cost-effective fashion (with Gripen Cs recently marketed for as little as $30 million).

Four factors seem to have made the difference.

1. Sweden Spent Less Than Other States, but Also Differently--Giving it Room for One Fighter Program if it Prioritized it (and it Did)
For comparison purposes, let us use Britain. That country had a GDP six times Sweden's, and a defense budget eight times as big in 1979.1 Yet Britain had already given up on building its own current-generation fighters all by itself, relying on partnerships with other European countries to build its next generation of such planes--notably Germany and Italy in the Panavia Tornado program.

However, it has to be remembered that Britain also had numerous expenditures Sweden did not--on a nuclear arsenal and large navy it constructed domestically, and on a global network of bases and overseas garrisons (not least the big one in West Germany), all bound up with a complex array of international commitments.

Sweden did not have these expenses, instead being oriented to a fully conventional defense of its limited national territory, while it might be added that Sweden placed a very high priority on its air force. While, as noted before, Sweden was a much smaller country than Britain in the relevant ways, it operated almost as big a fleet of combat aircraft (still 400+ jets in the late Cold War, compared with the 500 or so Britain was generally operating, as the RAND Facing the Future study of the Swedish air force remarked at the time).

It might also be added that even where procurement was concerned Britain insisted on an array of different combat aircraft, pursuing besides the Tornado, which was coming in fighter and strike versions, the Anglo-French Jaguar and the idiosyncratic VTOL Harrier (while operating a sizable fleet of F-4 Phantoms incorporating British engines and other components, and already thinking about what was to become the Eurofighter Typhoon). Had Britain not pursued so many types it would have had an easier time affording its own design. And that was what Sweden did, going with just the Gripen.

2. Sweden Was Content to Let Others Go First, and Settle for Less Than the Maximum Possible Capability
It is worth noting that besides going for just one fighter program instead of several at once, Sweden did not strive for the ultimate. The Gripen is, as justifying the exclusive focus on it required, a multirole aircraft. However, unlike the twin-engined, swing-wing Tornado with its low-level deep penetration capability and high payload, the Gripen was a single-engined fighter of shorter range and lighter armament. To put it into U.S. Air Force terms it was more F-16 than F-15, with all that implied with regard to price.

Of course, even if the jet is more F-16 than F-15, the Gripen is still a fourth-generation jet, and again, a well-regarded one. Yet, consider the timing of its appearance. The U.S. Air Force received its first true fourth-generation jet, the F-15, in 1974. As indicated above the Gripen program did not even begin until five years later, and the Swedish Air Force did not receive its first production copy of the aircraft until 1993, fourteen years after that--by which time the U.S. Air Force was already flight-testing the fifth-generation F-22. In a less dramatic way it is the same story with the British and their partners, who had their Tornado going into production just as the Gripen was emerging as a concept, while the Typhoon was to make its first flight as the Swedish Air Force formed its first Gripen squadrons.

In short, the Swedish government was ready to wait fifteen to twenty years longer than others to get even a light fourth-generation jet, and in the meantime make do with third-generation Saab-37 Viggens. Saving money, after all, was a necessity for even the Swedes at this stage in the history of fighter development, and walking a beaten path does that--not least because of one thing that did much to bring down costs, namely that

3. Sweden Outsourced and Licensed the Necessary Technology Where Feasible Rather Than Making Everything From Scratch
While the Gripen is Swedish-made, it is not all-Swedish, with crucial components developed jointly, or derived from other, established products, for the sake of cost (as much as a third of the aircraft sourced from the U.S. alone). The Gripen's first engine is an obvious example. While constructed by Volvo, it is a licensed derivative of the engine that General Electric made for the F-18. (It may also be worth noting that Saab had prior experience developing fighters in such a fashion, key systems on the prior Viggen being similarly sourced--and that the stress on minimizing cost can be contrasted with Japan's emphasis on developing technical know-how in the F-2 program, which produced a very advanced but also very costly F-16 derivative.)

4. Sweden Banked Big on Exports
Finally, in addition to its readiness to focus its resources on this one program, its moderation in its demands, and its willingness to use technology developed by others, Sweden counted heavily on the prospect of foreign sales, which did help make the Gripen project more plausible financially. Of course, in contrast with globally active, NATO-affiliated, military aid-providing powers like Britain or France, let alone the U.S., Sweden was at a disadvantage in the export market. And the Gripen's successes there are, at least thus far, a far cry from those of other fourth-generation single-engine jets like the French Mirage 2000 (almost 270 sold to eight different foreign customers) or the generation's most popular fighter, the F-16 (with nearly two thousand jets serving in some twenty-five air forces alongside the vast American fleet). Still, the fighter has already found a number of customers (Hungary, Czechia, South Africa, Thailand, Brazil, the last by itself looking to buy 108 aircraft), with as many as two dozen reportedly ongoing bids holding out the hope of still more (two of them, to Canada and India, could by themselves take another 200 aircraft, and make the Gripen a bestseller yet).

The gamble, in short, looks as if it is paying off. Still, it is worth noting that Sweden, like most countries, sat out the pursuit of a fifth generation of fighters, making do with upgraded Gripens--while, in taking an interest in the sixth generation, it is not going it alone, having joined the British-led "Tempest" program. I admit to not being bullish on that program, just as I have not been bullish on the sixth-generation fighter generally, given the technological claims made for it (initially, at least). However, it does seem safe to say that by this point the strategy that let Sweden build a fourth-generation fighter has long since run up against its limits.

1. As measured by the UN in 2015 U.S. dollars in its current National Accounts data set, Britain had a GDP of $1.33 trillion to Sweden's $232 billion, while spending a higher proportion of its GDP on defense--4.2 percent versus 3.1 percent--giving it a budget of $55 billion to Sweden's $7 billion.
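The figures in the footnote are internally consistent with the ratios cited in the post (a British GDP roughly six times Sweden's, and a defense budget roughly eight times as big); a quick back-of-the-envelope check, using only the numbers given above:

```python
# Figures from the footnote (1979, in 2015 U.S. dollars, per the UN National Accounts data).
uk_gdp, se_gdp = 1.33e12, 232e9      # GDP: Britain vs. Sweden
uk_share, se_share = 0.042, 0.031    # defense spending as a share of GDP

# Defense budgets implied by GDP x share of GDP
uk_budget = uk_gdp * uk_share        # ~$56 billion (the post rounds to $55 billion)
se_budget = se_gdp * se_share        # ~$7 billion

print(round(uk_budget / 1e9), round(se_budget / 1e9))        # budgets in billions
print(round(uk_gdp / se_gdp), round(uk_budget / se_budget))  # ~6x the GDP, ~8x the budget
```

The budget ratio (about 8 to 1) outruns the GDP ratio (about 6 to 1) precisely because Britain spent a higher share of a larger GDP, which is the point the comparison in the post turns on.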

Friday, May 1, 2020

THE MILITARY TECHNO-THRILLER: A HISTORY



"[A] multi-century tour de force . . . comprehensive . . . easily readable, making it the best of both worlds . . . a lot of fascinating insights . . . an excellent book that examines an overlooked genre through a variety of interesting perspectives in a highly readable way. I cannot recommend The Military Techno-Thriller: A History enough for fans of the genre." -Fuldapocalypse Fiction

THE MILITARY TECHNO-THRILLER: A HISTORY takes a close look at this widely read but still little studied genre, tracing its origins from the Victorian-era invasion story, to its 1980s heyday as king of the bestseller list in the hands of authors like Tom Clancy, down to today, considering its interaction with other genres and other media throughout. In the process, this book also tells the larger story of how the ways in which we think about, imagine and portray war evolved during the last century to bring us to where we are now.

Now available in print and e-book formats from Amazon and other retailers.

Get your copy today.

Monday, March 2, 2020

THE TECHNOLOGICAL SINGULARITY AND THE TRAGIC VIEW OF LIFE

Originally published at SSRN on January 29, 2019

The theorizing about what has come to be known as the "technological Singularity" holds that human beings can produce a greater-than-human sentience technologically, and that this may come within the lifetimes of not only today's children, but much or even most of today's adult population as well.1 Much of this anticipates that this will be part of a larger trend of accelerating technological change, with advances in computers matched by advances in other areas, like genetic engineering and nano-scale manipulation of matter; or be a source of acceleration itself as superhuman artificial intelligences perform technological feats beyond human capability.2 In either case, as the term "singularity" suggests, the consequences become unpredictable, but a common view is that shortly after the arrival of the big moment, we will see the most fundamental aspects of human existence—birth, growth, aging, senescence, mortality; the limits of an individual consciousness' functioning in space and time; material scarcity and the conditions it sets for the physical survival of human beings—altered so radically that we would become transhumans, on the way to becoming outright posthumans. Those who describe themselves as Singularitarians expect these changes to be not merely profound, but a liberation of the species from the constraints that have cruelly oppressed it since its first appearance on this planet.

All this, of course, is mind-bending stuff. Indeed, no one alive today can really, fully wrap their mind around it, even those most steeped in the idea. Still, the difficulty of the concepts lies not only in the complete alienness of such conditions to our personal and collective experience, but also in their flying in the face of the conventional expectations—not least, that even if we have grown used to technology changing constantly, the really important things in life do not, cannot change. Indeed, passive acceptance of the world as it is; a view of it as unchanged, unchanging and unchangeable; and given that this applies to a great deal that is unquestionably bad, an ironic attitude toward the prospects for human happiness; are commonly equated with "wisdom." And rejection of things as they are, a desire to alter them for the better, a belief that human beings have a right to happiness, are likewise equated with not just the opposite of wisdom, but the cause of disaster.

This is all, of course, a terribly bleak and unattractive perspective to any but the most complacent of us—and not altogether rational, inconceivable as it is without the idea that the cosmos is somehow metaphysically rigged against human beings. Why should this morbid outlook enjoy such force? Especially in a modern world where it has been proven that life does indeed change? And, frankly, that meaningful improvement in the terms of existence of human beings is achievable?

The Tragic View
One obvious reason is the weight of a very old, long tradition that has deeply suffused just about every human culture, Western culture by no means least. In ancient times, as people began to think about the world around them, they could see that the world was not what human beings would like it to be. Life was poor, nasty, brutish, short—with hunger, illness, violence, insecurity (to say nothing of innumerable lesser hardships and frustrations) utterly saturating their existence. Childbirth was dangerous to all concerned, and few of the children born made it all the way to adulthood. A more settled existence brought somewhat greater affluence and security, but this was only relative, and purchased at a high price in toil and oppression, with daily existence defined by a more regimented routine of labor, and a more stratified society. The many worked for the enrichment of a few—and even the comparative luxury in which the few lived only exempted them from so much. Even those who ate well suffered their share of disease in an age of primitive medicine, and violence too when being part of a ruling caste meant being a warrior. And of course, even the most sheltered existence meant aging and death.

It also seemed to them that there was not much they could do about it. The faintness of their understanding of how the world about them worked, the primitiveness of the means to hand that necessarily followed from this, the dimness of their awareness that knowledge of the world could be applied to the end of enlarging economic productivity, meant that one could picture only so much improvement in the economic output that in hindsight we know to be key to deep or lasting material progress.

The crying out in anguish against all this was the birth of tragedy. One can see it in that oldest of stories, the Epic of Gilgamesh. Gilgamesh, seeing a bug fall out of the nose of his fallen friend Enkidu (the image crops up again and again in the poem), is horrified by the reality of death, sets out in quest of immortality—and all too predictably fails to achieve it, a snake gobbling up the herb that would have let him live forever while he bathes, before he can ever take it. A very large part of higher culture has been a development of this sensibility. In the Abrahamic religious tradition we have the temptation of Adam and Eve, original sin, the Fall, expulsion from Eden, a punishment compounded by the familiar limits of the "human condition": traumatic birth, a life spent in toil, death. So does it likewise go in the Classical tradition, where humans, whose Golden Age lies in the past, have their lives spun out, measured and cut by the Fates, and the details of those lives not decided by the Fates determined by the whims of gods intent on keeping them humble.3 (Poseidon could not keep Odysseus from ever getting home—outside his purview, that—but he did see that it was a ten year odyssey, and did a good many worse things with a good deal less reason; while "wise" Athena was not so far above petty jealousy as to refrain from turning the human who bested her as a weaver into a spider.)

Eventually developed, too, was an element of compensation for all this. Human beings suffer in this world—but if they bow their heads and obey, they will eventually be blessed in this world, or if they don't get so blessed, find something better in another one on the other side of death. And in at least some traditions, human suffering as a whole does end, a Millennium arriving and all well with the world after that.

Still, the connection between good behavior and reward was necessarily fuzzy, and even in those traditions notes of doubt about the rightness of all this are evident. As God inflicts on his exceptionally faithful servant Job one horrific suffering after another simply for the sake of a bet with the Devil, Job, when he has had all he can take (he is huddling in shit to keep warm), cries out "Why?"

Oedipus, approaching his death (in Oedipus at Colonus, the most accomplished but least-read of Sophocles' three Theban plays), wonders at the same thing. After all, killing a man who challenged him in that sort of roadside confrontation and marrying a queen were only turned from incidents in a tale of heroic adventure into cosmic crimes by the fact that the man was his father, the woman his mother, both of which details were totally unknown to him—while the whole sequence of events was triggered by his father's attempt to evade punishment for an unspeakable crime of his own by ordering his own son's infanticide. Where was the justice in that?

Of course, no satisfactory answer is forthcoming to such questions. Indeed, to modern, rational eyes, tales like those of Job and Oedipus are about the subjection of human beings through no fault of their own to horrors by the will of arbitrary, cruel gods, whose right to do such things is a simple matter of their having the power to do it and get away with it.

And as it happens, there are also doubts that things really have to be this way. The idea that humans could become like those gods, acquire the power to be like them, and even overthrow them, but that this was forbidden to them and they were slapped down when they tried, cropped up again and again. Gilgamesh, after all, may not have attained his goal but he did come very, very close, only at the very last minute losing the herb that would have made him live forever. In the Garden of Eden the sin of Adam and Eve was to eat of the fruit of the tree of Knowledge of Good and Evil, knowledge which could make them like gods. Zeus begrudged man the gift of fire, and punished his benefactor Prometheus by chaining him to Mount Elbrus and having a vulture tear out and eat his liver, after which it grew back at night so that it could be torn out and eaten again the next day, and the day after that, and the day after that . . . but human beings kept the knowledge of fire nonetheless.

All the same, these admittedly not insignificant details are exceptional, contrary hints, and not more than that in a narrative that, pretty much always, reiterated again and again passivity and awe before the majesty of a design beyond our ken.

"KNOW YOUR PLACE!" it all thunders.

And by and large, it was exceedingly rare that anyone carried the thought further. There was, after all, much more emphasis on what had been and what was than what might be—and little enough basis for thinking about that. Even among the few who had leisure to think and the education and associations to equip them with the best available tools with which to do it, mental horizons were bounded by the crudity of those tools. (Two thousand years ago, syllogisms were cutting-edge stuff.) By the narrowness of personal experience. (Communities were small, movement across even brief distances a privilege of very few and even that difficult and dangerous, while the "Known World" of which people even heard was a very small thing indeed.) The slightness of the means of communication. (Illiteracy was the norm; books hand-copied on papyrus and parchment and wax tablets were rare and expensive things; and the more complex and less standardized grammar and spelling, the tendency to use language decoratively rather than descriptively, the roundabout methods of argument and explanation—such as one sees in Socratic dialogue—likely made deep reading comprehension a rarer thing than we realize.) And even the brevity of life. (Life expectancy was thirty, imposing a fairly early cut-off on how much the vast majority of people could learn, even if they had the means and opportunity.)

Moreover, the conventional ideas enjoyed not only far more powerful cultural sanction than they possess now, through the force of religious belief and custom, but were backed up by all the violence of which society was capable. This was the more so not only because the powerful found it easier to take such views (it is one thing to be philosophical about man's condemnation to hard toil when one is a slave in the fields, another when one is a wealthy patrician who has never done any sitting on his shaded porch enjoying a cool drink), but because, within the limits of the world as they knew them, their situation was quite congenial to them.

Carroll Quigley, who wrote at some length about the conflict between democratically inclined, socially critical "progressives" and oligarchical "conservatives" in ancient Greece in The Evolution of Civilizations observed that the latter settled on the idea "that change was evil, superficial, illusory, and fundamentally impossible" as a fundamental of their thought. This applied with particular force to terms of social existence, like slavery, which they held to be "based on real unchanging differences and not upon accidental or conventional distinctions." Indeed, the object pursued by those who would have changed such things—a redress of material facts—was itself attacked by the associated view that "all material things" were "misleading, illusory, distracting, and not worth seeking."

In short—the world cannot be changed, trying to change it will make life even worse than it is, and anyway, you shouldn't be thinking about the material facts at all. This anti-materialism went hand in hand with a denigration of observation and experiment as a way of testing propositions about the world—an aristocrat's musings held superior to seeing for oneself how things actually stood. With the concrete facts of the world trivialized in this way, the conventional wisdom handed down from the past was that much further beyond challenge (while, of course, this outlook did not exactly forward the development of technological capability). Ultimately the promotion of these ideas by the "oligarchs" (and their rejection of the ideas not to their liking), helped by the primitiveness of communication (works the rich would not pay to copy did not endure), was so effective that, as Quigley noted, "the works of the intellectual supporters of the oligarchy, such as Plato, Xenophon, and Cicero" have survived, but "the writings of the Sophists and Ionian scientists," "of Anaxagoras and Epicurus," have not.

In the wake of all this it may be said that philosophy was less about understanding the world (let alone changing it) than accommodating oneself to it—by learning to be better at being passive. One picks up Marcus Aurelius' Meditations, and finds the celebrated work by the famous emperor-philosopher to be . . . a self-help book. And rather than a philosophy concerned with nature or politics, the metaphysics of the thoroughly anti-worldly Plotinus (who taught that the ultimate good lay in one's turning away from the low world of material sense-reality to the higher one of the spirit as the path to ecstatic union with the Divine) was arguably the most influential legacy of the latter part of the Classical era.

The pessimism of the Classical outlook, particularly by way of Plotinus, did much to shape and even blend with the Abrahamic tradition as Jewish and early Christian thinkers used Greek philosophy in interpreting religious texts—while the work of the Greeks endured as the cornerstone of secular learning in its own right. Of course, Classical civilization crumbled in Western Europe. Yet, in its way that strengthened rather than weakened the hold of such thinking. Of the major institutions of the Western Roman Empire, only the Church survived and flourished, playing a larger role than ever in the centuries that followed. Meanwhile, amid these "Dark Ages," there was a nostalgia for ancient times, and with it a propensity for exalting their practical achievements as unsurpassed and unsurpassable. The Greeks, the Romans, were held to have all the answers, and it was thought best to consult them rather than try to find out new things—the means for which Classical philosophy had, of course, marginalized. Along with the prevailing religiosity, this whole outlook directed philosophers' attentions away from the workaday world—to the point of being famously satirized as debates over how many angels could dance on the head of a pin. And of course, if reason said one thing and religion another, then one had to accept that religion was right—or face the Inquisition. Unsurprisingly much energy went into attempts to reconcile the world's very real horrors with the existence of a divine plan by an all-good and all-powerful Supreme Being—pessimism about whether things could have been or could be better confusingly passed off as the optimism that this is the "best of all possible worlds."

Tragedy, Modernity and Reaction
So things went until the Renaissance, and the flowering of humanism with it, and the intellectual developments that succeeded it. In the seventeenth century thinkers like Francis Bacon and René Descartes not only explicitly formulated and advocated a scientific method based precisely on the value of study of the material world. They also declared for the object of uncovering all nature's secrets and applying the knowledge to the end of "relieving man's estate." Moreover, such thinking was quickly extended by others to social, economic and political life. Opposing barbarous custom and superstition, they identified and defended the rights that all individual human beings enjoy specifically because they are human beings (life, liberty, property), extending to the right to choose their own government—even rebelling when an existing government failed to perform even the bare minimum of its duty (as Thomas Hobbes allowed), or became repressive (as John Locke allowed).

In short, the prospect of positive, meaningful, humanly conceived and controlled change was raised explicitly and forcefully by the Scientific Revolution, by liberalism, and by the Enlightenment more generally—and raised successfully. However, that scientific inquiry, applied science and political liberalism flourished in modern times as they did not in the ancient did not mean that they were unopposed. The old ideas never ceased to have their purchase, and certainly vested interests in the modern world (as with Churchmen concerned for their influence and privileges) could not look on such talk with equanimity any more than their forebears did. Conservatives threatened by all this clung to tradition, to the religious institutions that usually sided with it, to the unchanging verities of the "ancients" over the reason and science of the "moderns." The now-obscure English theorist of divine right, Robert Filmer, insisted in Patriarcha that kings were granted by God paternal rights over their peoples, which extended to the power of life or death—and held revolutionary, democratic alternatives to be doomed to short, bloody and chaotic lives ending in failure.

Filmer's arguments (which Locke eviscerated in his First Treatise of Government) were belied by the unprecedented peace and prosperity that England enjoyed after the 1688 Revolution. However, conservatives responded to the Enlightenment with the Counter-Enlightenment, identifying reason and change more generally with disaster, stressing the original sin Christianity held tainted human beings, and even rejecting the idea of the individual human being as a meaningful unit of social analysis.

Indeed, it became common to oppose to the universalism of the Enlightenment a politics of identity in the manner of Joseph de Maistre (who famously remarked that he had met Frenchmen, Italians, Russians, but knew nothing of "man"), with identity usually defined in terms hostile to progressive ideas. Reason, secularism, democracy, were commonly characterized by such thinkers as alien, reflecting the character of another people—typically a character held to be less virtuous, less "pure," less "spiritual" than "our own." If such things worked at all, they said, it was only for those less virtuous, less pure, less spiritual people; certainly they cannot work for us, which is assuredly a good thing as our traditionalism, religiosity, monarchism, serf-lord relationships and the like express and sustain a greater wisdom than those Others can ever know, and which importing their ways could only corrupt.

Going hand in hand with this was much romanticizing of the rural, agrarian element of the population as a repository of those older ways—unlike these rootless city types, especially the ones with a modicum of "book learning," which seemed not an altogether good thing. Worst of all in their eyes were those "overeducated" types who, "educated above their intelligence," perhaps defectively born with too much brain and too little body, too little blood, have become alienated from their roots and their natural feelings—internal foreigners. And indeed the visions of reform to which they so often inclined, they said, showed that while they spoke of the people they did not know, understand or respect them—and said that what they needed most of all was some hardship and toil among the lower orders to teach them "the real world." (Thus does it go in Leo Tolstoy's War and Peace, where Pierre Bezukhov goes from Western-educated cosmopolitan intellectual to apostle of the peasant Karataev, who passively accepts whatever life brings until he dies, marching in the French army's prisoner train as it retreats from Russia.)

All this naturally converged with a whole host of prejudices, old and new, exemplified perhaps in Victorian-era theorists of "Aryanism," who identified the conservative, traditionalist stances (an acceptance of the unchanging nature of things, idealism over materialism, etc.) with spiritually superior Aryan cultures, and liberal/radical, modern outlooks with inferior "non-Aryans"—even when they made up their lists of who went under each heading in mutually exclusive ways. Thus one had the absurdity of German and Russian nationalists each insisting that their country was the purest bearer of the Aryan legacy—while the other nation was not Aryan at all.4

Of course, this reaction did not turn back the clock, the Old Regime never returning and those who even really wished for such an outcome becoming something of a lunatic fringe, but it still had its successes. Religious, nationalistic, traditionalist, anti-intellectual and "populist" appeals on behalf of the status quo and its greatest beneficiaries helped make the spread of formal democracy a less threatening thing to the contented. Meanwhile, as conscious, explicit belief in specific religious doctrines weakened, what might be called the "religious frame of mind" remained, plainly evident in the phrases of everyday language, like the need "to have faith," the idea that "things are meant to be" or "not meant to be," and of course, that there are "things man is not meant to know." (Faith in what, exactly? If things are "meant" or "not meant," just who—or Who—is doing the "meaning"?)

And of course, as the old guard of throne, aristocracy and established church declined, the bourgeoisie that had once been revolutionary, now enjoying power and privilege, and anxious about the lower orders and the social questions their existence raised, became conservative in its turn, likewise inclining toward that idea of change as "evil, superficial, illusory, and fundamentally impossible," and reason and its prescriptions as a thing best not pushed "too far." It was less of a stretch than might be imagined—the bourgeois outlook, after all, being focused on the individual, specifically an ethic of individual success-striving and individual moral responsibility, the existing social frame so utterly taken for granted that it did not seem to exist for them at all. (Indeed, as Margaret Thatcher made clear, at times their politics has explicitly denied that it does.)

Unsurprisingly, those favoring constancy over change found more rationalistic-seeming supports for their outlook. The dark view of radical social change taken by the French Revolution's enemies, which identified that revolution not with the Declaration of the Rights of Man or the abolition of feudal oppressions but with guillotines, Napoleon and Restoration, colored the popular imagination of the event—driving home the idea that if revolution was not a crime against God, then it was still bloody, messy and doomed to failure.

And this was not simply founded on a vague sense of society's machinery as a complex thing not easily tinkered with, or insecurity about whether the state of the art in social engineering is up to such a challenge (the position of a Karl Popper, for example), but a whole host of newer rationales for the unchangeable nature of the world, the ineradicability of the evils in it, the obvious implications for human happiness, and the wisdom of letting everything alone. Like Malthusian scarcity, which attributed poverty and its attendant miseries not to economic or social systems, but "the passion between the sexes." And its extension in the Social Darwinism of Herbert Spencer, in which perhaps God did not assign different people different stations, but Nature did in its uneven distribution of "fitness," and the still more uneven rewards accruing to it. Or the Nietzschean will-to-power. Or Freudian psychoanalysis, which declared the repression of basic human drives (the pursuit of sex, the aversion to work as we commonly think of it) as essential to civilized life.

Or postmodernism, ostensibly secular adherents of which speak of "the problem of evil" in the mystical tones of Medieval theologians, with the subject-object separation of their epistemology an apparent stand-in for the Fall, while in their attachment to identity politics they echo de Maistre's remarks about never having met Man, all of which adds up to a hostility to "grand narratives" as ferocious as any other attack ever launched on the idea of progress—in as thoroughly obscurantist a language as any clergy ever devised. And of course, there are more popular streams of thought, not least the self-help culture, which promotes a conservative idealism scarcely distinguishable from that of the ancient Greek oligarchs. (You can't change the world! There is no world! There's just you, and how you look at it, so change yourself instead!)

All of this so thoroughly saturates our culture that there is no getting away from it—even for those most educated for the task of critical thought. The age, ubiquity, and association of such an outlook with those most prestigious philosophical and literary texts that have never ceased to be the cornerstone of an education in the humanities (from Aristotle to Shakespeare, from Milton to Tolstoy) is itself enough to endow such an outlook with an enormous prestige to which few intellectuals are totally immune. (Indeed, it was a nineteenth century radical of some note who observed that "the past weighs like a nightmare on the brain of the living"—and a twentieth century radical who remarked that in his youth England and its empire were ruled by "men who could quote Horace but had never heard of algebra.") That there is never a shortage of rationalists who, in disappointment or despair, personal or political—or out of simple self-interest—repudiate their old beliefs to take up exactly opposite ones at the other end of the political spectrum also encourages the tendency. (After all, just as much as ever, intellectual, cultural and especially political life remain the preserve of the privileged, who remain inclined to conservatism, while privilege remains ready to place its prestige and wealth behind conservative thinkers and thought.)

Little wonder, then, that ultra right-wing postmodernism, passed off as daring leftishness to the confusion of nearly all, has become the conventional wisdom of the humanities, the social sciences, and much else, while popular culture serves up a diet of pessimism in one movie about science and technology going wrong after another. We may not know what will happen, exactly, but we are usually safe in assuming that something bad will come of that experiment, that invention. Safe in assuming that if there is a robot, it will rebel. And with the far, far more radical prospects opened up by the Singularity, the dread is commensurately greater.

Those who would discuss the subject rationally have to realize that they are up against all this baggage. And if they wish to persuade the public of the positive possibilities that our technology has already opened up, and might continue to open up in the coming years, they must be prepared not only to promise the realization of long thwarted human hopes, but to challenge the colossal weight of millennia of dark, misanthropic irrationality with the still more overpowering force of reason, too little appreciated these days, but as great as it ever was.

1 The term "technological Singularity" is generally credited to Vernor Vinge. Other writers in this area include Hans Moravec and perhaps most famously Ray Kurzweil. See Vernor Vinge, "The Coming Technological Singularity: How to Survive in the Post-Human Era," VISION-21 Symposium, NASA Lewis Research Center and the Ohio Aerospace Institute, 30-31 Mar. 1993; Hans Moravec, Mind Children: The Future of Human and Robot Intelligence (Cambridge, MA: Harvard University Press, 1990) and Moravec, Robot (New York: Oxford University Press, 2000); Ray Kurzweil, The Age of Spiritual Machines: When Computers Exceed Human Intelligence (New York: Viking, 1999) and Kurzweil, The Singularity is Near: When Humans Transcend Biology (New York: Viking, 2005).
2 As Irving John Good put it in an early paper on the subject, "the first ultraintelligent machine is the last invention that man need ever make." Irving John Good, "Speculations Concerning the First Ultra-Intelligent Machine," in Franz L. Alt and Morris Rubinoff, eds., Advances in Computers 6 (New York: Academic Press, 1965), 31-88.
3 Writing of the ancient Greeks John Crowe Ransom characterized their view of the world as something that not only "resists mastery, is more mysterious than intelligible . . . a world of appearances," but also "perhaps . . . more evil than good." John Crowe Ransom, The New Criticism (Westport, CT: Greenwood Press, 1979), 335.
4 German theorists, of course, excluded the Slavs from the Aryan category, while the Russian Slavophile Aleksey Khomyakov regarded the Slavs as Europe's true Aryans, while Germans were non-Aryan "Kushites."

Technological Hype and the Military Balance

Originally published through SSRN, June 3, 2018

The modern world has seen profound technological change, but it has also seen equally profound technological hype—extravagant, intensive publicity and promotion not merely of new technologies, but of technologies merely thought to be near realization, which has tended to go hand in hand with the exaggeration of their maturity, their capabilities and their implications for actual life. Relative to how large—and often, how damaging—a role such hype has played in recent history, and how big a factor it is in contemporary thinking, the subject has been little studied. Its implications for the military sphere are no exception to this rule.

This article is intended as an examination of that subject, beginning with a consideration of the factors that make today's observers susceptible to technological hype. It will then consider the implications of such hype for thinking about the distribution of power among nations—both the ways in which it affects economic and geopolitical thinking generally, and the ways in which it affects thinking about military technology more narrowly. From there the discussion proceeds to a consideration of the present—the past few years, and very likely, a good many of the years ahead—as a period of particular vulnerability to such hype.

Understanding Technological Hype
To better appreciate the actual pervasiveness and weight of technological hype, it is helpful to consider four factors from which it derives its strength: our prolonged experience of technological upheaval; the "rumors of the future" we get from futurology, science and technology reporting, and to a greater degree than is widely appreciated, science fiction; popular understandings of technological research and development; and the culture of commerce.

Prolonged Experience of Technological Upheaval
It would be impossible to exaggerate the significance of the shift from a predominantly agrarian, rural, pre-industrial civilization to an urbanized, industrial civilization founded on the expansion of human capability through machinery powered by inanimate power sources (coal-fired steam engines, internal combustion engines, electricity). That change, which in much of the world is still ongoing, and even in the most advanced countries still less than perfectly complete, has been recent enough and dramatic enough to create a general expectation of such change as a constant in modern life.1

It may well be the case that this has led to the overrating of the arguably more modest changes seen since that time, like the revolution in digital computing and communications, so often accorded a significance comparable to that of the Industrial Revolution—exaggerating the impression of the rate and depth of more recent technological change.2 Such exaggeration may be reinforced by a weak historical sense on the part of the public (especially among younger people), and perhaps the popular obsessions with entertainment, social media, consumerism and "lifestyle" generally.3 However one explains it, in the end the perception of such continuous change is there, and interacts with other factors, not least the fact that more such change is continually promised to the public in the media.

"Rumors of the Future"
The preoccupation with the ways in which the future will be different from the present, and still more, the investment of rigorous effort in working out those differences, were in the late nineteenth century still novelties. Works like Stanley Jevons' The Coal Question, Ivan Bloch's The Future of War, and H.G. Wells' Anticipations, as well as the stories of Jules Verne and other early proponents of science fiction, all owe their place in cultural and intellectual history partly to that novelty. However, during the twentieth century professional futurology became a massive industry, and while its prestige may have fallen somewhat since its peak in the post-World War II period, its presence is perhaps more strongly felt than ever before.4 In the discussion of topics ranging from nuclear proliferation to climate change, from the rate of economic growth to the space program, even the casual consumer of news comes into contact with more or less sophisticated extrapolations not only about what has happened or is happening, but what will happen years or decades hence, and the implications of those possibilities for the present—and so frequently that they scarcely even notice the fact. With news outlets routinely devoting full sections of their papers, magazines and web sites to "Science" and "Technology" news, discussion of technologies only in development, perhaps only in very early development, is often the explicit focus of a substantial portion of news coverage itself.

At the same time, the imagery of imaginary technologies has become virtually pervasive across popular culture. In recent years, any list of the highest-grossing movies at the box office tends to consist primarily of action-adventure films packed with futuristic vehicles and other technologies, like the Star Wars and Marvel Cinematic Universe films. Charles Gannon has usefully characterized such imagery as "rumors of the future," habituating us to the thought of technologies that do not yet exist.5 Indeed, that such technologies actually do exist, if only somewhere, in secret, is a staple of science fiction, which routinely presents intelligence agencies and other such organizations as secretly carrying on ultra-advanced programs to explore space and the like in the present day. (This is the view that the "future is already here—it's just not very evenly distributed."6)

Popular Conceptions of Science and Technology
Even while the products of science, hypothetical as well as real, have become ubiquitous in cultural life, the actual process that is science remains poorly understood by non-scientists on the whole. The reality is that engineering work of the kind relevant to this discussion is often a collaborative, cumulative, slow-moving activity, which may be straining against the limits not only of what has been done, but what is possible with the existing state of the art. It also tends to occur within a business, economic and political context, where imperatives besides mere feasibility are operative, and people who are not scientists or engineers—who may in fact be ill-educated in these matters—are often in control. (The corporation that cuts its R & D budget to make this quarter's earnings report look better is not unknown.7) And even in the best of circumstances technological research and development can be subject to fits and starts, dead ends and flashes-in-the-pan, often resulting in a device that exists but is virtually unworkable in any real-life conditions, or perhaps simply stillborn at the concept stage.

However, the popular image of such work envisions it as highly individualistic and speedy, with revolutionary breakthroughs cheap and easy. This is, again, partly a matter of fiction favoring the simple, heroic, dramatic story over a more complex reality.8 Yet fiction is not the sole source of this thinking, which is greatly encouraged by the manner in which journalism has tended to portray real-life developments, epitomized by the conception of the computing revolution. While the history of mechanical computing goes back centuries, with key concepts dating very far back within it (Blaise Pascal, Gottfried Leibniz and Charles Babbage laying down essential groundwork), and electronic computers to the 1940s, after which they were the object of massive government R & D efforts that led to still more key innovations (ENIAC, SAGE, NLS, ARPANET), there is a tendency to imagine it all as having emerged out of garages in California without help from anyone else. This view of technological development as a quick, simple matter makes it the easier to imagine that a technology only "in development," and which on closer examination may be a very uncertain or distant prospect, is almost ready for commercial employment.

The Culture of Commerce
Interacting with and reinforcing past experience of technological change, "rumors of the future," and the prevalent, oversimplified view of how technologies are realized in practice, is what one may term the "culture of commerce." It is, of course, the business of advertisers to make the wares on sale appear as attractive and compelling as possible, and one obvious and commonly taken course is to present those wares as being as new and revolutionary as possible—as a practical matter, more new and revolutionary than they really are. Still, this indisputably distorts understanding of the matter. It does not help that journalism is increasingly virtually indistinguishable from advertising, a tendency epitomized by those media outlets that carry what is euphemistically referred to as "paid" or "sponsored" content. However, the mentality pervades even more conventional journalism, dependent as it is on capturing audiences with exciting headlines. Especially when this imperative is combined with a weak grasp of the process by which concepts actually become viable technologies, it easily creates an exaggerated picture of a technology's feasibility. Indeed, the fostering of expectations inflated far beyond what will ever be met is virtually built into the process by which technologies enter into use. The "Gartner Hype Cycle," based on studies of technologies going back to railroads, posits a sequence: a technological breakthrough drawing attention before the production of usable, commercial products; inflated expectations; a period of disillusionment; a second, rising trend of genuine usage; and then a "plateau" of productivity, in which the technology makes a genuine contribution, if a less spectacular one than the optimists imagined during that earlier period of inflated expectations.

The cases of single technologies apart, the pattern of such reportage has a profound shaping influence on perceptions. Any viewer of the news is subject to constant, specific claims of explosive change in one area or another. The individual claims (which tend not to stand up to scrutiny) are forgotten, but the overall impression of that rate of change remains. Moreover, when the old claims which produced that impression are critically examined, the evaluation is often generous—overly generous, the bar for "prophecy" set low, with an obvious example the assessment of Raymond Kurzweil's forecasts.9

When despite this a prediction, or mass of them, proves to have been clearly wrong, a typical response is to shrug off the exaggerations, and suggest that those taking the more critical view are "missing the forest for the trees." Yes, they say, the futurists were wrong that time (and that time, and perhaps that time too), but they "know" that the overall reading of the trend as one of explosive change is correct. (The ruthless mockery to which those bullish about the pace of change subject those who dare to ask "Where are the flying cars?" is exemplary of this.10)

Techno-Hype and the Balance of Power
In private economic life, technological hype often leads to the misallocation of economic resources—to questionable purchases of goods and assets that can in the most extreme cases lead to speculative bubbles that disrupt whole economies, from the wonders claimed for the South Sea Company in its heyday, to the information technology bubble that burst in 2000. However, hype about such technologies has implications well beyond the purely civilian sphere. It has often had implications for perceptions about the prosperity of particular nations, and the power they derive from it, including the military power that in the modern world is founded on economic and industrial power. There has in fact been a recurring tendency to take an exaggerated view of the implications of an extraordinary lead in one or a few technologies for the wealth and power of a particular country closely identified with that technology, imagining that a disproportionate market share here can readily translate to a wildly disproportionate prosperity or power. This has frequently been reinforced by a tendency to claim that the country in question possesses some unique comparative advantage that will make its lead in that technology enduring.

An obvious case is Britain in the post-World War II period. Despite its post-war economic situation and decolonization, expectations existed that British accomplishments in such cutting-edge fields as computers, aerospace and nuclear energy would sustain its economic and strategic position in the post-war world. Reality fell far short of such hopes in a history which saw such disappointments as the de Havilland Comet, the "Magnox" gas-cooled nuclear reactor, and the Concorde—showing those advantages to be less robust than imagined, and the exploitation of them more complex and uncertain.11

The tendency has grown more rather than less pronounced over time, with Japan's extraordinary dominance in microchips during the late 1980s another, even more germane example. Despite accounting for less than 3 percent of the world's population, 12 percent of world GDP and 16 percent of manufacturing, the country produced over half the world's chips at the time.12 In this context Japanese politician Shintaro Ishihara created a sensation with the claim in his book The Japan That Can Say No that Japan's strength in microchips put the Cold War balance of power in its hands, and would be a pillar of its enduring stature in the post-Cold War era.13 (This was to see it draw the former Soviet sphere into its own orbit, and manage the world's affairs jointly with the United States as the "Group of Two.")

While Ishihara's predictions have on the whole fared very poorly, it is notable that the country's extraordinary lead in microchip production proved fragile and brief, in part because it rested on quite different, more complex and less enduring foundations than he imagined. Japan's position as a microchip producer was based on its position as an exporter of consumer electronics more generally—the makers of such goods buying their chips from domestic producers, underlining the reality that its exceptional market share reflected a broader industrial supremacy, though even that proved fleeting. When Japanese consumer electronics lost their market dominance, the share they represented of the chip market declined. Moreover, the highly touted virtues of Japanese management and workers were nowhere near enough to compensate for a badly flawed restructuring of the industry amid a more competitive environment in the 1990s.14

While its power has been broader-based and more enduring, the United States, too, has been susceptible to such hype, similarly based on misconceptions about computer technology, and perceptions of the United States as more fitted than others to develop and use it. A pointed example is Ralph Peters' vision of the U.S. enjoying complete, continuous, global battlespace dominance based on a space-based military system so extensive and powerful its
attack systems deployed in near space . . . can, on order, destroy the vehicles, aircraft, missiles, and ships of any aggressor anywhere on the planet the moment a hostile actor violates or even threatens the territory of another state or entity—or uses military means to disrupt the internal rule of law in his own state.15
The means for paying for this system were to be the "revolution of technological and informational wealth creation," provided the marketplace, and a crackdown on corruption, succeeded in taming the excesses of defense contractors.16 Such explosive expectations of "New Economy" technological and economic growth, and at least as much the idea that they would so thoroughly conduce to the power of the United States relative to other nations, have long since become passé.17

Ultimately, each case has been a reminder of the reality that a single technology is too narrow a base for any nation to retain a preponderant share of the world's wealth and power, and in any event, technological capabilities tend to diffuse. That diffusion is especially prone to be rapid and deep when the technology in question is still evolving and its market growing, and when it is a consumer's rather than a producer's technology. The result is that in the longer run a nation's weight tends toward what is suggested by that bigger complex of factors, from the extent and natural-resource richness of a nation's territory, to the size and education of its population, to the commitment of its government to such policies as make for industrial success or failure, rather than narrow and fleeting technological advantages.

Techno-Hype in the Military Sphere
Each of the factors operative in the civilian sphere is also operative in discussions of technologies designed specifically for military use. However, discussions of military technology here have their own, distinct tendencies, including the exaggerated idea of what a particular technology could do for the party that developed it. Generally, the idea of a technology's capability has far outrun its actuality because of the perceived desirability of that capability. Since World War I in particular there has been a preoccupation with the possibility of new technologies providing a means to quick, cheap, decisive military victory in a major interstate contest. This is exemplified by the '20s and '30s-era futurology regarding "mechanization," specifically the ability of small armored and air forces to substitute for large armies and total war efforts, with Basil Liddell Hart boldly claiming that planes and tanks would end the era of "'absolute war' and . . . its fungus growth—'the nation in arms,'" by enabling swift knockout blows by one power against another.18 Giulio Douhet was still more extravagant, claiming that a fleet of even twenty operational planes put a nation's "aero-chemical arm" in a position "to break up the whole social structure of the enemy in less than a week, no matter what his army and navy may do."19

While governments did not go as far in developing these technologies as some critics might suggest, expectations of the revolutionary nature of mechanization did shape national policies. In the interwar period Britain's leaders invested in the largest air force and most mechanized army of the day.20 Likewise, an expansion-minded Nazi leadership relied on these technologies as the only way its program of European conquests could be plausible, as reflected in its emphasis on the Luftwaffe, and in the development of an armored corps proficient in operating in the manner prescribed by the new theorists.21

When the war actually came armor and air power did indeed revolutionize the battlefield, but they fell far short of that ideal of quick, cheap, decisive victory. The simple reality is that technologies diffuse quickly in the modern world, and with similar speed spur the development of countermeasures that neutralize them.22 Consequently Germany, helped by an implausible combination of a high-risk strategy and a profoundly mismanaged French defense, won a quick, cheap victory in the 1940 Battle of France.23 However, already at that date other armies were taking note and learning to use their mechanized forces in the same offensive manner, or apply similar techniques to the problem of defense. The result was that Germany did not see another comparable victory for the rest of the war, with the failure of German operations to achieve the desired speedy victory in the early phase of Operation Barbarossa condemning it to the long war that virtually guaranteed its defeat.

Likewise, even with air forces vastly larger, better-equipped and deployed in far more sophisticated fashion than anything Liddell Hart or Douhet discussed, strategic bombing was an attritional affair, practiced not just massively but over the long term, with results that were ultimately ambiguous—in part because of the advent of radar, and its use to direct similarly larger and more capable forces of fighter aircraft. As a result Britain withstood prolonged German bombing in 1940-1941, and Germany itself withstood prolonged Allied bombing which, whatever else can be claimed for it, did not secure the promised speedy collapse of German willingness or ability to prosecute the war.

Ultimately it was neither Liddell Hart nor Douhet who was proven right, but rather the less well-known and less heeded Vladimir Triandafillov, who much more astutely realized that the next war would be both a high-tech affair and a mass affair, continuing even longer and taking a greater toll in blood and treasure than that First World War against which they reacted so strongly.24 Indeed, new technologies did not so much supplant earlier methods of war-fighting as increase the stock of weapons, systems and capabilities that a major power needed to fight its war. Armored fighting vehicles did surprisingly little to reduce the need for infantry able to fight dismounted, the advent of man-portable anti-tank weapons actually necessitating the escort of tanks by protective infantry. At the same time air forces did not eliminate the need for armies or navies—the Luftwaffe proving no substitute for a German navy able to land a large invasion force on British soil, while, when the tide turned, the Combined Bomber Offensive did not spare the Allies the need to invade Europe and ultimately take Berlin.

As it went with mechanization, so did it go with virtually every significant later development, because either the reality of the capability fell short of the promise, or because the diffusion of the technology, or the development of countermeasures to it, complicated the matter. The extent to which nuclear weaponry could substitute for other kinds of military power proved greatly exaggerated, witness the reform of the U.S. armed forces in the 1950s, and then the backing away from those changes in the period afterward, as the Soviet Union narrowed the nuclear gap, and the rigidities of a policy based on Massive Retaliation became increasingly apparent—as well as its irrelevance to the situations of insurgency that seemed to be of rising importance.25 Strategic arsenals did not eliminate the need for conventional forces of all the known types (land, air, sea), with their increasingly sophisticated expensive equipment, while the problem of insurgency encouraged the advent of special operations forces for dealing with these as well. For cash-strapped post-war Britain, air mobility, "commando carriers" and tactical nuclear weapons also appeared a cheap way of sustaining a worldwide military presence, an illusion repeatedly shown up until the pretension was abandoned in the withdrawal from "east of Suez."26

Time and again, space-based weaponry has been an object of such hype, particularly after the expectations of extremely cheap, reliable, regular space access created by the space shuttle in the 1970s. While those expectations proved false, the Strategic Defense Initiative renewed the hype in this area in the 1980s, and it endured even after the Cold War (for whose end many erroneously credit the program), resurging yet again around the turn of the century (the technology itself, as well as buoyant expectations of an enduring New Economy boom, key to claims like Peters')—the command of the highest ground of all, again, making major war seem winnable.27 The exaltation of the technologies of the "Revolution in Military Affairs," notably its combination of digital computing and communications with increasingly advanced electronic sensors and precision-guided munitions, has in the view of many analysts led to exaggerated ideas of the capacity of policymakers to substitute them for more expensive and politically problematic commitments of ground forces.28

Hype Resurgent?
There seem two particular reasons for special concern at the present moment. The first is that such hype tends to go through cycles of boom and bust, with a recent bust seemingly being followed by a new boom at the time of this writing. The second would be the stresses of the international environment as it is today, which may leave anxious policymakers more than usually eager to grasp at illusory solutions to their problems.

Boom and Bust and Boom Again?
The late 1990s was a period of particular boom for technological hype, as personal computing, cellular telephony and Internet access became staples of everyday life for the advanced industrial nations' middle classes, and increasingly other groups as well. It was also accompanied by explosive expectations in areas like artificial intelligence, robotics, virtual reality, nanotechnology and genetic engineering, and reinforced by the experience of relatively rapid economic growth in the 1995-2000 period. However, the "tech bubble" burst. The advances in computing and cell phones became increasingly incremental. At the same time other technologies proved less immediately feasible than anticipated, with advances in neural networks proving sluggish, and advances in nano-machinery more elusive than their proponents suggested. Additionally, the spiking of commodity prices, and of energy prices especially, evoked Malthusian catastrophe rather than Wellsian super-technology, and the experience of the "Great Recession" dampened expectations yet again. Rather than explosive technological change, prophecies of ecological doom resurged.

Since then progress in a number of key technological areas, particularly artificial intelligence, but also materials science and renewable energy, has revived expectations of deep, wide, near-term changes, particularly the automation of a great deal of economic activity, with the self-driving car perhaps the most emblematic at the moment. Such developments are already having significant cultural and intellectual effects, evident in the amount of time given to discussing their implications, as seen in the explosion of writing in the press regarding such subjects as appropriate governmental responses to the development of robots years or decades more advanced than anything now in existence. The prospect of mass unemployment as a result of the new technologies, for example, is discussed as if it were already a matter of immediate urgency in pieces by the most orthodox of thinkers running in the most mainstream of publications.29 (Indeed, it can seem that future unemployment resulting from automation is treated more earnestly than the current, actual problem of unemployment or underemployment.) One might well argue that such writers are getting ahead of themselves.

An "Easy" Out?
The second factor to be considered here is that such hype may be especially seductive in a period of constrained resources. This may especially be the case when a power which has been predominant, and particularly given to reliance on high technology, finds its position challenged—as the case of Britain dealing with its own decline in the early and mid-twentieth century strongly suggests.

Today the U.S. is similarly facing a changing distribution of wealth and power. In 1992 the U.S. economy was six times the size of those of Russia and China combined. Today China's economy alone is larger than the U.S.'s when measured in Purchasing Power Parity terms. All of this is increasingly reflected in the hard reality that military capabilities and advantages the U.S. has long held a monopoly on are increasingly within the reach of other states. At the time of this writing, not only has China put its first squadron of fifth-generation (stealth) fighter aircraft into service, but Russia has sent such fighter planes to Syria operationally.30 None of this means that the U.S. has ceased to be the leading military power, or will cease to be that any time soon. It does not even mean that the U.S. has ceased to be the pace-setter for other military powers. However, the combination of quantitative and qualitative superiority it enjoyed in the 1990s is a thing of the past, and unlikely to return. Especially given the extent to which the United States has relied on cutting-edge technology as a basis for military strength, not only in the recent past but historically, and the continued tendency to think of innovation, especially in the area of computers, as somehow uniquely American, the temptation to imagine some leap in weapons technology recovering the old edge may be great.

However, it should also be remembered that the U.S. is by no means the only actor susceptible to such illusions, with virtually every military power of note today in a comparable condition. Russia endeavors to be a world military power in a manner comparable with its Soviet-era stature, but on the basis of only a small fraction of its power base.31 China is by some measures the largest economy in the world today, but it is seeing its growth slow markedly while its per capita income remains 15 to 30 percent of the U.S. level. The same goes even for less ambitious states: European states like Britain, France and Germany, as well as Japan, have become more committed to investing in military power, but from a position of slow economic growth and massive and growing indebtedness.32

Additionally, virtually all advanced nations are today anxious about populations which are aging, and perhaps also a decline of civic militarism, and hoping to substitute money and technology for manpower in this as in other areas. Moreover, none of these states has been averse to technophilia, or technological hype—as Vladimir Putin's widely publicized remarks on the prospect of global dominance going to the nation that leads the way in artificial intelligence make all too clear.33 Rather than revolutionary weaponry overturning the global balance, we should expect continued advances in weapons technology among a circle of powers determined by the possession of more traditional foundations of power, not least scale in population, territory, natural resources, manufacturing base and financial strength.

Conclusions
The problem of technological hype does not admit of tidy policy solutions. Rather the most that can be hoped for is a greater alertness to the problem on the part of analysts, which would be helped by a remembrance of a handful of lessons strongly suggested by any reading of history: that hype is virtually built into the cycle of technological adoption; that the road from laboratory demonstration to practical use, let alone really effective use, is long and uncertain; and that a single technology is too narrow a basis for durable power, economic or military. Certainly the track record for new military technologies delivering swift, cheap, decisive victory in anything resembling a conflict between peers is dismal.

None of this is to deny that technology will continue to advance, that those advances will matter, or that those who fail to keep up will suffer untoward consequences. However, as the experience of the past century, and above all its most calamitous war, demonstrates, the price of acting on excessive expectations of what a technology can do will be just as high.

1 To cite only one of the more obvious metrics, the world's population was only 55 percent urbanized as of 2017, well below the norm for the developed countries. Central Intelligence Agency, "World," CIA World Factbook (Washington D.C.: Central Intelligence Agency, 2018). Accessed https://www.cia.gov/library/publications/the-world-factbook/geos/xx.html.
2 The most notable exponent of this line of argument is perhaps Robert J. Gordon. See Gordon, "Does the 'New Economy' Measure Up to the Great Inventions of the Past?" Journal of Economic Perspectives 14 No. 1 (Fall 2000), 49-74; The Rise and Fall of American Growth: The U.S. Standard of Living Since the Civil War (Princeton, NJ: Princeton University Press, 2016).
3 A search of the New York Times archive in March 2018 showed that where the word "lifestyle" was virtually unused before the late 1960s, its usage increased through the twentieth century, and exploded in the twenty-first. By 2007 it was appearing in the paper a thousand times a year, and has generally stayed near that figure since.
4 For a brief overview of the relevant history, see Nicholas Rescher, Predicting the Future: An Introduction to the Theory of Forecasting (Albany, NY: State University of New York, 1998), 19-33.
5 Charles E. Gannon, Rumors of War and Infernal Machines: Technomilitary Agenda-Setting in American and British Speculative Fiction (Lanham, MD: Rowman & Littlefield, 2003).
6 The quotation is commonly attributed to author William Gibson, who used it on multiple occasions, among them a 1999 appearance on U.S. National Public Radio. See "The Science in Science Fiction," Talk of the Nation, National Public Radio, 30 Nov. 1999. Accessed at https://www.npr.org/templates/story/story.php?storyId=1067220.
7 The tendency is much-noted in the small but growing literature on "short-termism." See Angela Black and Patricia Fraser, "Stock Market Short-termism—An International Perspective," Journal of Multinational Financial Management 12.2 (April 2002), 135-158. One study found that 80 percent of executives would sacrifice R & D for the sake of the "smooth earnings" that are the object of such behavior. John R. Graham, Campbell R. Harvey, and Shivaram Rajgopal, "The Economic Implications of Corporate Financial Reporting," Journal of Accounting and Economics 40 (2005), 3–73.
8 Critics of science fiction actually have a term for such a narrative—John Clute coining the term "Edisonade." See John Clute and Peter Nicholls, The Encyclopedia of Science Fiction (London, Orbit, 1993).
9 A case in point is Kurzweil's predictions for 2009 in his book The Age of Spiritual Machines, noteworthy both for containing many specific, testable forecasts, and the wide attention they have received. Alex Knapp offered a rare, critical examination of those forecasts in 2012, and this was sufficiently novel that Kurzweil himself elected to respond personally to Knapp in his own publication. Alex Knapp, "Ray Kurzweil's Predictions for 2009 Were Mostly Inaccurate," Forbes, 20 Mar. 2012. Accessed at https://www.forbes.com/sites/alexknapp/2012/03/20/ray-kurzweils-predictions-for-2009-were-mostly-inaccurate/#789884b73f9a. Ray Kurzweil, "Ray Kurzweil Defends His 2009 Predictions," Forbes, 21 Mar. 2012. Accessed at https://www.forbes.com/sites/alexknapp/2012/03/21/ray-kurzweil-defends-his-2009-predictions/#95811a94852e.
10 For an example of the tendency on the part of one well-known science fiction author more attentive to these matters than most, see Charles Stross, "Let's Put the Future Behind Us," Antipope.com, 23 Nov. 2014. Accessed at http://www.antipope.org/charlie/blog-static/2006/10/lets_put_the_future_behind_us.html.
11 Correlli Barnett, The Verdict of Peace: Britain Between Her Yesterday and the Future (New York: Macmillan, 2001); David Edgerton, Warfare State: Britain 1920-1970 (Cambridge: Cambridge University Press, 2006).
12 Martin Fackler, "Japan's Chip Makers Search for a Strategy," New York Times, 2 Jan. 2006. Accessed at https://www.nytimes.com/2006/01/02/technology/japans-chip-makers-search-for-a-strategy.html. GDP and manufacturing output data from United Nations, "Per Capita GDP at Constant 2010 Prices in U.S. Dollars," National Accounts Main Aggregates Database, Dec. 2017. Accessed at https://unstats.un.org/unsd/snaama/dnlList.asp.
13 Shintaro Ishihara, The Japan That Can Say No, trans. Frank Baldwin (New York: Simon & Schuster, 1991).
14 See Fackler.
15 Ralph Peters, Fighting for the Future: Will America Triumph? (Mechanicsburg, PA: Stackpole Books, 1999), 200.
16 Peters, 201.
17 Had the U.S. economy grown at its New Economy boom (1995-2000) rate, per capita GDP circa 2016 would have been 20 percent higher. Derived from UN, "Per Capita GDP."
18 Basil Liddell Hart, Paris, or The Future of War (New York: Garland Publishing, Inc., 1927), 85.
19 Giulio Douhet, The Command of the Air, trans. Dino Ferrari (New York: Coward-McCann, 1942), 142.
20 Donald Kagan and Frederick Kagan, While America Sleeps: Self-Delusion, Military Weakness and the Threat to Peace Today (New York: St. Martin's, 2000), 48-49, 56; Edgerton, Warfare State and Britain's War Machine: Weapons, Resources and Experts in the Second World War (New York: Oxford University Press, 2011).
21 For a discussion of the limitations imposed by the German economy's weaknesses on rearmament, see Adam Tooze, The Wages of Destruction: The Making and Breaking of the Nazi Economy (New York: Penguin, 2006).
22 For a discussion of the tendency of weaponry's failure to provide lasting advantage, also see Martin van Creveld, Technology and War: From 2000 B.C. to the Present (New York: the Free Press, 1989).
23 William L. Shirer, The Collapse of the Third Republic (New York: Simon & Schuster, 1969).
24 Vladimir K. Triandafillov, The Nature of the Operations of Modern Armies, trans. William A. Burhans (Portland, OR: Frank Cass, 1994).
25 Frederick W. Kagan, Finding the Target: the Transformation of American Military Policy (New York: Encounter Books, 2006).
26 See Phillip Darby, British Defence Policy East of Suez (London: Oxford University Press for the Royal Institute of International Affairs, 1973); William P. Snyder, The Politics of British Defense Policy 1945-1962 (Columbus, OH: Ohio State University Press, 1964).
27 See Nader Elhefnawy, "Space War and Futurehype," Space Review, 22 Oct. 2007. Accessed at http://www.thespacereview.com/article/984/1. Elhefnawy, "Revisiting Island One," Space Review, 27 Oct. 2008. Accessed at http://www.thespacereview.com/article/1238/1. Elhefnawy, "Space War and Futurehype Revisited," Space Review, 14 Nov. 2011. Accessed at http://www.thespacereview.com/article/1969/1. Elhefnawy, "Why We Fall for the Hype: Contextualizing Our Thought on Space Warfare," Space Review, 26 Mar. 2012. Accessed at http://www.thespacereview.com/article/2052/1.
28 Exemplary of such hopes was the course of action taken by the U.S. and its allies against Yugoslavia in 1999 and Afghanistan in 2001; and Harlan Ullman's "shock and awe" theory.
29 Lawrence H. Summers, "Larry Summers: The Robots Are Coming, Whether Trump’s Treasury Secretary Admits It or Not," Washington Post, 27 Mar. 2017. Accessed https://www.washingtonpost.com/news/wonk/wp/2017/03/27/larry-summers-mnuchins-take-on-artificial-intelligence-is-not-defensible/?noredirect=on&utm_term=.aa2a87408a98.
30 Jeffrey Lin and P.W. Singer, "China's J-20 stealth fighter jet has officially entered service," Popular Science, 18 Feb. 2018. Accessed https://www.popsci.com/chinas-j-20-stealth-fighter-officially-enters-service. Alex Lockie, "Russia thinks its new advanced fighter jet in Syria will scare off other countries — but nobody's afraid of it," Business Insider, 27 Feb. 2018. Accessed http://www.businessinsider.com/russia-su-57-syria-deter-scaring-nobody-2018-2.
31 Circa 1980 the Soviet Union had, according to one estimate, 15 percent of the world's manufacturing output. Today Russia has perhaps 1 percent, about the same as Indonesia or the Netherlands. For the 1980 figure see Paul Bairoch, "International Industrialization Levels from 1750 to 1980," Journal of European Economic History 11 (Fall 1982), Table 9. For the more recent one, see Chris Rhodes, "Manufacturing: International Comparisons," House of Commons Library Briefing Paper, No. 05809, 5 Jan. 2018. Accessed https://researchbriefings.parliament.uk/ResearchBriefing/Summary/SN05809#fullreport.
32 Department of Finance, Canada, Fiscal Reference Tables, Sep. 2017, Table 54. Accessed at https://www.fin.gc.ca/frt-trf/2017/frt-trf-17-eng.asp.
33 David Meyer, "Vladimir Putin Says Whoever Leads in Artificial Intelligence Will Rule the World," Fortune, 4 Sep. 2017. Accessed at http://fortune.com/2017/09/04/ai-artificial-intelligence-putin-rule-world/.

Saturday, February 29, 2020

Just Out . . . The Neoliberal Age in America: From Carter to Trump

As we enter 2020 it seems as if the country's politics are undergoing nothing less than a tectonic shift—one result of which is that the word "neoliberalism" has passed out of the usage of academics, into general parlance. Those trying to make sense of it all find that the market is flooded with public affairs books—but most are longer on political hacks' rants than substance, or too busy telling colorful stories, to offer answers to such obvious and essential questions as

•Just what is neoliberalism anyway? (And why is there so much confusion about this?)

•What did the Reagan administration actually do, and what were the results?

•What was the policy of the Clinton administration, and did it justify its characterization by critics as neoliberal? (Ditto Obama.)

•What was the country's economic record before and after "the neoliberal turn?"

However, THE NEOLIBERAL AGE IN AMERICA: FROM CARTER TO TRUMP systematically examines Federal policy from the 1970s through the Presidencies of Carter, Reagan, the two Bushes, Clinton and Obama, emphasizing specifics and hard data to offer a picture of just what happened in these years as a matter of practical policy, and its consequences—answering these questions and more as we confront this era of crisis, and what may be a historic election this upcoming November.




Available in ebook and paperback formats at Amazon and other retailers.
