Thursday, June 30, 2022

A Generation On: Clifford Stoll's 1995 Essay on the Internet

Clifford Stoll's 1995 Newsweek piece "Why the Web Won't Be Nirvana" has been the butt of many a joke over the years, but not because of its title. Had Stoll limited himself to merely arguing what his title claims, we might well remember him as having been clearer-eyed than his contemporaries. Had he somewhat more ambitiously argued that the advent of this technology would in itself bring not merely no nirvana, but not even utopia, we might have accorded him yet greater plaudits. And had he, in more nuanced fashion, argued that some of the much-hyped developments might not come for a long time, if ever, he would also have been right, as we are all too aware looking at exactly some of those things of which he was so dismissive, like telecommuting, or the substitution of online for in-person education, or some radical advance for democracy.

However, he was dismissive of the whole thing, not only in the very near term, but, it could seem, in any time frame meaningful to people of the 1990s, and on very particular grounds that seem to me more telling than the prediction itself. While paying the limits of the technology as it stood at the time some heed (noting the sheer messiness of what was online, or the awkwardness of reading a CD-ROM on a '90s-era desktop), he did not stress the limits of the technology as it was then, and likely to remain for some time, even though he could have very easily done so. (What we are in 2022 vaguely talking about as the "Metaverse" was, at the time, widely portrayed as imminent amid the then-insane hyping of Virtual Reality--while what we really had was pay-by-the-hour dial-up in a time in which Amazon had scarcely been founded, and Google, Facebook and Netflix were far from realization.) Nor did Stoll acknowledge the hard facts of economics and politics and power that would, a generation on, see even those bosses who have made the biggest fortunes in the history of the world out of technological hype broadcast to the whole world their extreme hostility to the very idea of telecommuting, or make the Internet a weapon in the arsenal of Authority against Dissent as much as or more than the reverse. (That was not the kind of thing one was likely to get in Newsweek then any more than now.)

Rather what Stoll based his argument on was the need for "human contact," which he was sure the Internet would fail to provide. The result was that where his predictions were correct he was far off the mark in regard to the reasons why (those matters of economics, politics, power), and totally wrong about other points, like his dismissal of online retail and the possibility that it might threaten the business of brick-and-mortar stores, or the viability of online publishing. The truth is that when it comes to mundane tasks like buying cornflakes and underwear convenience, and cheapness, count for infinitely more than "human contact" with the hassled, time- and cash-strapped great majority of us--while where the performance of such tasks is concerned human contact is, to put it mildly, overrated. Indeed, it is often a thing many, not all of them introverts, would take some trouble to avoid. (Do you really love encountering pushy salespersons? Long checkout lines full of rude people? Sales clerks of highly variable competence and personability? For any and all of whom dealing with you may not exactly be the highlight of their own day, one might add?) Indeed, looking at a college classroom in recent years one sees two of his predictions belied, as one is reminded that, while Stoll may indeed be right that "[a] network chat line is a limp substitute for meeting friends over coffee," the average college student much prefers that "limp substitute" to chatting with their neighbors, let alone attending to that instructor right there in the room with them, whom large numbers of them happily replace with an online equivalent whenever this becomes practical.

Thus does it go with other "entertainments." Stoll may well be right that "no interactive multimedia display comes close to the excitement of a live concert," but how often do most people get to go to those? In the meantime the multimedia display has something to commend it against the other substitutes (like the Walkman of Stoll's day). And this is even more the case with his remark that no one would "prefer cybersex to the real thing." After all, the "real thing" isn't so easy for many to come by (even when they aren't coping with pandemic and economic collapse), while even for those for whom it might be an option it seems that not merely cybersex but "love with the virtual" is competitive enough with the real kind to make many a social critic wag their tongue (with, I suspect, what is treated as a Japanese phenomenon today, like the "hikikomori," likely to prove far from unique to that country in the years ahead).

Far more than Stoll, Edward Castronova and Jane McGonigal seem to have been on the right track when writing about how poorly our workaday world comes off next to the freedoms, stimulation and satisfaction of virtuality, especially when we consider what that reality is like not for the elite who generally get to make a living offering their opinions in public, but the vast majority of the population on the lower rungs of the social hierarchy, facing a deeply unequal, sneering world which every day and in every possible way tells them "I don't care about your problems." Indeed, while a certain sort of person will smugly dismiss any remark about how the world is changing with a brazen a priori confidence that things are always pretty much the same, it seems far from implausible that things are getting worse that way (it's hard to argue with a thing like falling life expectancy!), while there is reason to think that the virtual is only getting more alluring, with people actually wanting it more, not less, as it becomes more familiar to them--a familiar friend rather than something they know only from the Luddite nightmares of so much bottom-feeding sci-fi. In fact, it does not seem too extreme to suspect that many have as little to do with the real offline world as they can--dealing with it only because of unavoidable physical necessity, and on terms that do nothing to make it more attractive, only underlining the superiority of the virtual in life as they have lived it.

Wednesday, June 29, 2022

The Pandemic and Automation: Where Are We Now?

In the years preceding the pandemic there was enormous hype about automation, particularly in the wake of the 2013 Frey-Osborne study The Future of Employment. Following the pandemic the effort was supposedly in overdrive.

However, a close reading of even the few items about the matter that serve up actual examples of such automation (like the installation of a voice recognition-equipped system for receiving customers' orders as they pass through some fast food outlet's drive-thru lane) reveals that they are clearer on intentions and possibilities than actual developments--like polls telling us that "50% of employers are expecting to accelerate the automation of some roles in their companies" (never mind how much employment those employers account for, or how serious their expectations are, or what "accelerate" and "some" actually mean here). Meanwhile, when we look at discussion of actualities rather than possibilities, what is rather more prominent is the discontents of the humans. We read of how workers in jobs that have them dealing with the general public face-to-face are burned-out and fed-up, not of how bosses are replacing those workers with new devices--a decade after robot waiters and the like were supposed to be on the verge of becoming commonplace. We read that industrial robot orders are up--but (as, perhaps, we note that the actual number of robots ordered is not really so staggering) we read far more of supposed "labor shortages" than we do of automation filling in the gaps. We also know that, as seen from the standpoint of the whole economy, productive investment--without which no one is automating anything--remains depressed compared with what it was pre-crisis (and remember, the world never really got over that Great Recession), while it also does not seem terribly likely to get much better very soon, with that media darling and icon of techno-hype Elon Musk, even as he promises a humanoid Teslabot by September, publicly raving about a recession just around the corner and preemptively slashing his work force in anticipation--not in the expectation of employing fewer humans, just fewer humans with actual salaries (while those Teslabots do not seem to be part of the story, go figure).

Why do we see such a disparity between the expectations and the reality? A major reason, I think, is that those who vaguely anticipated some colossal rush to automate the economy imagined vast numbers of ready, or nearly ready, systems able to do the job--a natural result of the press tending to imagine that "innovations" at Technology Readiness Level 1 are actually on Level 9, with the truth coming out when push comes to shove, as it so clearly has amid the crisis: the requisite means are not nearly so far along as some would have had us believe. Those observers also underestimated, just as government and the media have generally done, just how disruptive the pandemic was to be--how long the pandemic and its direct disruptions would last, to say nothing of the indirect, and how much all this would matter. In line with the familiar prejudices of the media, lockdowns, strikes and the war in Ukraine get plenty of time and space as sources of economic troubles--but critics of central bank monetary policies get very, very little, with one result that the upsurge in inflation took so many "experts" by surprise. And that inflation, and the tightening of credit that has inevitably followed it, however belated and gradual compared with the talk of a latterday Volcker shock it may be, are hardly the kind of thing that encourages investors. Nor are the unceasing supply chain problems. (If the country can't even keep itself in baby formula, how can it keep itself in the inputs required for a drastic expansion of revolutionary automation?) The result is that those of us watching this scene would do well to take those reports of some drastic increase in the rate of automation with a measure of skepticism.

Tuesday, June 28, 2022

Telecommuting: What We Imagined Once, and Where We Actually Are Now

In his 1980 classic of futurology The Third Wave Alvin Toffler remarked that an era of general "transportation crisis" was upon the world, in which "mass transit systems [were] strained to the breaking point, roads and highways clogged, parking spaces rare, pollution a serious problem, strikes and breakdowns almost routine, and costs skyrocketing." Toffler noted, too, the savings that a turn from in-person commuting to telecommuting might achieve, from the lower energy expenditure of a computer terminal as against a private car (or even mass transit), to the chance to scale down physical facilities as people worked from home, permitting reductions in real estate, utility, tax and other expenditures. Indeed, it seemed to him that sheer market forces would go a long way to doing the trick, while pressure from environmentalists to bring down the ecological footprint of business activity, and government's seeing in a shift to telecommuting potential benefits in ways ranging from the oil import-beleaguered trade balance to the collapsing nuclear family, would easily nudge it further along.

Of course, four decades later it would seem that Toffler could not have been more wrong on this point--with his being so wrong the more striking given that the transportation and energy crises he wrote of did not get better, but only worse and worse; while the proliferation of computing and Internet connections, and the improvements in their performance and price, went far, far beyond anything Toffler anticipated in the near term. The reason is that, whatever those concerned for public issues like the environment, the trade balance or anything else thought about the matter--and certainly whatever employees thought--employers didn't want it, as anyone who understood economic history had to see they wouldn't, because of the simple matter of control. (The Industrial Revolution's shift from the "putting-out" system to the factory, Taylorist time-and-motion study, Fordism--through it all the key to higher productivity and profit has lain in ever-more intricate division and direction of labor conveniently gathered together under management's eye.)

Ultimately that opposition mattered more than anything else. Since that time it was only the shock of the recent pandemic that put the matter as high on the agenda as it has been these last two years, with the scrambling to implement it in the wake of lockdown only underlining how little serious effort government and business made in the direction of figuring out how we could actually make large-scale telecommuting work in all those decades. And now, with government and business discarding such measures as they took to constrain the pandemic, there is a drive on the part of employers to go back to the old ways, with, ironically, those very business leaders who are apt to be most celebrated as technological superheroes by the makers of the conventional wisdom frequently in the vanguard; all of them determined to see that the experiment in telecommuting remains just that, with, of course, the full blessings of government and the media, as media outlets like The Atlantic publish articles with obnoxiously wheedling titles like "Admit It, You Miss Your Commute."

Of course, a good many workers have displayed less enthusiasm for this course than their bosses--fully two-thirds of remote workers declaring that they do not want to return to the office, with the discarding of that trip to and from the workplace that so many experience as the worst part of their day specifically cited among the biggest benefits of staying remote. (No, contrary to what the folks at The Atlantic would have the impressionable believe, they did not miss their commute.) Meanwhile, if the strikes shocking elites the world over are anything to go by (even Britain, the homeland of that icon of union-crushing Margaret Thatcher, is seeing the kind of strikes that the Thomas Friedmans of the world complacently thought safely relegated to the historic past), workers, shocked, stressed, burned-out and all-around less complacent after the pandemic, are asserting themselves in a way they haven't in decades. The result is that while for over a generation employers have seemed to hold all the cards, and never let anyone forget it, one ought not to rush to the conclusion that they will get their way yet again.

Wednesday, June 22, 2022

On the Word "Startup"

When people speak of "startups" what they mean is that someone is "starting up" a business.

But in choosing to speak not of new businesses, or new companies, but instead startups, they make a far grander claim than the establishment of a new enterprise that will, hopefully, produce something of use to others and a living for its employees and a return for its investors.

Instead they promise that here is a technological and commercial revolution!

And I have to admit myself ever more cynical about that heavily used word and its users.

Because, alongside the success stories there are also the innumerable also-rans, many of which never had a reason to be anything but an also-ran. Because so often we have a startup without an actual product or business plan. Because, even if the technology isn't an overhyped fantasy (as it so often is), the reality is that we tend to end up with a single big winner gobbling up the market (a Microsoft, an Amazon, a Google, a Facebook), driving the rest out of business or swallowing them up for comparative pennies. The founders may fantasize that their little firm is the one that will be that winner--but all but the one who actually has the winner are wrong (while even the one who is right can hardly know that at the outset), making what the stupid may call "optimism" or "confidence" really just presumption on a staggering scale. And because so often in back of what is necessarily a misrepresentation are not just illusions and delusions and stupidity, but a good deal of bad faith, and outright criminality. (With Theranos Elizabeth Holmes hoped to be the next Steve Jobs. Instead, if Theranos is remembered it will be remembered alongside Enron--about which Holmes ought to know something, her father, whose influence she traded on all her ultra-privileged life, having been a Vice-President there.)

And as if all that were not enough I am annoyed, too, by the broader understanding of the world tied up with the word, the muddy thinking and outright lies. About, for example, the actual rate of technological change, which has been wildly exaggerated by so-called "experts" throughout my lifetime to a credulous public and credulous investors, who have so often bet big and lost big as a result. About the notion that technological change does not involve Big Science, established firms, government labs and government subsidies; does not involve vast, diffuse global efforts over very long periods of time; but is just nerd-magic done overnight by garage tinkerers aided by venture capitalists. And of course, there is what all this justifies and glorifies, and where it has left us in a 2022 far, far different, and far, far sadder, than the world we would have got had the Silicon Valley hucksters come anywhere near to delivering on the promises about what their "startups" would bring.

What Barry Levinson's Envy Has to Teach Us About Technology

Back in the mid-'00s the "Frat Pack" were the kings of Hollywood comedy, with hits like Old School (the film that gave them their name), Dodgeball, Anchorman and Wedding Crashers. However, alongside these hits there were quite a few less successful films, among them the Barry Levinson-directed Envy, which costarred Ben Stiller and Jack Black.

The film's story revolves around the Jack Black character's making a fortune with a (let us be decorous here) spray that simplifies the task of cleaning up after one's pet. Early in the film, before Black's character gets rich, he is endlessly talking about his "inventions"--which do not actually exist, even in blueprint form--as that spray did not when he was talking about it. As Stiller's character explains to him, what he is talking about is "not an invention" but rather "an idea."

It's a short, simple exchange--but not meaningless, the distinction Stiller's character makes quite important. An idea has not necessarily been proven--one reason why one cannot patent an idea--only an invention, which is a far more developed thing. Alas, tech journalism too little recognizes the distinction--telling us about technologies that may not have moved past the idea stage as if they were already realities, and doing even worse with technologies only scarcely more advanced. To cite but one example, not long ago a scientist put forth a concept for an unprecedentedly swift space vehicle, which was all well and good. What many of those journalists failed to understand, or at least properly acknowledge, was that the whole thing was premised on its being equipped with a TOKAMAK FUSION REACTOR--a technology at best far away from being developed to the point where we can fit one delivering the required performance in such a spacecraft. But they wrote about the thing as if this fusion-powered spacecraft were already being trucked out to the launch pad for its first flight.

Of course, that does not obviate the importance of ideas. The process of invention has to start somewhere, after all--and considering the matter I find myself recalling how much of a believer in the value of ideas in themselves the inventor Hugo Gernsback was, so much so that he saw their promotion as one of the great uses of science fiction when he launched the first science fiction magazine, Amazing Stories, way back in 1926. But all the same, the distinction is an important one, especially when hopes and money start getting involved. And a century after Gernsback made his case it seems to me that we are far less short on ideas for solutions to the world's problems than on the actual inventions that will do the job.

The Pseudo-Environmentalism of Useful Idiots

For a very long time mainstream commentators on the matter of climate change, even when their own inclinations have been toward ecological concern, have indulged the pretension that there was actually a "debate" among climate scientists over whether or not anthropogenic climate change was a reality, and whether it was actually severe enough to demand redress--all of which was false, this a matter of the appearance of controversy manufactured by opponents of action (often by interests fully aware that the problem was real for a very long time before the issue came to public attention).

Those interests continue to do this today--and commentators continue to respectfully call their denial of scientific facts "skepticism"--but less often and less credibly than before, such that, as Michael Mann has remarked, those opposing discussion and redress of the issue are playing a more complex game. Two aspects of that game seem to me to merit particular attention. One is the idea that we should emphasize the consumption choices of individuals (personal "carbon footprint") rather than large-scale action and the programs necessary to bring it about (like the portfolios of utility companies). The other is that the situation has got so bad that there is not much to be done about it now--making calls for action pointless.

Making this issue a matter of individual choices--individual choices on the part of people who mostly can't even pay their bills!--is a strategy that can't work, and was never intended to work. Indeed, assigning all the responsibility to those who have least power (for a start, consumers can only buy what they are offered, within the slight means most of them possess)--railing at them for their alleged crimes while ignoring far, far worse on the part of the powerful, before whom the railers bow and scrape (by all means, don't say a word about what the CEOs of Big Coal, Oil and Gas are doing, but scream at working people for eating a burger)--was intended to make working people think of talk about climate change as an attack on their own meager living standards, already battered by decades of economic stagnation, austerity and falling incomes. Defeatism is worse still, turning people's attention away from the matter of solving the problem altogether--as with the pernicious sniveling about "grief" (to which the sort who take a Jonathan Franzen seriously so readily incline); with fantasies of "adaptation" (as with the notion that anyone who lives anywhere near a body of water or the tropics can simply relocate themselves somewhere more comfortable with the ease of a billionaire deciding which home they'd like to go to on their private jet today); or, given how depressing a world where the defeat has already happened has to be to anyone with the cranial capacity to comprehend the fact, with simply tuning the issue out (if only for the sake of one's own psychological survival).

In both cases the promulgators of such strategies can say "Mission accomplished!"--again, with the help of people who may have had perfectly good intentions but were simply too unsophisticated to avoid becoming "useful idiots" to those pushing the very agenda they think they are fighting.

If one is at all serious about writing about an issue like climate change, a little self-awareness, a little knowledge, is called for. And if I may make a suggestion--if you speak or write about these matters you should take a good look at yourself, think about whether you are making any positive contribution at all, and if you find that you don't make the cut, either take a break and study up, or get out of the way of those who can do a better job.

Friday, June 17, 2022

Revisiting George and Meredith Friedman's The Future of War in 2022

When George and Meredith Friedman's The Future of War: Power, Technology and American World Dominance in the 21st Century appeared in 1996 the book, while certainly getting its fair share of attention, did so in spite of being distinctly unfashionable. After all, the authors' concern in the book was traditional, interstate, great power war, at a moment when, because of U.S. strength relative to any plausible opponent, and the expectation that globalization and other changes would soon make the nation-state and its traditional security agenda less relevant to the life of the world ("We no longer fight wars against countries, we fight them against individuals!" many a fashionable "analyst" enthused), that form of conflict not only seemed remote, but likely to go on becoming only more so over time.

Of course, even at the time many were skeptical of this "conventional wisdom"--for various reasons. The Marxist left, for example, with its stress on capitalism's contradictions, and its theories of imperialism (Luxemburg, Bukharin, Lenin), never bought into all this. However, the Friedmans came from the opposite end of the political spectrum--and stood on a rather doctrinaire, "life never changes" insistence on the continuing validity of old-fashioned International Relations 101 "billiard ball"-model-of-the-international-system realpolitik. In doing so the Friedmans did not venture guesses as to who would be the belligerents in the armed conflicts this realpolitik-based vision of international relations anticipated as virtually certain (the U.S. apart), or the specific objects for which they could be expected to fight, or any of those things which could be concluded from such premises, like when and where such wars could be expected to break out--perhaps chastened by their having made a colossal recent error here. (Just a few years earlier George and Meredith, at the peak of American Japanophobia, warned darkly of The Coming War With Japan.) Instead what the Friedmans emphasized was the long-term evolution of the means for fighting such wars--specifically the shift from massed explosive power and the land, sea and air-based platforms which deliver it (tanks, gun-armed vessels, planes) to precision-guided munitions, highly technologized "Starship Troopers"-style infantry, and eventually space platforms ("battlestars"), making senile and eventually obsolete the systems that dominated the twentieth century battlefield.

As yet we remain far from possessing the technologies central to the Friedmans' vision. And certainly George's more specific political predictions have since been unimpressive. (In his book The Next 100 Years he forecast that China would collapse in the 2010s--and Russia not long after, and that without anything like the stress of the current war the country is fighting, which he also failed to predict, instead anticipating a Paris-Berlin-Moscow axis.) However, if there is much that he has clearly got wrong already, he seems to have been sounder than his more fashionable colleagues in realizing that the illusions of the '90s were only that, illusions, a fact whose fuller implications many are only now beginning to realize.

Wednesday, June 15, 2022

Will the Cost of Living Ever Go Down?

In the twentieth century the economic game was all about getting consumers to spend more. The reason was that by this point businesses were producing significantly more than customers could afford to buy, or were inclined to buy even when they did have the money, raising the problem of how that product was to be moved.

The result was the advent of consumer culture, complete with the advertising that said you had to have the newest and latest--and priciest--or you would, even if prepared to suffer a great deal of personal inconvenience by making do with an "inferior" product, be shunned as a social outcast (while the companies always made sure there was a newest and latest, even if this was entirely pointless). There was the built-in obsolescence that forced the consumer to buy the same thing over and over again (even before the new version came out). And, it might be added, there was some help from the prevailing politics, in which society didn't deal with problems and left individuals to cope as best they could--which is to say, as expensively as they could. (Urban life getting you down? Buy a house in the suburbs. And a car with which to get there.) In line with such a course central bankers made a point of making the borrowing that let consumers spend beyond their means (this, too, was a key part of consumer culture) cheaper than it might otherwise have been, while government "stimulated" the economy fiscally, too, to keep it running smoothly (not least, with those giant military-industrial complexes). And so on and so forth.

The economic growth model I describe here had its heyday in the post-war period--the boom years of the 1950s, the 1960s, the 1970s, when people were consuming more and the economy was growing and they made enough money out of it to consume still more than that. But before the 1970s were out it was clear that this model was no longer delivering the goods, because growth was sputtering out, profits down--while inflation was up too, how about that? As a result the model shifted in a lot of ways. ("Cry havoc, and let slip the dogs of finance!" is what the policymakers of the day said, minus the Shakespeare, which I suspect was well above their heads. Politicians in my lifetime haven't been the most literate bunch.) But the part about getting people to consume more stayed--in spite of those pesky environmentalists who went about saying there were "limits to growth."

For all the sound and fury we never really saw anything like the glory days of the mid-century boom, economic growth-wise. What little growth there was didn't really trickle down the way the reformers promised it would. And things have got much, much worse since, especially after the Great Recession--while that limits to growth talk has never ceased to haunt the conversation. The result is that few are really content with how things are going in 2022--not least a generation showing signs of being really and truly fed up with the associated "rat race."

For all that, it hardly seems that any dramatic change is imminent, but one may wonder nonetheless--is it conceivable that instead of so many putting so much effort into making people spend more and more, we could turn that capacity for INNOVATION! of which we hear so much (but, it can often seem, see so little) toward enabling individuals to enjoy an acceptable standard of living while spending less, consuming less, relative to today? Certainly the folks at the RethinkX think tank see us as on track toward a world where people can have "First World" comfort at "Third World" prices (and orders of magnitude less ecological impact to boot). Premised on innovations like cellular agriculture, Transportation-as-a-Service, and the "printing" of food, clothing and shelter, it can seem as if the essentials for it are things that have been supposed to be near at hand for so long that this will always remain the case--while it is far from clear how well society will realize the potentials they allow even should the technologies themselves become a reality. Certainly the one part of their vision that has advanced most fully, the extreme cheapening of the price of computing and digital communications, has not quite played out the way the optimists of a generation ago hoped--while, even more than before, we live in a society which, rather than solving a problem, tells people to just "live with it." Those problems include a historic upsurge of inflation. Yet it seems that the failure gives us all the more reason to think that much harder about what the conventional wisdom would have us write off as impossible.

Friday, June 10, 2022

Beyond Lithium-Ion Batteries

Everyone with any interest in renewable energy as part of the solution to the energy-climate crisis is probably sick of hearing it--the renewables-basher's sneer that "The sun doesn't shine and the wind doesn't blow all the time" (as if everyone didn't already know that!).

They are likely sicker still of the renewables-bashers' inflicting on them the conclusion they derive from this--namely that renewables cannot and will never supply more than a fraction of our energy. So it's fossil fuels yesterday, fossil fuels today, fossil fuels tomorrow, fossil fuels forever! Sorry/not sorry, hippies! We wash our Sport Utility Vehicles with your tears!

Of course, this argument is a weak one that has been getting weaker all the time. After all, renewables have been getting cheaper--so much cheaper that even equipped with battery storage (also getting cheaper) they are becoming competitive with natural gas, even without special government favors being shown to renewables (even as natural gas, along with the rest of the fossil fuels sector, enjoys past, accumulated largesse, and continued favors, on an immense scale conveniently overlooked by those who whine endlessly about tax credits for solar and wind and the like). Indeed, the RethinkX think tank has made a fascinating case that, assuming these price drops continue a little longer, it will be the cost-effective thing to build renewables up to the level of "Clean Energy Super Power," with local surpluses of SWB (Solar-Wind-Battery)-based energy making electricity as cheap as bandwidth.

In fairness, battery storage is not without its difficulties. The lithium-ion batteries that remain the go-to type, after all, rely on rare minerals concentrated in a handful of conflict-ridden regions (like the lithium of Bolivia and Afghanistan, the cobalt of the Congo) where they are mined in conditions which are brutal for the workers and damaging to the local environment, while there is the additional ecological problem of what to do with the batteries at the end of their useful lives. And the renewables-bashers, of course, make the most of this, too (applying to environmental effects and working conditions the same double standard they apply to such matters as tax credits; after all, they forget just how many products the same can be said of, the fossil fuels to which they are so loyal included). However, besides being sanctimonious in the extreme, they are also wrong about these evils being necessary costs of any attempt to shift the world's energy base. It is far from being the case that lithium-ion constitutes the only option for electrochemical batteries. Quite a number of alternatives based on abundant, low-cost materials, capable of delivering the requisite power and energy density, are in development--with the obstacles falling away regularly enough (cobalt-free batteries not only exist but comprise a growing share of Tesla's production, while Samsung and Panasonic are moving beyond the stuff as well) that, in contrast with so many areas where tech writers hawk baseless optimism, there seem to be grounds here for the expectation of working technical solutions.

Meanwhile, electrochemical batteries are far from being the sole electricity storage option in every area. Notable here is gravity-based storage, which entails raising a weight to a given height using some of the electricity amassed, and then dropping it, releasing its potential energy. Pumped hydroelectric is a well-established type of such storage, used since the nineteenth century in hydroelectric power operations--and in fact its usability as a storage method in connection with solar and other non-hydro forms is likewise well-established. More novel, we are seeing increased interest in the use of towers like those being built by a Swiss energy storage firm (which utilize concrete blocks in similar fashion)--a method which is already becoming economically competitive. Such forms of storage may not get a car from A to B--but even in a world where the hopes now set on more sustainably sourced batteries fall short of present expectations, their enhancement of the viability of a renewables-powered electric grid will in itself greatly reduce the problem of supplying the batteries keeping an electrified transport system running.

Thursday, June 9, 2022

Why I'm Sick of Hearing the Term "Carbon Footprint"

Not long ago I argued that the time had long passed when simply screaming about climate change did any good--that those who wanted to make a positive contribution had to talk about solutions.

The piece, unsurprisingly, wasn't terribly popular so far as I can tell--and I was reminded in what response I got that where talking solutions is concerned here people have a tendency to set the bar very low indeed, in a watery way saying things like "The solutions are all around us" or "The solutions are in us" or "The solutions are in the choices we make."

Ah, but where around us and where in us and in which choices? That is what would elevate this past the level of banality--and the real bar that this discussion has to clear, and which so little of it does. And what does get specific makes matters still worse. As climate scientist Michael Mann made clear in an interview in Scientific American that I think ought to be required reading for anyone seriously interested in the climate crisis, the opponents of meaningful action on climate change have gone to great lengths to shift the discussion away from the actions of large and powerful entities like corporations and governments to individuals--not least via that invention of the folks at BP, "carbon footprint."

You likely recall the old adage that "with great power comes great responsibility" (most likely from watching Spider-Man, though it is no less valid for that). Reducing everything to personal carbon footprint, and while we are on it ignoring the limited means and limited options of the vast majority of the planet (even in the First World U.S. a third of the public cannot meet a mere $400 emergency out of their own means), is yet another case of the opposite--of assigning all the responsibility to those who have none of the power (or so close to it as makes no difference), and treating them like eco-criminals for failing to accomplish the impossible task set for them. The overwhelming evidence is that the public as a whole recognizes the problem--and wants something done about it, as it shows again and again not just in the polls but at the ballot box--but this brand of pseudo-environmentalism is setting the effort back rather than advancing it, precisely because of the ways in which it alienates the very public to which it is presuming to appeal.

Wednesday, June 8, 2022

A Financial Singularity? The Stock Market Bubble of the 1990s

During the stock market boom of the late twentieth century the capitalization of the stock market grew more than 150 percent between 1994 and 1999 alone--a growth wildly disproportionate to the growth of the underlying economy, even in that period of historically brisk expansion. (Where the stock market capitalization-to-GDP ratio stood at a mere 71 percent in 1994 it was over 153 percent in 1999, more than twice as high, even while U.S. Gross Domestic Product grew by better than a fifth, even after adjustment for inflation.)

It was the conventional wisdom among at least a significant section of financial "experts" that this was some wonderful new normal, with the surge in asset values to somehow continue for a good long time to come, these apparently trying to outbid each other for the public's attention with ever-higher predictions of how high the Dow Jones average would go within the next several years. (Dow Jones 36,000! Dow 40,000! Even Dow 100,000!)

Assuming anything at all but the market's having been released from the laws of economic gravity, this was a big bet on just how well the "real" economy was going to do in the years ahead amid the euphoria over computing, the Internet and related technologies and the possibilities some claimed to see in them--one sees how much so when one thinks about what it would have meant had the economy lived up to the investor expectations implied in those stock prices. Suppose, for example, that the stock market's capitalization had grown at that rate for the next two decades, and the real economy had fallen no further behind the growth of the stock market's capitalization than it was in 1999.

That would have meant a roughly 20 percent a year real economic growth rate for the next two decades, and a nearly forty-fold expansion of the U.S. economy, producing a U.S. GDP of some $700 trillion in today's dollars. Alas, U.S. GDP in 2019 was, in today's terms, more like $24 trillion--a mere thirtieth that sum. And long before the disparity could grow so stark the bubble went bust, just a few months into 2000.
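For the curious, the arithmetic can be run in a few lines. (A minimal sketch in Python, using only the rounded figures quoted above as inputs--an illustration of the compounding rather than a precise calculation.)

```python
# Back-of-the-envelope check of the figures above, using the post's rounded numbers.

cap_growth_1994_1999 = 2.5                     # capitalization up "more than 150 percent" over five years
annual_rate = cap_growth_1994_1999 ** (1 / 5) - 1
print(f"Implied annual growth rate: {annual_rate:.1%}")       # roughly 20% a year

expansion = (1 + annual_rate) ** 20            # that rate sustained for two decades
print(f"Twenty-year expansion: {expansion:.0f}-fold")         # nearly forty-fold

implied_gdp = 700e12                           # ~$700 trillion implied 2019 GDP, in today's dollars
actual_gdp = 24e12                             # ~$24 trillion actual 2019 GDP, in today's terms
print(f"Actual GDP as a share of the implied figure: 1/{implied_gdp / actual_gdp:.0f}")  # about 1/29
```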

Looking back it is impossible to picture what those decades of 20 percent a year growth would have looked like, with the same going for their somehow producing a country thirty times richer than it is today. In fact, it does not seem an exaggeration to characterize the situation as one of financial and economic singularity--which brings to mind that other Singularity that Ray Kurzweil said so much about in 1999. Something like that technological Singularity would seem the only way in which such a financial boom could have proven a winning bet--such that it seems we can speak of Wall Street's behavior giving the impression that Kurzweil's Singularity really was imminent.

Tuesday, June 7, 2022

What Role Might Superconductors Play in the Energy Transition?

Superconductivity has been in the news quite a bit these past couple of years, in large part because of a major breakthrough in 2020--namely the observation of room-temperature superconductivity for the first time in history. Of course, this occurrence was in a lab, under extremely specific and difficult circumstances (with the material put under pressure equal to over two thousand times the pressure at the bottom of the Mariana Trench). Still, if only usable in very special circumstances, the fact remains that room-temperature superconductivity is a proven physical reality, and a great many are watching the progress in this field toward superconducting materials that can work in everyday conditions with interest.

A major reason has been the pursuit of a more efficient electric grid. Of particular importance is the higher density of current that superconducting materials can carry relative to those presently in use. As a result generators using superconducting coils produce larger and stronger magnetic fields, extracting more power from a given amount of current--with one result that lighter, more compact generators can deliver the same power as heavier, larger units. When made of a superconducting material, wires of a given width transmit up to five times as much electricity as their copper equivalents, and do so with far less loss over long distances. And the storage of electricity in batteries using superconducting materials likewise diminishes the problem of losses, yielding additional efficiencies.

All of this can permit a more efficient exploitation of any energy source, but seems especially helpful in compensating for the intermittency of renewables that has, thus far, slowed the improvement of their cost advantage over fossil fuels and nuclear. Practical experiments have already demonstrated the possibilities of squeezing more power out of windmills equipped with superconducting magnets of given sizes. Superconducting materials' potential for lowering the cost of long-distance power transmission enables them to better connect sun and wind-rich areas with others where demand may outweigh what is reliably available at hand, or simply provide a convenient back-up if demand goes up or local power generation goes down. (Renewables-bashers love to sneer that the sun doesn't always shine and the wind doesn't always blow, but at any given time the sun is probably shining and the wind blowing somewhere, and superconductivity goes a long way to making transmission across those distances cost-effective.) Meanwhile, in contrast with fossil fuel-based power generation, renewables in particular would benefit from superconductors' usefulness in storing electricity itself. (Indeed, it is already the case that superconductor-equipped storage is being used on a small scale for the sake of evening out grid fluctuations--while an argument has been made for the plausibility of equipping windmills and photovoltaic banks with their own superconducting storage units.)

Altogether such possibilities mean that, even if superconductors get much less attention than other technologies, progress in this area may yet play an important role in the energy transition--and warrant that much more interest on the part of observers looking to make it work, especially if they have the long run in mind.

Monday, June 6, 2022

What Ever Happened to Superconductors?

Cold fusion and fifth-generation computers were among those technologies that in the 1980s were supposed to be on the verge of changing everything--but over three decades on have amounted to pretty much nothing.

In the same years one also heard a great deal about superconductors, specifically materials which, under appropriate conditions, cease to resist the passage of electrical current, so that it can flow absolutely without loss--becoming, as the name indicates, super conductors. That implies the possibility of enormous efficiencies in a very great deal of what we do with electricity--which can seem just about everything, with the list getting longer all the time.

In considering the publicity afforded the concept in the 1980s one should note that the concept was not new even then. The phenomenon of superconductivity was first observed way, way back in 1911. However, prior to the '80s the known superconductors only worked at extremely low, near-absolute zero temperatures--which meant that they required enormous amounts of energy for refrigeration (especially with electricity passing through them and heating them up). This, of course, left them with little practical use--while achieving better than that was thought not only an engineering difficulty but a theoretical impossibility. What made superconductors seem newly relevant was the discovery of a ceramic (lanthanum barium copper oxide) that could work as a superconductor at relatively high temperatures. (I stress relatively, because the '80s-era discovery meant superconductors operating at 90 Kelvin--which is about three hundred degrees below zero for those of us using the Fahrenheit scale.)

That may not seem very promising, but it did arouse expectations about the rate of progress in the field (there were fantasies that "superconductor supremacy" was going to very soon mean world economic supremacy)--which soon proved rather exaggerated. Still, the research effort continued, and happily, so did progress, with the use of different materials enabling superconductivity to be achieved at higher and higher temperatures until, two years ago, physicists actually achieved superconductivity at "room temperature" (in fact, at 58 degrees Fahrenheit, the average temperature in Bergen, Norway, in July and August), garnering significant attention back in 2020.

What has received less attention in the coverage aimed at a non-specialist audience has been the specific circumstances of the achievement of that superconductivity. The superconductor in question (a mix of hydrogen, carbon and sulfur) worked because it was under a pressure of 270 gigapascals--a figure more often mentioned than explained. Those unfamiliar with that unit of measurement should know that it is equivalent to well over 2.6 million times sea level atmospheric pressure, or the pressure under about 16,000 miles of water--which is to say, more than two thousand times the submarine hull-squashing pressure at the bottom of the Mariana Trench.
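For readers who want to check those conversions themselves, here is a minimal sketch (in Python; the physical constants are standard rounded values I am supplying for illustration, not figures from the original coverage, so the outputs only approximately match the rounded numbers above):

```python
# Unit conversions behind the figures discussed above.

def kelvin_to_fahrenheit(k):
    """Convert a temperature in Kelvin to degrees Fahrenheit."""
    return k * 9 / 5 - 459.67

print(f"90 K = {kelvin_to_fahrenheit(90):.0f} F")        # about -298 F, i.e. roughly 300 below zero
print(f"287.6 K = {kelvin_to_fahrenheit(287.6):.0f} F")  # about 58 F, the 2020 "room temperature" result

pressure_pa = 270e9                                 # 270 gigapascals
atmosphere_pa = 101_325                             # sea-level atmospheric pressure in pascals
print(f"{pressure_pa / atmosphere_pa / 1e6:.2f} million atmospheres")    # ~2.66 million

seawater_column_m = pressure_pa / (1025 * 9.81)     # p = rho * g * h, seawater density ~1025 kg/m^3
print(f"~{seawater_column_m / 1609:.0f} miles of seawater")              # on the order of 16,000-17,000 miles

mariana_pa = 1.086e8                                # ~108.6 MPa at the bottom of the Mariana Trench
print(f"~{pressure_pa / mariana_pa:.0f} times Mariana Trench pressure")  # about 2,500
```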

As this shows, researchers in the field have traded one set of extreme conditions (cold) for another (pressure), so much so that those who imagined from the press reports that commercially useful room-temperature superconductors were imminent may, as is so often the case when looking more closely at pop science stories that make us think a technology at Technology Readiness Level 1 is already up at Level 9, find this a damp squib. But all the same, it is undeniably a breakthrough, proving that room-temperature superconductivity is, at least, possible, and perhaps yielding insights into how it might be achieved in less extreme conditions--while, for what it is worth, work has begun on making those superconductors work at lower pressures than that.

Moreover, it would be a mistake to think that this means that superconductors have amounted to as little as those other technologies previously mentioned have done to date. If without much fanfare, superconductors have already entered a wide variety of practical, everyday uses, with the most significant, perhaps, being Magnetic Resonance Imaging (MRI) machines: seventy percent of those installed worldwide use superconducting magnets to enable more rapid and comprehensive scanning of the patient. And in that we have a reminder of something else, namely that even short of room-temperature superconductivity the technology is being put to practical use, with another breakthrough previously thought an impossibility--a superconductor through which electricity flows in only one direction--opening the door to the use of the technology in computing to produce microprocessors hundreds of times faster than those operating today. Of course, the refrigeration requirements make our seeing this in consumer devices anytime soon implausible--but the head of the research team which made the breakthrough has himself argued for its possible applicability to server farms and supercomputers. If true, this could well prove revolutionary enough in itself.

Are Those "Spreading Awareness" About Climate Change Aware of What Kind of "Awareness" They Are Spreading?

While thinking about the problem of climate change in recent years I have found myself increasingly concerned with the consequences of so many commentators relentlessly promulgating the bleakest possible view of the situation. These think, or at least give the impression of thinking, that they are "promoting awareness" and somehow contributing to resolving the problem. In fact many, maybe most, are simply promoting defeatism and despair.

Why do they do what they do?

I suspect that they don't understand, or don't want to understand, how politics really works, how and why things do and do not get done. Unable to give the public reasons for hope, they put all their energy into exercising the other option for moving it, fear. In spite of the ample evidence that the public already knows all about the problem, and has long been anxious for something to be done about it--so anxious that it is literally sick over it--and time and again elects politicians who promise to do something about it (even if those "leaders" break every promise), these commentators tell themselves that there must not be enough fear out there, and keep doing it over and over again expecting a positive result. Encouraging them in this terribly problematic course is the evident, enormous self-satisfaction persons of weak and unserious mind derive from inflicting disaster porn-riddled jeremiads on the public.

Naturally they never think of the possibility that they have exhausted the usefulness of fear, perhaps a very long time ago, and that continuing to use fear, at least in the manner they have been doing, has become counterproductive; that past a certain point fear can simply make people shut down rather than acting; that rather than screaming alarums they now face the more difficult yet totally indispensable task of explaining frankly and seriously why society has so miserably failed to meet the problem and think seriously and frankly of how it can stop failing and lend their voices to whatever proposals might redress the issue; and that if they are not up to the task (as persons of such caliber generally are not) that they are only getting in the way of those who might be.

Friday, June 3, 2022

Centrism: A Primer

We hear the word "centrist" tossed about a lot--but little about what it really means.

If you want a fuller explanation, supporting everything said here in great detail, to the point of having twenty-five pages of single-spaced endnotes attached, you can go here.

If you want the short version, just keep reading.

Simply put, centrism--certainly in the sense in which we use the term in the U.S.--isn't just middle-of-the-roadness, even if it overlaps with middle-of-the-roadness much of the time, or at least seems to do so. Rather this outlook can more usefully be characterized as classical conservatism updated for a society where liberal institutions have replaced those of the Old Regime. In line with that conservatism centrists take a dark view of human nature, and are pessimistic about the ability of human beings to rationally understand, direct, "engineer" society and its course. They are especially doubtful about the wisdom and goodness of the "common" man or woman--their ability to understand the issues, and to act rationally when they enter onto the political stage. This leaves them comparatively fearful of and hostile to societal change, especially when that change comes "from below." Instead they favor leadership by an elite able to use its trained judgment, for which they regard no substitute as existing.

However, the twentieth century is also not the eighteenth. As stated previously the feudal-agrarian world of the classical conservative has given way to a capitalist and democratic society, which is the form of life they are stuck with, and stuck with defending. All this being the case, if no lovers of 1789, it is 1917 that haunts them, and against which they define themselves. Thus they accept the fact of a democracy with universal suffrage and liberal rights like freedom of speech--but believe that democracy can only safely operate on very specific lines, keeping its politics "civil" and "pluralist."

What does this mean? It means that people check "ideology"--structured views of what the world is like, how it works, how to operate in it--at the door when they enter into the political arena. They do not raise the matter of how society is structured, who has advantage and who does not, what is right or wrong (much of which they regard as beside the point because of the uncertainties of social life in light of their epistemological doubts, and because they hold that in a liberal society power is so diffuse among voters and consumers that no one really has power over anyone else, for example, corporations against workers or consumers). Instead the practitioner of a centrist politics thinks of society as a plurality of interests, which they assume to all be equally legitimate so long as they abide by those rules in regard to ideology. These interests, within this arena, compete for support and negotiate among themselves in a process advised and guided by experts regarded as objectively treating of value-free facts, for the sake of preventing societal conflicts from escalating to a society-destabilizing degree--or, put more positively, the maintenance of "consensus."

Of course, all that said centrism has tended to embrace particular positions over time. In the mid-twentieth century centrism was for a defensive, containment-oriented anti-Communism in foreign policy, for the New Deal at home (if not necessarily enthusiastic about extending it), for the civil rights movement (in its moderate form), as against a right represented by figures like Barry Goldwater which took a still harder line against Communism (not containment, but "rollback"), sought a return to the nineteenth century with respect to government involvement in the economy, and opposed the civil rights movement as an infringement on the rights of lower levels of government (and not necessarily just on those grounds). Later in the century the ascent of the right (identifiable with Ronald Reagan, who succeeded where Goldwater failed) and other factors (the end of the Cold War, globalization, etc.) saw centrism move a long way to the right on key issues, becoming more like the neoconservative right in its foreign policy, and trading in the New Deal for neoliberalism. Its record in regard to the country's cultural conflicts seems a different thing. Still, it shifted away from a leftishly universalist civil rights movement in favor of a very different identity politics (which the right and many others characterize as "left" but which is readable as very much of the right in its premises, more Maistre than Martin Luther King).

Looking back, it seems to me that this version of the center had its heyday in the '90s, when Bill Clinton's administration solidly established the Democratic Party's identification with it in office, governing as it did along these lines, while for the time being the prospect of great power conflict appeared on the wane in a world where Lexuses mattered more than olive trees, and it seemed to many (whether viewing the fact positively or not) that "political correctness" was inexorably in the ascendant. Since then this political vision has faced far more challenge, exemplified by the country's polarization through the twenty-first century--by the contested election of 2000, by the Iraq War and the general expansion of U.S. military involvement in the Middle East, by a succession of economic crises (the tech boom's going bust in 2000, the inflationary energy crisis of the '00s, the Great Recession), by the more recent pandemic, by the escalation of culture war and identity politics, and so much else. In the face of it all the center has thus far generally stuck to its turn-of-the-century positions (partied like it's 1999, so to speak), with the Democratic Party's leadership and officials doing so even as their electoral base has shifted leftward--but it may well be that under the multitude of conflicting pressures centrism will adapt yet again.

Thursday, June 2, 2022

A Note on What Ideology Means

When we speak seriously of ideology--of liberalism, conservatism and so forth--we are speaking of a philosophy which addresses fundamental questions about the human condition, and on the basis of the answers it offers to those questions, the problems of economic, political, cultural and social life. Arguably the most important of those questions are:

1. What can we know about the world, and especially the human, social world?
2. What are human beings like--individually and collectively, in society?
3. Given what we know about human beings, what should we consider to be society's goals?
4. If we think that society should be something other than what it is, can we change it for the better? Would the potential gain outweigh the risks?
5. How far can we rely on reason in changing our social arrangements--our economic system, our political system, our culture--for the better?

Conservatism, liberalism, and the rest all have very specific answers to these questions, which determine how they address specific political questions. For now let us stick with conservatism and liberalism, in the classical sense of each of those terms, which retain some usefulness from this vantage point (even as much else may have changed).

Conservatism takes a dark view of human nature (think Thomas Hobbes), and is pessimistic about the applicability of reason to society. This leaves conservatives more concerned with keeping human badness in check than with, for example, achieving a society affording its members greater freedom, justice or equality, which generally seem to them unrealistic aspirations in the circumstances. Thus they think that the prospects of change for the better are very dim, while tending to regard the social arrangements that have emerged over time, "organically" in response to specific situations--what is often called "tradition," and where following tradition in doctrinaire fashion does not settle the matter, judgments by an elite respectful of tradition based on its own personal, practical experience--as likely to be superior to any human "plan." (As the foundational Joseph de Maistre argued in his Considerations on France, a person can grow a tree, but they cannot "make" a tree--and so it is with a society in the conservative's view.)

Liberalism takes a different view of these matters, seeing human nature as a broader, more flexible thing than conservatives give it credit for being--not the timeless, unchanging, unchangeable (and nasty) thing conservatives conceive it as, but something substantially formed by circumstances. (The liberal John Locke characterized the human mind as a blank slate at birth in his Essay Concerning Human Understanding.) They also have a higher opinion of the capacities of human reason--and therefore see room for better, much better, than we have been doing up to this point, and with that, much more scope existing for a freer, fairer world than history has known. Indeed, they may regard the exercise of reason for the sake of creating a better set of social arrangements as not merely desirable and possible, but obligatory, given that their starting point for thought about society is an individual they regard as having inalienable rights, not least to freedom. They may even regard such change as a practical necessity, for their reason tells them time and again that the world changes, and the "old ways" often fail to meet the new demands it throws up. (Consider, for instance, the interrelated matters of nationalism, militarism, war. The conservative does not see such things going away any time soon, but the liberal points to them as having ceased to be tolerable in an age of globalization, and of nuclear weapons.)

Of course, confronted with this tiny, tiny bit of philosophy 101 many snap "People don't use the word like that!" And certainly most people don't--in part because there has been some awkward shuffling of labels (conservatives having been forced to reckon with liberalism, liberalism having bifurcated into more conservative and more radical versions, etc., etc.) creating a fair amount of superficial confusion. However, more important than any such confusion is the fact that so few thought about the matter long enough to be confused by it; that very few of those who identify as "conservative," "liberal," or anything else have ever considered the questions discussed here at all, let alone in any great depth. All the same, there seem to me to be two rejoinders to their dismissive attitude:

1. Their using the terms in a shallow, unthinking, politically illiterate way does not make those who use the terms in the ways long established in political philosophy and political science somehow incorrect. (To suggest otherwise is more than saying "My ignorance is as good as your knowledge." It is saying "My ignorance is better than your knowledge.") If anything, there is a far better case for the matter being the other way.

2. The more casual, ill-informed usages often turn out to be more consistent with the deeper ones than people generally realize. While people reduce labels like "conservative" or "liberal" to responses to particular hot-button issues about which they may be speaking emotionally rather than intellectually, the conservative or liberal position tends to reflect those deeper philosophical assumptions just discussed here (even where the person in question has never considered the issue on that level). Thus does it go with such a matter as gender (e.g. gender roles, gender identity, reproductive rights, sexual freedom), with the conservative inclining to the traditional practice, and the liberal or radical seeing more scope and reason for change, on the basis of what they rationally judge to be fair and right, and in line with the demands of human rights, including freedom.

The result is that the vulgarian snapping "People don't use the word like that!" has probably done so plenty of times without even knowing it.

Wednesday, June 1, 2022

Has the Aircraft Carrier Had its Day?

Just this month I was surprised to see a piece in Vanity Fair titled "'Floating Pointlessness': Is This the End of the Age of the Aircraft Carrier?"

The question has been asked again and again for decades--indeed, since the end of the Second World War when the atom bomb and guided missiles cast doubt on the value of nearly every kind of weapons platform then in existence.

Still, the matter is rarely raised in such a forum as Vanity Fair--a fact reflecting the question's increasing salience, which owes to three developments:

1. The extent to which states that had, due to economic constraints and the post-Cold War mood, limited their investment in long-range power projection systems are now pouring money into them. (Thus Japan has bought four heli-carriers, and converted two of them into F-35-carrying attack carriers. Meanwhile Germany seems to be taking an interest.)

2. The resurgence of great power conflict--which means that rather than those carriers simply being used in environments and actions where the opposition had little capability to threaten them (as with every war in which the U.S. has used such carriers since 1945) there is a rising prospect of violent clashes between major navies which may possess significant means for threatening or even neutralizing carriers.

3. The advent of new anti-shipping weapons that may significantly increase the risk to carriers. These include land- and air-based anti-ship ballistic missiles and hypersonic cruise missiles, against which such ships may be without effective defense (with Russia and China at the forefront of their development). They also include quieter conventional submarines that may be quite able to slip past the most robust anti-submarine protection.

All of this, of course, would seem to be coming to a head in the sharp escalation of conflict between Russia and the West in the wake of open, full-blown interstate war between Russia and Ukraine--a war that may have no precise parallel since 1945 (even when one includes the break-up of Yugoslavia, which so much of the media seems to have totally forgotten about). This is all the more the case because of two incidents during that conflict:

1. The first combat use of hypersonic missiles (even if it has been solely against land targets, so far, with questions raised about the weapons' accuracy).

2. The incident foregrounded in the Vanity Fair article, namely the sinking of the Russian cruiser Moskva--the single biggest warship loss since at least the 1982 Falklands conflict, and, if one excludes the sinking of the Argentine cruiser General Belgrano in that conflict, the biggest since 1945. This is particularly significant because, at least to go by the version of the incident standard in the American press, the Ukrainian navy accomplished the feat with a pair of Neptune anti-ship missiles--relatively short-range, subsonic missiles far less threatening than any working hypersonic weapon--which nonetheless took out a large ship bristling with anti-air sensors and weapons (including ten surface-to-air missile launchers firing navalized versions of the SA-8 and SA-10 SAMs, and a half dozen close-in weapons systems). The result is that even in the absence of cutting-edge ASBMs, hypersonic cruise missiles and the like, large vessels already seem to be more vulnerable than has generally been acknowledged.

Considering the matter I find myself referring back to George Friedman's discussion of the issue in The Future of War, which considered the viability of carriers and other such systems in the age of the guided missile. He wrote of them as senile rather than obsolescent--which is to say that these systems, ever more endangered, required ever more protection and so yielded, in striking power and other ways, less return on investment, as seen when one considers the extent to which carrier air wings are devoted to fending off threats to the carrier rather than attacking, and the necessity of large numbers of very heavily equipped, sophisticated escorts to make carriers survivable in a hostile environment. In 2022 the carrier would seem to be that much further along that trajectory--albeit without any real substitute available. (As yet ship-launched cruise missiles, at least when conventionally armed, still fall far short of the striking power of a supercarrier's air wing.) The result is a reminder of just how much more unbelievably costly and dangerous modern war keeps on getting--so much so that Ivan Bloch, even more than Friedman, is once more the better guide to where we have found ourselves in this regard.

Just don't expect anyone abiding by the conventional wisdom to take the lesson.
