Tuesday, March 14, 2023

The Upcoming British Defence Review

The British government was supposed to have a defense review out on March 7--one the more eagerly anticipated because, as its appearance a mere two years after the last, rather dramatic, review indicates (recalling such post-Cold War precedents as the 1994 Defence Costs Study after the monetary-fiscal crisis of the early '90s, and the "New Chapter" to the 1998 Strategic Defence Review after the September 11 attacks), it was expected to address a painful shock, or rather several of them. The protraction of the pandemic, Britain's especially weak economic condition afterward, the worsening of matters by "Trussonomics"--and of course the war in Ukraine, amid a generally deteriorating international scene--make it easy to picture changes coming.

The review, of course, does not seem to have appeared as of the time of this writing (March 9). On March 7, the day it was supposed to actually be out, we were told that it "will be published in the coming days" (around the time of the Spring Budget 2023, which is scheduled for March 15)--against a backdrop of speculation that a dispute within the government over policy and its resourcing had compelled the delay, the Ministry of Defence apparently objecting to being asked to do more without a greater infusion of cash.

We will have to wait until then to see how the government will try to square the circle of its ends and means--or rather, pretend to do so. Since the nineteenth century Britain has been consistently, severely overstretched militarily--the long-range, big-picture history one retrenchment after another. Especially since 1945 the retrenchments have come thick and fast--in part because the cuts to commitments always significantly lagged the cuts to forces, which in turn lagged the cuts to the armed forces' budget, the reduction of which was never in line with the reduction in what Britain could afford. The result was that the situation quickly compelled a new round of adjustments--generally, cuts--of all of the above, each round inadequate in the same way as before. Britain kept requiring its armed forces to do more than was reasonable while being penurious with the forces it did have--and even that penurious spending was more than the British fiscal state, or the British economy on which it stood, could bear (even, one might add, as British budget-cutters inflicted increasing pain on their populations with ever-greater stringency in services and social programs, down through the death rate-elevating 2010-2019 round of austerity), in a vicious circle that just went on and on.

There is not much left to cut these days, with British forces, which have shrunk almost every year since the Korean War, now under 150,000 strong, with an army that cannot boast a single proper heavy division (as against the three armored divisions it had in Germany during the Cold War), and a fleet stretched to the limit merely providing escort to its handful of carriers and nuclear ballistic missile submarines.

How could this be when Britain has been, and remains, one of the world's biggest military spenders? The story is a complex one, but a key part has been British governments' hanging on to certain superpower-type trappings--a nuclear deterrent, a global base network (extending from Alberta to Kenya, from Cyprus to Oman, from the Falklands to Singapore), a relatively robust defense-industrial base that has the country building its own tanks and warships and fancying itself building sixth-generation super-fighters. Surprisingly, recent decades have seen the government bring back things it had admitted it could not afford back in the 1960s--not least the aforementioned big carriers (the seeds of which lie in the 1998 review), and an "east of Suez" orientation (emergent with the "Global Britain" rhetoric and alarums about China, and the essential thrust of the "tilt toward the Indo-Pacific" of the 2021 review). Meanwhile, ever widening the gap between ends and means, Larry Elliott's quips about a Britain with a "fantasy island" economy sinking toward Third World status look less and less like hyperbole all the time. (Crunching the numbers recently it seemed to me that the first industrialized nation is approaching the condition of a developing nation, certainly if the manufacturing-intensiveness of the economy is the standard.)

The more critical, quite fairly, speak of post-imperial "grandeur" prevailing over the government's realistically living "within its means," even as that same government wrecks, decade after decade, the economic foundation providing those means. Still, severe as the situation has become, I do not expect retrenchment or rationalization from the coming review--just as I have no confidence that the current British government, or one headed by its Parliamentary opposition, is likely to shift the country's economic course to any appreciable degree. Rather I expect that the new policy will be grandiosely described, materially overambitious, inadequately resourced, and (the more so for its combination with that dismal economic situation) certain to necessitate overhaul in another very short period of time--likely before 2028. Because in our time the pattern of Establishment decision-making, in all things and everywhere, is "If it's broke, don't fix it."

Saturday, March 11, 2023

Kristin Tate, Artificial Intelligence and the Decline of the White Collar: Thoughts

When I recently suggested that artificial intelligence might displace "white collar" workers before it displaced "blue collar" workers performing manual labor more demanding of mobility and dexterity, I had the impression that I was unlikely to see anyone arguing along similar lines in a mainstream forum anytime soon. Naturally it was a surprise to see Kristin Tate making a version of the case in The Hill.

Considering Ms. Tate's argument I find her very bullish on the prospect--more than I think is warranted. I have simply seen artificial intelligence oversold too extravagantly for too long to be anything but guarded toward the more recent claims for "the coming artificial intelligence economic revolution." (Remember Ray Kurzweil back at the turn of the century? Or how profoundly the hacks in the press misunderstood the Frey-Osborne study from 2013, creating the material for panic out of its highly qualified, rather uncertain conclusions?) I also think Ms. Tate overlooks the important fact that (as Frey and Osborne noted, not that anyone was paying attention to what they actually said) its being technically feasible to do something does not automatically make it worthwhile to do that thing--and so underestimates the economic, cultural and political obstacles to actually using these technologies to substitute for human labor, even were the performance of the technology perfectly adequate from a purely technical standpoint. Ms. Tate, for example, acknowledges that humans might be preferred in face-to-face jobs, and counts a certain amount of legal work among them--but overlooks the extent to which professionals, especially those belonging to the more powerful professions, can be expected to resist their replacement by AI (as was demonstrated in their reaction to the plan to have the first "robot lawyer" plead a "client's" case in a California court back in February).

I might also add that the piece's hint of gloating Schadenfreude toward the college-educated "elite," and its celebration of "blue collar values," especially to one alert to her politics, make it easy to picture her reveling in the anticipated ruin of a group of people she despises--or perhaps simply "trolling" them with the thought of that ruin--rather than exercising critically-minded caution.

Still, the essential argument is not just well worth thinking about but, if we start actually seeing chatbots and other technologies really make their mark, likely to become more commonplace--with the same going for the consideration of their implications.

Thursday, March 9, 2023

Revisiting Chris Hedges' Death of the Liberal Class

Back when it came out I read and reviewed Chris Hedges' Death of the Liberal Class.

More recently I have had occasion to revisit its argument.

As I observed in the initial review, despite the book's title Hedges' focus is so much on liberals' own actions, as against the larger, often highly inimical political scene in which they had to operate, that he tells only part of the story, with that part perhaps more fittingly titled "The Suicide of the Liberal Class." Still, in spite of that close attention to their conduct--and his getting a good deal right, like the way in which liberals consistently ganged up with the right to beat up on the left, and the way in which a condition of permanent war has a corrosive effect on liberalism (with Cold War anti-Communism combining the two)--there were significant limitations to his analysis. In particular, his account of the relevant mechanisms seemed inadequate--I would now say, because of the weaknesses of his understanding of liberalism itself.

Consider "liberalism" as presented in the work of mid-century theoreticians such as Arthur Schlesinger Jr. and Daniel Bell, a key part of whose story was their response to the horrors of the world war era--fascism, the Holocaust, etc. There was a leftist reading of all this--that the whole old order of capitalism, nation-states, and the rest was bankrupt and irredeemable, and something better had to be built. However, there was also a conservative reading--namely that the very attempt to build something better was the cause of the problem. Schlesinger and Bell (among many, many others) inclined to the latter view, with all its implications not only for the socialism that had once attracted them, but for plain and simple liberalism. Liberalism, after all, is founded on the efficacy of reason as a way of understanding and directing social life, and accordingly on the importance of social criticism and the possibility of progress. These "centrists," however, despaired of reason, were uncomfortable with the criticism founded on it, and were pessimistic about progress--with this going so far as a quasi-theological belief in the existence of "evil," and disdain for the mass as a "swinish multitude." Indeed, that particular liberal form of government, democracy, became something with which they were uncomfortable, such that they increasingly endeavored to limit it--insisting on a political discourse stripped of ideology and values and anything else that might call the system into question, or arouse discontented masses, from which has emerged that form of discourse we now speak of as "pluralist" and civil. They also shifted away from that other liberal standard, universalism, toward particularism, with (certainly as seen in the historiography of Daniel Boorstin) the U.S. a country uniquely gifted with a tradition of such discourse in contrast with the ideology-addled Old World.

All of these views--the attitude toward reason, social criticism and progress; the belief in evil and suspicion of the masses; the suspicion of democracy, to the point of eagerness to bound it; the stress on difference over universality--together comprise the classical conservative package to such a degree that centrism must be recognized as an adaptation of classical conservatism to the circumstances of mid-twentieth century America. Given its influence it is fair to say that American "liberalism" traded its old liberal philosophical foundations for conservative ones that increasingly came to define it (as, to his credit, that other mid-century "liberal" intellectual Richard Hofstadter acknowledged in the 1950s).

Of course, these liberals still had their differences with the Taft-MacArthur-Goldwater right, but that did not diminish the fact of their conservatism, rather leaving them emphasizing a different side of the conservative tradition. Theirs was the conservatism which "pruned the tree"--which was prepared to compromise and make adaptations to defend what it deemed essential, as against the more uncompromising, or even reactionary, conservatism that Hofstadter was to call "pseudo-conservatism." And as the Cold War ran its course, as the stalling of the post-war boom compelled a reconsideration of American social arrangements, and as the Communist menace that had been the main reason to compromise disappeared, conservative centrists saw less reason to do so--and moved rightward accordingly.

Accordingly centrism/liberalism's aligning itself with the right against the left was not a matter of the corruptibility of liberals, but their most natural and predictable course of action, especially in the circumstances liberals went on facing over the twentieth century, and into the twenty-first.

Monday, March 6, 2023

On "The Optimistic, Practical, Forward-Looking" View

In discussing the "power elite" the great sociologist C. Wright Mills wrote about what it takes to get to the top--which was not, in his analysis, competence in any of the senses in which believers in the world as some perfect meritocracy insist. Rather what matters is that one is perceived as loyal to the interests and prejudices of those in charge, making one an acceptable subordinate and successor on that level; that one is agreeable toward those to whom one is obliged to be agreeable; and that one is ready to do whatever it takes to get ahead--as with the moral compromises involved in meeting the first two criteria. All of this entails a great many behaviors, among them presenting oneself in a certain way, always speaking "to the well-blunted point" by "mak[ing] the truism seem like the deeply pondered notion," and "soften[ing] the facts into the optimistic, practical, forward-looking, cordial, brisk view"--a view which brushes off hard realities and, rather than finding a genuine bright spot in a dark picture and usefully working with it to set things right in the manner one would hope of a responsible and worthy administrator and leader, usually just "bright-sides" the listener or reader.

I have long found that last trait--that speaking to the blunted point, that softening of the facts into the "optimistic, practical, forward-looking" view--especially repugnant and frustrating. And it seems to be part of the general enshittification of the Internet that when we go looking for explanations and insight that "optimistic, practical, forward-looking" view is constantly inflicted on us instead, precisely because what search engines offer ever more these days is not answers to our questions but crappy products no one wants or needs. After all, a considerable portion of that output consists of the would-be purveyors of advice of the self-help and related varieties, whose principal stock in trade is "bright-siding" you as they insist that whatever problem you are having is fixable with their glib one-size-fits-all prescriptions--or at least pretend to be sure it is, as they really do not care whether that advice fixes anything, for what they really want is YOUR MONEY. Not getting enough readers for your blog (for example)? Well, here's what you must be doing wrong (they just assume), and here is what you can be doing differently (they just assume), but if you really want the whole package, buy this (as clearly they assume some of the people reading such swill will).

More and more of us are despairing of online search as a result--to such a degree that even so Establishment a news outlet as The Atlantic admits that search tools, like that old gray mare, "ain't what they used to be." And it seems to me that if it is indeed the case that Internet search engines as we know them are under serious threat from the latest generation of chatbots ("Did Google order the code red?" "You're Goddamn right it did!") the search engine industry helped make itself vulnerable through the ever-worse quality of its service--rendering itself dispensable through "creative destruction" not of more established products and services, but of itself.

Realfacts, Goodfacts and "Fake News"

In the episode aired as the finale to the fourth season of Babylon 5, "The Deconstruction of Falling Stars," we got a look at the far future beyond the end of the series' story. In line with the Hegelian premise of the show there was a progressive movement within history--but on the way there Earth would see its share of further troubles, not least a period of rule by a totalitarian dictatorship evocative of Orwell's Oceania, with a government that distinguished between "realfact" and "goodfact." Realfacts are actual facts. Goodfacts may or may not be facts at all, but have the approval of those in power--and that government sees those and not the others as the relevant ones.

Hearing particular speakers rave about "fake news" in our time I have found myself wondering again and again whether the standard they have in mind for what is not "fake" is "realfact" or "goodfact." And as it turned out again and again it has been common for them to mean by "fake news" that news which does not align with the "goodfact"--as they demand that search engines, social media and the other infrastructure of contemporary information and discourse suppress what is inconsistent with their version of goodfact.

Beware those who pass off "goodfacts" as "realfacts"--and beware of those who call for censorship generally.

Last Week's Biden Deepfake: A Few Thoughts

I remember how when "deepfakes" started getting press back in 2018 the emphasis--quite predictably given the inability of anything to compete with prurience or identity politics, and still more the combination of both, for the attention of the media elite--was overwhelmingly on the pornographic possibilities of the technology.

I do not say that the misuse and abuse of the technology was or is unimportant. But it seemed to me that there was less attention than there ought to have been to other uses, with this past week providing an excellent example. It seems that a deepfake of President Biden announcing a reinstitution of the draft for the sake of military confrontation with Russia and China "went viral."

If you have checked out the video for yourself you will have probably noticed that it was laughably crude even to the eyes of a non-expert--about as convincing as the "moving mouth" bits on Conan O'Brien.

But we are told that a good many people thought it was genuine.

One may take this as demonstrative of the public's unsophistication in such matters. (Certainly this is what elitist censorship-lovers prefer to emphasize.) However, this does not seem to me the only factor in their reaction. There is, too, the way they are experiencing the bigger world situation. Even those who prefer to attend as little as possible to the international scene have been less able to ignore it than before, and for anyone too young to remember when The Day After aired the international situation in 2023 may simply have no precedent within living memory. A return to the draft in America may remain a remote prospect for the moment, but all the same, people are conscious of the prospect of old-fashioned, great-power war in a way they have not been in generations. Watching the situation escalate and widen as the prices they pay at the grocery store are chalked up to the conflict, they anticipate . . . something, something bad they will feel in their very own lives very soon. What all this means for their opinion about the war--whether they are supportive of it, or not, or shifting from one attitude to the other--is less clear from this reaction. But it does seem worth remembering that people are less likely to be enthusiastic about an armed conflict when they think of it as something that will touch them personally, and take from them personally, rather than remain invisible in their lives as the fighting is carried on solely by people they do not know, in a place they cannot find on the map, as they go about their daily lives merrily oblivious.

Friday, March 3, 2023

On the Prospect of World War Three. (How Do We Know a World War When We See It?)

These days we are hearing in a way we have not in a long time reference to "World War Three"--with many speaking of the Russo-Ukrainian War as potentially such a war, or, like Emmanuel Todd, telling us such a war is already ongoing. However, in judging such claims, which are hugely significant for how we see the world, it is necessary to consider just what one means by "World War Three"--figuring out which requires us to think about just what is a "World War."

It seems to me fair to say that a world war means a war which meets two requirements:

1. The war is "systemic" in nature. That means that the principal actors in an international system are belligerents in a conflict with system-level stakes--that virtually all the great powers are fighting, with the prize going to the victors being dominance of the system. Such dominance gives the victors the scope to realize major goals to which others might be opposed (like claiming a larger sphere of influence), or even to make the rules for the system (like how the international economy is to be managed).

2. That system in question is a world-system, rather than a merely regional one.

In considering these requirements take that favorite example of scholars of International Relations, the Peloponnesian War, fought among the ancient Greek city-states in the late fifth century B.C. To the extent that the alliances led by Athens and Sparta pretty much encompassed that system, while the Persian Empire ruling pretty much the rest of what to the Greeks was the "Known World" eventually came into the conflict, one could call it a systemic conflict. But this clash in a portion of the eastern Mediterranean was a far cry from a planetary-level conflict, as were all the other conflicts of ancient and Medieval times (colossal as some of them, like the wars of Chinese unification and reunification, appear even by today's standards).

The possibility of a system on a world scale, and thus wars on a world scale, only really emerged with the spread of the Western colonial empires from the fifteenth century on. Already in 1478 Spain and Portugal were in a position to fight a sea battle in the distant Gulf of Guinea, and the possibility, and actuality, only grew from there, with their conquests in the Indian Ocean and the Americas seeing European battles in those places in the sixteenth century. Indeed, a conflict like the Nine Years' War (1688-1697) saw the participants in a predominantly West European conflict (mainly the French, Dutch, English and Spanish states and the Holy Roman Empire, fighting over essentially local issues) battling each other as far afield as North and South America, the Caribbean, West Africa and India--and, on the basis of the results, dividing and redividing the world.

Of course, to say that these powers fought battles around the world is a different thing from saying that these battles were fought in wars of the world--those distant clashes still typically highly localized episodes between powers whose positions were limited and by no means secure in those places, and relatively minor in the life of those faraway regions, certainly next to their own local conflicts. Indeed, as the historian Jeremy Black argues in Rethinking Military History, even as inter-European clashes in Indian waters and on Indian soil became increasingly regular and significant, "the Mughal conquest of the Sultanate of Delhi in the 1520s, the Persian invasion in 1739–40 . . . and that of the Afghans in the 1750s" were and long remained "more important . . . than European moves." Likewise, Black reports, the rulers of eighteenth century China were much more concerned with the Dzungars than with the European powers with which they had already clashed over the Amur Valley and Formosa (both of which China successfully recovered in the seventeenth century). And so forth.

However, as the colonial empires went on getting bigger and more powerful and more secure, and technology enhanced not just their connectivity but their reach (with railroads, steamships, telegraphs), those conflicts grew not only in extent but in the intensity with which they were waged, such that one could think of the war that broke out among them in 1914 as truly a "world war." Involving as it did the Austro-Hungarian, Serbian, Russian, German, French, Belgian and British governments from its first three days on, it automatically embroiled polities controlling over half the world's territory and population. Shortly afterwards the Ottoman Empire, Italy, Japan, the United States and China, among many more, also joined in, with the result that nearly all of Eurasia, Africa, North America and Australasia were involved. Of the major world regions only Latin America was left out to any significant extent, though it should be remembered that in 1917 the proposal of a German alliance with a Mexico in which U.S. troops were intervening (the "Zimmermann Telegram") played a part in drawing the U.S. into World War I, while shortly after the U.S. entry South America's largest country, Brazil, also officially entered the war on the side of the Allies. The result was that nearly all of the world's people lived under governments actively fighting in the war, while the extension of the fighting across semi-colonized territories like China before its entry into the conflict, the fighting at sea, the economic dislocations the fighting entailed, and much else, meant that very little of humanity was untouched by the war in one way or another.
Moreover, that war's consequences were vast, with several empires (not least the Austro-Hungarian, German, Ottoman and Russian) broken up, control of much of the world transferred from one power to another (as with the new empires Britain and France carved out of the old Ottoman territory), and efforts made to set up new arrangements for dealing with the world's problems (like the League of Nations). Germany's aspiration to world power was defeated (to the point of Germany being shorn of its colonies, its homeland reduced in size, its economy burdened with reparations and its government disarmed), Britain remained (officially, at least) the leading power, and the U.S. became a significant factor in European affairs in unprecedented fashion. World War Two was, if anything, even more global than the first in its participants and its impact on the world's life--ushering in the U.S.-led world order we have now, with its currency and trading arrangements (the U.S. dollar as the principal currency of international trade in the open system ushered in by the General Agreement on Tariffs and Trade), its institutions (the United Nations, the World Bank, the International Monetary Fund, the U.S.-led network of military alliances like NATO) and much else.

But what about the wars since? Certain observers, especially of the neoconservative bent, characterized the Cold War as "World War Three," and the post-Cold War conflicts against terrorists, "rogue states" and other such parties as "World War Four." Still, that said more about their eagerness to mobilize the public behind the most extravagant pursuit of those conflicts than it did about the reality of those wars. If for the citizens of China, Korea, Indochina and other places the Cold War did not stay cold locally--indeed, millions died in wars that were localized but as intense and brutal as any that had ever been fought--it was never a matter of full-blown war at the systemic level. And the conflicts with terrorist groups, disparate states and the rest were far too diffuse a thing to really be considered a coherent conflict--and also too far removed from the level of the system, as the great powers were not really opposing each other there, for the idea to stand up to scrutiny.

As a result World War Three remained for most observers the war yet unfought.

Of course, if you have read this little lesson in "IR" up to this point you were probably hoping that I would say something about what to make of the current conflict, rather than just treating the theoretical side of the matter, and I will not disappoint you here. Today's great power list includes, at a minimum, the U.S., Russia and China, with Japan and the major European powers (Germany, France, Britain, collectively, at least), and India, also having claims. The current war may be said to involve nearly all of these actors in the conflict in Ukraine to some degree--with Russia fighting a Ukraine whose backing by the U.S.-led, Europe-including NATO alliance is massive and escalating--while Russia has its partnership with China. Meanwhile the conflict of the U.S. and Japan and India with China is similarly intensifying.

It seems to me that NATO's war with Russia escalating to a significant commitment of forces to Ukraine's side in direct contact with their Russian counterparts would at least put the war into World War III territory. The conflict's "World War III" status would become absolutely unambiguous were the European conflict to become linked with China's conflicts with the U.S., Japan and other regional actors, either through China's support of a Russia fighting NATO, or an outbreak of fighting in East Asia itself, a hardly unprecedented development--world wars, after all, typically becoming world wars not on the first day, but as other actors initially outside them find themselves compelled to participate, and differing conflicts merge together in the process. (Initially Germany in northern Europe, Italy in the Mediterranean and Japan in East Asia pursued their imperial ambitions apart from each other. It was only after the fall of France in 1940 that Italy threw its lot in with Germany, and in 1941 that the German attack on the Soviet Union, Japan's attack on Western possessions in Asia and the western Pacific, and the subsequent round of declarations of war, tied the Asian and European conflicts together into one big conflict between the U.S.-British-Soviet-dominated Allies and the German-Italian-Japanese-led Axis.)

By contrast, in the absence of direct, large-scale hostilities between great powers the issue becomes much more ambiguous. Just how do we read NATO's present--not potential or even likely but actual-at-this-moment--provision of aid to Ukraine? How do we read China's backing of Russia now? That is what determines whether we continue to think of World War Three as a significant possibility--or whether, as Todd argues, the conflict has already begun. With even the most basic facts of the matter incomplete and much disputed, I have to admit that there seems to me room for that kind of argument here. However, there is no argument whatsoever about the fact that, if we are not already seeing World War III, we are closer to it than we have been at any time since the 1980s, and maybe even since 1945.

Thursday, February 16, 2023

What The Magnificent Ambersons Can (Also) Teach Us About Technological Change

There is little doubt at this point that the media has quite oversold the progress of automation in our era. (Exemplary is the misrepresentation of the Frey-Osborne study on automation--misrepresentation so thorough that it caused a panic at the time, and that commentators, still repeating what the study did not actually say as if it had, now sneer at it as having got things wrong on those grounds.)

Yet there is the other end of the discourse, with its sneering dismissal of automation. This takes different forms--for instance, the blithe dismissal of the very idea of technological unemployment as if it were some logical impossibility by economists who "function like the caricature of the physicist whose every inquiry begins with 'imagine a perfectly spherical cow of uniform density on a frictionless plane'" (with the fact that this enables them to not worry about unemployment, and the remedies for it they find so deeply distasteful, not irrelevant to their prejudices).

However, what has interested me of late is the way some react to the incomplete or imperfect implementation of automation in places where humans "back up" the bots--filling in gaps, offering corrections, etc.--or where the machine simply takes over parts of the work process as humans see to others. (For instance, one hears of fast-food chains operating experimental outlets where the customer never deals face to face with a human, but humans are "in the back," preparing the food.)

One can plausibly see these situations as a matter of experimentation and refinement on the way to, at least some of the time, producing a more thoroughly automated process. But these commentators commonly react dismissively, pointing to such arrangements as evidence of some function being inherently "unautomatable." Frankly, they often do so in a heckling manner that reminds me of the idiots in The Magnificent Ambersons who, whenever they saw a car broken down by the side of the road, oafishly taunted its driver with the yell "Git a hoss!"

Well, those cars broke down less and less often, giving the hecklers less and less opportunity to yell "Git a hoss!" Instead the taunters found themselves having to "Git a car!" And so it may go here in many a case--with the taunting helping no one at all.

Wednesday, February 15, 2023

What if it's Actually the Other Way Around? What if it's the "Higher-Skill" Jobs That Are Getting Automated First?

The title of this item may seem counterintuitive--not only to those who adhere to the conventional wisdom, but to those who actually know something of the history of technology. The Industrial Revolution, after all, appeared to be about economizing on the use of "muscle" rather than "brain," in part through the mobilization of more brain workers to design and build and operate the machines that replaced the earlier workers, as with, for example, the carding, spinning and weaving machinery of the eighteenth century that revolutionized textile production. However, the appearance is deceptive. The machinery replaced not just the physical effort but the mental effort--the mental skills--the learning--that the human workers put into the carding, the spinning, the weaving.

The latter may well have been the more important, and the trend is all the more evident in this age of computer intelligence. As analysts of automation have long observed, down to Carl Benedikt Frey and Michael Osborne, "perception and manipulation," "finger" and "manual" dexterity, the ability to work in "cramped" spaces and "awkward" positions, have been significant bottlenecks to automation--while we take utterly for granted computers' capacity to perform vast quantities of complex calculations with a speed and accuracy far, far beyond that of the most able human. The result has been that the armies of mathematicians who served as "human computers" (like those brought to public attention by the film Hidden Figures) have been easily and quietly replaced by electronic computers, in contrast with, for example, the janitors who keep the offices clean, and their less appreciated skills.

So it goes with the more recent wave of automation. The imminence of the self-driving car was oversold in the mid-'10s, such that in 2023 few expect to see them anytime soon. But there is great excitement over the capacity of chatbots like GPT-3 to, among much else, write code, to the point that there is much argument over whether coders will become obsolete in the manner of human computers (while still more advanced and capable versions of the bot are expected before even the end of this year). If true this will mean that, in line with the bottlenecks previously discussed, artificial intelligence will have "learned to code" before it has "learned to drive"--fitting the aforementioned pattern all too well.

Granted, in all of these cases this elimination of "skilled" workers only went so far. If human computers were replaced, and coders might be replaced, this depended on persons with other skills conventionally regarded as higher still--the computer scientists who created the electronic computers (while in the former case the rocket scientists the human computers supported remain very much with us). Likewise it is one thing to replace coders, another to replace "higher level" software engineers.

Yet even allowing for all of this the reality still complicates the conveniently hierarchical view so many take of these matters, enough that some rethinking of some widespread assumptions seems warranted. One is just what really should be thought of as constituting "high-level" skills--and the way in which our ideas about the intelligence required for particular forms of work reflect social prejudices (for instance, the tendency to denigrate those who have to move and "use their hands" as against persons in more stationary and less manual jobs). Another is the matter of which jobs will continue to be done longest by humans in an automating world--with the answer the exact opposite of the clichés. "Learn to code," the stereotypical, callous elitist sneers at truck drivers fearful they will be put out of a job--but it may well be that the pool of coding jobs will dry up before the pool of truck driving jobs does. Indeed, those really pessimistic about automated driving may expect that not just the coders but the software engineers will increasingly be put out of a job by improvements in AI even as humans go on driving those trucks. And of course, anyone who takes all this at all seriously should think long and hard about what all this means for glib talk about sending more people to college and more STEM, STEM, STEM! as the pseudo-thinking person's answer to all the woes of the work force.

Will China Follow Japan's Demographic--and Economic--Trajectory?

In 2022 China's population fell for the first time in at least six decades, by 850,000 persons from 1.42 billion the year before.

However, it is worth remembering that China's working-age population (if defined as the 15-64 age category) had already been falling for many years before that--from its peak of 988.3 million in 2015 to 977.1 million by 2021, according to World Bank figures.* This drop of over 11 million is by itself equal to the entire working population of the Netherlands, the seventeenth largest economy in the world.

Moreover, significant as this is in itself, still more significant is what it seems to portend, especially given the obvious point of comparison provided by Japan. That nation saw its own working-age population top out at a bit over 87.1 million in 1994-1995--and then steadily fall. A quarter of a century on, in the pre-pandemic year of 2019, it was down to 74.3 million--and as of 2021 down almost another million, to 73.4 million.

In short, the country's working-age population shrank about 15 percent in 25 years--a fact that has had significant, if generally poorly understood, economic consequences. We have heard much about the country's "lost decades" since the 1990s (to much shameful gloating in certain circles disapproving of its manufacturing orientation and its statist inclinations). But the truth is that, if Japan's economic growth during this period has been a far cry from what it was in its boom decades, when we adjust the numbers for the contraction of its population we find that it actually grew about as fast as the other advanced industrialized countries with which it may properly be compared. That is to say that when we consider how Japan did per-worker it did as well as they did--and perhaps better when we consider the country's continued run of manufacturing success in critical fields like the inputs required for microchip production, or robotics. However, the dwindling of the numbers of those workers (and, one might add, the vagaries of currency values) was sufficient to significantly offset its growth at the aggregate level. And so even as Japan succeeded by some measures, its profile receded in others.
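The per-worker adjustment at work here can be sketched in a few lines of Python. The working-age population figures are the ones cited above; the aggregate growth rate is an illustrative assumption made for the sake of the arithmetic, not Japan's actual figure:

```python
# Japan's working-age population (millions), as cited from World Bank data
pop_1995 = 87.1   # the 1994-1995 peak
pop_2021 = 73.4   # the 2021 figure

decline = 1 - pop_2021 / pop_1995
print(f"Working-age population decline: {decline:.1%}")  # about 15.7%

# Illustrative only: suppose aggregate GDP grew 0.8 percent a year over
# those 26 years (an assumption, not Japan's actual growth rate).
years = 26
aggregate_growth = 1.008 ** years                        # total growth factor
per_worker_growth = aggregate_growth / (pop_2021 / pop_1995)
per_worker_rate = per_worker_growth ** (1 / years) - 1
print(f"Implied per-worker growth: {per_worker_rate:.2%} a year")
```

The point of the exercise: a modest-looking aggregate growth rate conceals a per-worker rate nearly twice as high once the shrinking workforce is factored in.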

Not only is it easy to picture China broadly traveling down Japan's path, but some are predicting exactly that, with China's demographic contraction no less dramatic. As China's overall population falls (perhaps by a hundred million between now and mid-century) its working-age population will likely fall more sharply--with an official Chinese government estimate in 2016 anticipating a fall of nearly a quarter by mid-century, while I would not be surprised to find more recent estimates more aggressive still.

Meanwhile the economic consequences would be even more dramatic. After all, when Japan's demographic contraction got underway the country was not only already solidly "First World," but on the "cutting edge" technologically. By contrast China, with a per capita GDP between a quarter and a third that of the "high income" countries, remains a developing nation. That same fact means that the country, still playing "catch up," can expect to make headway relatively easily for a time, further narrowing the gap with the more industrialized countries--but with the demographic contraction (and the economic demands of an aging population) increasingly offsetting that progress, which was already slowing even before the disruptions of the pandemic. (In the 1978-2011 period China managed a 9 percent a year per capita growth rate, doubling per capita GDP every eight years--with the country managing 10 percent a year in that last decade of 2002-2011. Since then it has not done much better than 6 percent, even when we exclude the pandemic. A comparable drop in the growth rate over the next decade would, of course, see China's progress slow to the weak rates of the more developed countries while, again, still well short of their income.)
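The doubling arithmetic in the parenthetical follows from the standard compound-growth formula, and can be checked directly (using only the growth rates cited in the text):

```python
import math

def doubling_time(annual_rate):
    """Years for a quantity growing at a constant annual rate to double."""
    return math.log(2) / math.log(1 + annual_rate)

print(f"At 9 percent a year: {doubling_time(0.09):.1f} years")  # about 8 years
print(f"At 6 percent a year: {doubling_time(0.06):.1f} years")  # about 11.9 years
```

At 9 percent a year, per capita GDP doubles roughly every eight years, as stated; a slide to 6 percent stretches the doubling time to nearly twelve.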

Of course, what one is talking about (in China's case, at least) is a projection, rather than something that has already happened, and China's government is anxious to alter the country's demographic trajectory--with the conventional wisdom holding that Sweden provides a model for how this could be done, by making work and other aspects of social life more hospitable to working mothers. However, as Elizabeth Bauer observed, this thinking slights the reality. Apart from the fact that Sweden's fertility rate has been pretty consistently below "replacement level" (2-2.1) these past three decades (the average over 1990-2019 was 1.8, and the 2019 figure actually closer to 1.7), Sweden's rate was never so low as China's is now reported to be (1.3 as of 2020). The result is that the distance between where China is and where China's leaders want it to be is greater--while one might add, too, that Sweden never had China's skewed sex ratio to contend with (China, as of 2019, still seeing 113 males born for every 100 females).

Moreover, to the extent that Sweden's social provision was effective it was provided by a society far richer than China's happens to be now.

Thus, along with the gap to be made up being greater, the resources for any effort of this kind are smaller--and I suspect the politics less conducive. China's young people, like their counterparts elsewhere, if in particularly sharp fashion, are caught between the slave-driving work culture celebrated by a Jack Ma (darling of the business press, likely less beloved by China's working people) and the spread of a counter-culture of "lying flat" among young people balking at the absurdity of the rat race. The result is that it will take a lot more than the kind of band-aids that so-called "pragmatic" policymakers and their media cheerleaders prefer to stabilize the country's working-age population--enough so that a sharp contraction along some lines seems much more likely, and figuring out how to manage it perhaps a more appropriate object of their attention than fantasies of avoiding it altogether.

* The Chinese government defines "working-age" in terms of the 15-59 category, reflecting its having set a retirement age of 60, but I am using 15-64 because of its greater convenience for international comparison (while it should be remembered that China's government is openly discussing raising the retirement age, possibly making the 15-59 category less relevant).
