Ever since the announcement of the AUKUS partnership and the Australian nuclear submarine program it would make possible, there has been speculation about just how Australia would actually be supplied with such submarines--or at least, what the initial plan for supplying them would be. In the wake of the joint statement by U.S. President Joseph Biden, British Prime Minister Rishi Sunak, and Australian Prime Minister Anthony Albanese at Naval Base Point Loma in San Diego, that period of speculation is over.
Simply going by the official transcript of the three leaders' statement--while distilling their all too characteristically rambling, repetitive and public relations-laden rhetoric into hard fact--one gets the following:
Australia will buy three Virginia-class nuclear-powered (but conventionally-armed) boats starting in the early 2030s, with an option on two more. These acquisitions will then be followed by a British-designed next generation of likewise nuclear-powered but conventionally-armed boats incorporating British, Australian and U.S. technology that both Britain and Australia will use, and which will "share components and parts with the U.S. Navy," to the point of communicating "with the same equipment"--the "AUKUS SSN." Australia will build its copies of those AUKUS boats domestically, and sustain them domestically. They will also be commanded by the Australian Navy as a "sovereign capability"--apart from Australia's reliance on imported nuclear fuel, as it will not produce its own (that capacity, of course, necessarily entailing the capacity to produce fissile material, which would be a breach of its non-proliferation commitments). The first of those boats will be delivered in 2042, with new deliveries every three years on, and an ultimate goal of eight boats.
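For a sense of how far out that schedule runs, here is a minimal sketch of the stated plan, taking the joint statement's figures (first delivery in 2042, a new boat every three years, eight boats in all) at face value--the schedule itself being, of course, only a plan:

```python
# A small sketch of the stated AUKUS SSN delivery schedule: first Australian-built
# boat delivered in 2042, a new boat every three years thereafter, eight boats in all.
first_delivery = 2042
interval_years = 3
fleet_goal = 8

delivery_years = [first_delivery + interval_years * i for i in range(fleet_goal)]
print(delivery_years)   # [2042, 2045, 2048, 2051, 2054, 2057, 2060, 2063]
print(f"Eighth boat delivered around {delivery_years[-1]}")
```

On those terms the eighth and last boat would not arrive until the early 2060s--some four decades after the San Diego announcement.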
In the meantime Australian personnel have been undergoing "nuclear power training" in the U.S., while "[b]eginning this year, Australian personnel will embed with U.S. and UK crews on boats and at bases in our schools and our shipyards," and U.S. and British nuclear submarines will make more port visits to Australia, on the way to the establishment of "a rotational presence of U.S. and UK nuclear-powered subs in Australia to help develop the work force Australia is going to need to build and maintain its fleet" later in the decade.
Rounding out the vision of tri-national cooperation, there is also the expectation that Britain and Australia's sovereign but identically equipped and closely collaborating fleets ("[o]ur submarine crews will train together, patrol together . . . maintain their boats together"), along with the similarly linked U.S. fleet, will go a long way toward realizing an ideal not only of partnership but of efficiency-achieving homogeneity, even as each country retains control of its own assets.
One may add to this that these figures, particularly Australian PM Albanese, also made much of the submarine program from an "industrial policy" standpoint, claiming that "[t]he scale, complexity, and economic significance of this investment is akin to the creation of the Australian automotive industry in the post-World War Two period."
Of course, what we are hearing reported in the press now is not limited to what was presented in the official statement. There are questions about just how "sovereign" the Australian boats really will be. (As things stand, Australia's nuclear industry is virtually nonexistent--the capacity to build and service such boats as yet nonexistent, and its actual establishment yet to be seen.) There is also the matter of the immensity of the resources invested. Where the cost of the program over the longer haul is concerned, we are hearing figures in the $250 billion (U.S. dollars) range for the next three decades. This sum is about equal to 15 percent of Australian GDP today (circa $1.6 trillion), and equal to some $4 trillion in the case of an economy the size of that of the U.S.--exceeding any comparable expenditure the U.S. has made. Even spread out over three decades, with the tendency of price tags like these to rise set aside, and generous allowance made for the possibility of economic growth aided by the "stimulus" of the program, there is no getting away from the sheer size of the bill relative to the country's resources, with all that this implies. (Even those who are supportive of such high military expenditure may, in fact, wonder if this is really the best way of getting "value for the money.") And of course there is the geopolitical rationale for the program. The sole reference in the entire transcript to China was mention of the U.S. having "safeguarded stability in the Indo-Pacific for decades to the enormous benefits of nations throughout the region, from ASEAN to Pacific Islanders to the People’s Republic of China." Yet there would be no discussion of such a project at all were it not for the concern among the country's policymakers, and the governments of its allies, about the possibility of conflict with China--and not all are sanguine about a militarized course. (Indeed, many in Australia, and elsewhere, are convinced that confrontation is the wrong path--however much news outlets like the Sydney Morning Herald, with its "Red Alert" series, insist otherwise.)
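For anyone who wants to check the rough arithmetic behind that comparison, here is a minimal sketch using the approximate figures cited above; the U.S. GDP figure of roughly $25 trillion is my own assumption for the scaling, not a number from the statement:

```python
# Back-of-the-envelope comparison of the reported AUKUS program cost to GDP.
# Figures are the rough ones cited above (circa 2023, U.S. dollars), not official numbers.
aukus_cost = 250e9        # reported long-haul program cost, ~US$250 billion over ~30 years
australia_gdp = 1.6e12    # approximate Australian GDP
us_gdp = 25e12            # approximate U.S. GDP (assumed here purely for the scaling)

share_of_gdp = aukus_cost / australia_gdp
us_equivalent = share_of_gdp * us_gdp

print(f"Program cost as a share of one year's Australian GDP: {share_of_gdp:.1%}")
print(f"Equivalent outlay for an economy the size of the U.S.: ${us_equivalent / 1e12:.1f} trillion")
```

Run as written this gives a share of about 15.6 percent and a U.S.-scaled equivalent of roughly $3.9 trillion--consistent with the figures quoted above, though of course no more precise than the inputs.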
The result is that if the joint statement answered some questions, a great many others remain to be decided--the more so in that a very great deal can happen over the very long time frame assumed in this program, which, even if everything goes entirely as planned and scheduled, will not deliver the first Australian-built AUKUS sub to the country's navy for two decades.
After all, just how much resemblance does the world of 2023 bear to the expectations of two decades prior?
Friday, March 17, 2023
Tuesday, March 14, 2023
The Upcoming British Defence Review
The British government was supposed to have a defense review out on March 7--one the more eagerly anticipated because, as its appearing a mere two years after the last, rather dramatic, review indicates (such quick follow-ups have been rare in the post-Cold War era, recalling the 1994 Defence Costs Study after the monetary-fiscal crisis of the early '90s, and the "New Chapter" added to the 1998 Strategic Defence Review after the September 11 attacks), it was expected to address a painful shock, or rather several of them. The protraction of the pandemic, Britain's especially weak economic condition afterward, the worsening of matters by "Trussonomics"--and of course, the war in Ukraine, amid a generally deteriorating international scene--make it easy to picture changes coming.
The review, of course, does not seem to have appeared as of the time of this writing (March 9). On March 7, the day it was supposed to actually be out, we were told that it "will be published in the coming days" (around the time of the Spring Budget 2023, which is scheduled for March 15)--against a backdrop of speculation that dispute within the government over policy and its resourcing was likely to compel such a delay--the Ministry of Defence apparently objecting to being asked to do more without a greater infusion of cash.
We will have to wait until then to see how the government will try to square the circle of its ends and means--or rather, pretend to do so. Since the nineteenth century Britain has been consistently, severely, overstretched militarily--the long-range big-picture history one retrenchment after another. Especially since 1945 the retrenchments have come thick and fast--in part because the cuts to commitments always significantly lagged the cuts to forces, which in turn lagged the cuts to the armed forces' budget, the reduction of which was never in line with the reduction in what Britain could afford. The result was that the situation quickly compelled a new round of adjustments--generally, cuts--of all of the above, each inadequate in the same way as before, Britain requiring its armed forces to do more than was reasonable, while being penurious with the forces it did have, and even that penurious spending more than the British fiscal state or the British economy on which it stood could bear (even, one might add, as British budget-cutters inflicted increasing pain on their populations with ever-greater stringency in services and social programs, down through the death rate-elevating 2010-2019 round of austerity), in a vicious circle that just went on and on.
There is not much left to cut these days, with British forces, which have shrunk almost every year since the Korean War, now under 150,000 strong, with an army that cannot boast a single proper heavy division (as against the three armored divisions it had in Germany during the Cold War), and a fleet stretched to the limit merely providing escort to its handful of carriers and nuclear ballistic missile submarines.
How could this be when Britain has been, and remains, one of the world's biggest spenders? The story is a complex one, but a key part has been British governments' hanging on to certain superpower-type trappings--a nuclear deterrent, a global base network (extending from Alberta to Kenya, from Cyprus to Oman, from the Falklands to Singapore), a relatively robust defense-industrial base that has the country building its own tanks and warships and fancying itself building sixth-generation super-fighters. Surprisingly, recent decades have seen the government bring back things it had admitted it could not afford back in the 1960s--not least the aforementioned big carriers (the seeds of which lie in the 1998 review), and an "east of Suez" orientation (emergent with the "Global Britain" rhetoric and alarums about China, the essential thrust of the "tilt toward the Indo-Pacific" of the 2021 review). Meanwhile, ever-widening the gap, Larry Elliott's quips about a Britain with a "fantasy island" economy sinking toward Third World status look less and less like hyperbole all the time. (Crunching the numbers recently, it seemed to me that the first industrialized nation is approaching the condition of a developing nation, certainly if the manufacturing-intensiveness of the economy is the standard.)
The more critical, quite fairly, speak of post-imperial "grandeur" prevailing over the government's realistically living "within its means," all as the same government wrecks, decade after decade, the economic foundation providing those means. Still, severe as the situation has become, I do not expect retrenchment or rationalization from the coming review--just as I have no confidence that either the current British government or one headed by its Parliamentary opposition is likely to shift the country's economic course to any appreciable degree. Rather I expect that the new policy will be grandiosely described, materially overambitious and inadequately resourced, and (the more so in combination with that dismal economic situation) will necessitate overhaul in another very short period of time--likely, before 2028. Because in our time the pattern of Establishment decision-making, in all things and everywhere, is "If it's broke, don't fix it."
Saturday, March 11, 2023
Kristin Tate, Artificial Intelligence and the Decline of the White Collar: Thoughts
When I recently suggested that artificial intelligence might displace "white collar" workers before it displaced "blue collar" workers performing more mobility- and dexterity-demanding manual labor, I had the impression that I was unlikely to see anyone arguing along similar lines in a mainstream forum anytime soon. Naturally it was a surprise to see Kristin Tate making a version of the case in The Hill.
Considering Ms. Tate's argument I find her very bullish on the prospect--more than I think is warranted. I have simply seen artificial intelligence oversold far too highly for far too long to be anything but guarded toward the more recent claims for "the coming artificial intelligence economic revolution." (Remember Ray Kurzweil back at the turn of the century? Or how profoundly the hacks in the press misunderstood the Frey-Osborne study from 2013, creating the material for panic out of their highly qualified, rather uncertain conclusions?) I also think Ms. Tate overlooks the important fact that (as Frey and Osborne noted, not that anyone was paying attention to what they actually said) its being technically feasible to do something does not make it automatically worthwhile to do that thing--and so underestimates the economic and cultural and political obstacles to actually using these technologies to substitute for human labor, even were the performance of the technology perfectly adequate from a purely technical standpoint. Ms. Tate, for example, acknowledges that humans might be preferred in face-to-face jobs, and counts a certain amount of legal work among them--but overlooks the extent to which professionals, especially those belonging to the more powerful professions, can be expected to resist their replacement by AI (as was demonstrated in their reaction to the plans to have the first "robot lawyer" plead a "client's" case in a California court back in February).
I might also add that the hint of gloating Schadenfreude toward the college-educated "elite," and the celebration of "blue collar values"--especially to one alert to her politics--make it easy to picture her reveling in the anticipated ruin of a group of people she despises, or perhaps simply "trolling" them with the thought of their ruin, with that impulse coming in ahead of critically-minded caution.
Still, the essential argument is not just well worth thinking about but, if we start actually seeing chatbots and other technologies really make their mark, likely to become more commonplace--with the same going for the consideration of their implications.
Thursday, March 9, 2023
Revisiting Chris Hedges' Death of the Liberal Class
Back when it came out I read and reviewed Chris Hedges' Death of the Liberal Class.
More recently I have had occasion to revisit its argument.
As I observed in the initial review, despite the book's title Hedges' focus is so much on liberals' actions, as against the larger, often highly inimical political scene in which they had to operate, that he tells only part of the story, with that part perhaps more fittingly titled "The Suicide of the Liberal Class." Still, in spite of that close attention to their conduct--and his getting a good deal right, like the way in which liberals consistently ganged up with the right to beat up on the left, and the way in which a condition of permanent war has a corrosive effect on liberalism (with Cold War anti-Communism combining the two)--there were significant limitations to his analysis. In particular his address of the relevant mechanisms seemed inadequate--I would now say, because of the weaknesses of his understanding of liberalism itself.
Consider that "liberalism" as presented in the work of mid-century theoreticians such as Arthur Schlesinger Jr. and Daniel Bell, with a key part of their story their response to the horrors of the world war era--fascism, the Holocaust, etc.. There was a leftist reading of all this--that the whole old order of capitalism, nation-states, and the rest, was bankrupt and irredeemable and something better had to be built. However, there was also a conservative reading of all this--namely that it was trying to be build something better that was the cause of the problem. Schlesinger and Bell (among many, many others) inclined to the latter view, with all its implications not only for the socialism that had once attracted them, but plain and simple liberalism. Liberalism, after all, is founded on the efficacy of reason as a way of understanding and directing social life, and accordingly the importance of social criticism and the possibility of progress. However, these "centrists" instead despaired of reason, and uncomfortable with the criticism founded on it, because they were pessimistic about progress--with this going so far as a quasi-theological belief in the existence of "evil," and disdain for the mass as a "swinish multitude." Indeed, that particular liberal form of government, democracy, became something which they were uncomfortable, such that they increasingly endeavored to limit it--insisting on a political discourse stripped of ideology and values and anything else that might call into question the system, or arouse discontented masses, from which has emerged that form of discourse we now speak of as "pluralist" and civil. They also shifted away from that other liberal standard, universalism, toward particularism, with (certainly as seen in the historiography of Daniel Boorstin) the U.S. a country uniquely gifted with a tradition of such discourse in contrast with the ideology-addled Old World.
All of these views--the attitude toward reason, social criticism and progress; the belief in evil and suspicion of the masses; the suspicion of democracy, to the point of an eagerness to bound it; the stress on difference over universality--together comprise the classical conservative package to such a degree that centrism must be recognized as an adaptation of classical conservatism to the circumstances of mid-twentieth century America. Given its influence it is fair to say that American "liberals" traded their old liberal philosophical foundations for conservative ones that increasingly came to define American liberalism (as, to his credit, that other mid-century "liberal" intellectual Richard Hofstadter acknowledged in the 1950s).
Of course, these liberals still had their differences with the Taft-MacArthur-Goldwater right, but that did not diminish the fact of their conservatism, rather leaving them emphasizing a different side of the conservative tradition. Theirs was the conservatism which "pruned the tree"--which was prepared to compromise and make adaptations to defend what it deemed essential, as against the more uncompromising, or even reactionary, conservatism that Hofstadter was to call "pseudo-conservatism." And as the Cold War ran its course, the post-war boom stalled (compelling a reconsideration of American social arrangements) and the Communist menace that had been the main reason to compromise disappeared, conservative centrists saw less reason to do so--and moved rightward accordingly.
Accordingly centrism/liberalism's aligning itself with the right against the left was not a matter of the corruptibility of liberals, but their most natural and predictable course of action, especially in the circumstances liberals went on facing over the twentieth century, and into the twenty-first.
Monday, March 6, 2023
On "The Optimistic, Practical, Forward-Looking" View
In discussing the "power elite" the great sociologist C. Wright Mills wrote about what it takes to get to the top--which was not, in his analysis, competence in any of the senses in which believers in the world as some perfect meritocracy insist. Rather what matters is that one is perceived as loyal to the interests and prejudices of those in charge, making them an acceptable subordinate and successor on that level, essentially agreeable toward those whom they are obliged to be agreeable, and ready to do whatever it takes to get ahead--as with the moral compromises involved in meeting the first two criteria. All of this entails a great many behaviors, among them presenting themselves in a certain way, always speaking "to the well-blunted point" by "mak[ing] the truism seem like the deeply pondered notion," and "soften[ing] the facts into the optimistic, practical, forward-looking, cordial, brisk view," which brushes off hard realities and rather than finding a genuine bright spot in a dark picture and usefully working with that to set things right in the manner one would hope of a responsible and worthy administrator and leader, usually just "bright-sides" the listener or reader.
I have long found that last trait--that speaking to the blunted point, that softening of the facts into the "optimistic, practical, forward-looking" view--especially repugnant and frustrating. And it seems to be part of the general enshittification of the Internet that when we go looking for explanations and insight that "optimistic, practical, forward-looking" view is constantly inflicted on us instead, precisely because what search engines offer ever more these days is not answers to our questions but crappy products no one wants or needs. After all, a considerable portion of that consists of the would-be purveyors of advice of the self-help and related varieties, whose principal stock in trade is "bright-siding" you as they insist that whatever problem you are having is fixable with their glib one-size-fits-all prescriptions--or at least, pretend to be sure, as they really do not care whether that advice fixes anything, for what they really want is YOUR MONEY. Not getting enough readers for your blog (for example)? Well, here's what you must be doing wrong (they just assume), and here is what you can be doing differently (they just assume), but if you really want the whole package, buy this (as clearly they assume some of the people reading such swill will).
More and more of us are despairing of online search as a result--to such a degree that even so Establishment a news outlet as The Atlantic admits that search tools, like that old gray mare, "ain't what they used to be." And it seems to me that if it is indeed the case that Internet search engines as we know them are under serious threat from the latest generation of chatbots ("Did Google order the code red?" "You're Goddamn right it did!") the search engine industry helped make itself vulnerable through the ever-worse quality of its service--rendering itself dispensable through "creative destruction" not of more established products and services, but of themselves.
Realfacts, Goodfacts and "Fake News"
In the episode aired as the finale to the fourth season of Babylon 5, "The Deconstruction of Falling Stars," we got a look at the far future beyond the end of the series' story. In line with the Hegelian premise of the show there was a progressive movement within history--but on the way there Earth would see its share of further troubles, not least a period of rule by a totalitarian dictatorship evocative of Orwell's Oceania, with a government that distinguished between "realfact" and "goodfact." Realfacts are actual facts. Goodfacts may or may not be facts at all, but have the approval of those in power--and that government sees those and not the others as the relevant ones.
Hearing particular speakers rave about "fake news" in our time I have found myself wondering again and again whether the standard they have in mind for what is not "fake" is "realfact" or "goodfact." And as it turned out again and again it has been common for them to mean by "fake news" that news which does not align with the "goodfact"--as they demand that search engines, social media and the other infrastructure of contemporary information and discourse suppress what is inconsistent with their version of goodfact.
Beware those who pass off "goodfacts" as "realfacts"--and beware of those who call for censorship generally.
Last Week's Biden Deepfake: A Few Thoughts
I remember how when "deepfakes" started getting press back in 2018 the emphasis--quite predictably given the inability of anything to compete with prurience or identity politics, and still more the combination of both, for the attention of the media elite--was overwhelmingly on the pornographic possibilities of the technology.
I do not say that the misuse and abuse of the technology was or is unimportant. But it seemed to me that there was less attention than there ought to have been to other uses, with this past week providing an excellent example. It seems that a deepfake of President Biden announcing a reinstitution of the draft for the sake of military confrontation with Russia and China "went viral."
If you have checked out the video for yourself you will have probably noticed that it was laughably crude even to the eyes of a non-expert--about as convincing as the "moving mouth" bits on Conan O'Brien.
But we are told that a good many people thought it was genuine.
One may take this as demonstrative of the public's unsophistication in such matters. (Certainly this is what elitist censorship-lovers prefer to emphasize.) However, this does not seem to me the only factor in their reaction. There is, too, the way they are experiencing the bigger world situation. Even those who prefer to attend as little as possible to the international scene have been less able to ignore it than before, and for anyone too young to remember when The Day After aired, the international situation in 2023 may simply have no precedent within living memory. A return to the draft in America may remain a remote prospect for the moment, but all the same, people are conscious of the prospect of war--old-fashioned, great power war--in a way they have not been in generations. Watching the situation only escalate and widen, as the prices they experience in the grocery store are chalked up to that conflict, they anticipate . . . something, something bad they will feel in their very own lives very soon. What all this means regarding their opinion about the war--whether they are supportive of it, or not supportive of it, or shifting from one attitude to the other--is less clear from this reaction, but it does seem worth remembering that people are less likely to be enthusiastic about an armed conflict when they think of it as something that will touch them personally, and take from them personally, rather than be invisible in their lives as the fighting is carried on solely by people they do not know in a place they cannot find on the map as they go about their daily lives merrily oblivious.
Friday, March 3, 2023
On the Prospect of World War Three. (How Do We Know a World War When We See It?)
These days we are hearing in a way we have not in a long time reference to "World War Three"--with many speaking of the Russo-Ukrainian War as potentially such a war, or, like Emmanuel Todd, telling us such a war is already ongoing. However, in judging such claims, which are hugely significant for how we see the world, it is necessary to consider just what one means by "World War Three"--figuring out which requires us to think about just what is a "World War."
It seems to me fair to say that a world war means a war which meets two requirements:
1. The war is "systemic" in nature. That means that the principal actors in an international system are belligerents in a conflict with system-level stakes--that virtually all the great powers are fighting, with the prize going to the victors being dominance of the system. This means the victors have scope to realize major goals to which others might be opposed (like claiming a larger sphere of influence), or even to make the rules for the system (like how the international economy is to be managed).
2. The system in question is a world-system, rather than a merely regional one.
In considering these requirements take that favorite example of scholars of International Relations, the Peloponnesian War fought among the ancient Greek city-states in the late fifth century B.C. To the extent that the alliances led by Athens and Sparta pretty much encompassed that system, while the Persian Empire ruling pretty much the rest of what to the Greeks was the "Known World" eventually came into the conflict, one could call it a systemic conflict. But this clash in a portion of the eastern Mediterranean was a far cry from a planetary-level conflict, as were all the other conflicts of ancient and Medieval times (colossal as some of them, like the wars of Chinese unification and reunification, appear even by today's standards).
The possibility of a system on a world scale, and thus wars on a world scale, only really emerged with the spread of the Western colonial empires from the fifteenth century on. Already in 1478 Spain and Portugal were in a position to fight a sea battle in the distant Gulf of Guinea, and the possibility, and actuality, only grew from there with their conquests in the Indian Ocean and the Americas seeing European battles in those places in the sixteenth century. Indeed, a conflict like the Nine Years' War (1688-1697) saw the participants in a predominantly West European conflict (mainly the French, Dutch, English, Spanish and Holy Roman Empires, fighting over essentially local issues) battling each other as far afield as North and South America, the Caribbean, West Africa and India, and on the basis of the results, dividing and redividing the world.
Of course, to say that these powers fought battles around the world is a different thing from saying that these battles were fought in wars of the world--those distant clashes still typically highly localized episodes between powers whose positions were limited and by no means secure in those places, and relatively minor in the life of those faraway regions, certainly next to their own local conflicts. Indeed, as eighteenth century historian Jeremy Black argues in Rethinking Military History, even as inter-European clashes in Indian waters and on Indian soil became increasingly regular and significant "the Mughal conquest of the Sultanate of Delhi in the 1520s, the Persian invasion in 1739–40 . . . and that of the Afghans in the 1750s" were and remained for a long time after "more important . . . than European moves." Likewise, Black reports, the rulers of eighteenth century China were much more concerned with the Dzungars than the European powers with which they had already clashed over the Amur Valley and Formosa (both of which China recovered successfully in the seventeenth century). And so forth.
However, as the colonial empires went on getting bigger and more powerful and more secure, and technology enhanced not just their connectivity but their reach (with railroads, steamships, telegraphs), those conflicts grew not only in extent but in the intensity with which they were waged, such that one could think of the war that broke out among them in 1914 as truly a "world war." Involving as it did the Austro-Hungarian, Serbian, Russian, German, French, Belgian and British governments from its first three days on, this automatically embroiled polities controlling over half the world's territory and population. Shortly afterwards the Ottoman Empire, Italy, Japan, the United States and China, among many more, also joined in, with the result that nearly all of Eurasia, Africa, North America and Australasia were involved. Of the major world regions only Latin America was left out to any significant extent, though it should be remembered that in 1917 the proposal of a German alliance with a Mexico in which U.S. troops were intervening (the "Zimmermann Telegram") played a part in drawing the U.S. into World War I, while shortly after the U.S. entry South America's largest country, Brazil, also officially entered the war on the side of the Allies. The result was that nearly all of the world's people lived under governments actively fighting in the war, while the fighting across semi-colonized territories like China before its entry into the conflict, the fighting at sea, the economic dislocations the war entailed, and much else, meant that very little of humanity was untouched by the war in one way or another.

Moreover, that war's consequences were vast, with several empires (not least the Austro-Hungarian, German, Ottoman, Russian) broken up, control of much of the world transferred from one power to another (as with the new empires Britain and France carved out of the old Ottoman territory), and efforts made to set up new arrangements for dealing with the world's problems (like the League of Nations), as Germany's aspiration to world power was defeated (to the point of Germany being shorn of its colonies, its homeland reduced in size, its economy burdened with reparations and its government disarmed), Britain remained (officially, at least) the leading power, and the U.S. became a significant factor in European affairs in unprecedented fashion. World War Two was, if anything, even more global than the first in its participants, and its impact on the world's life--ushering in the U.S.-led world order we have now, with its currency and trading arrangements (the U.S. dollar as the principal currency of international trade in the open system ushered in by the General Agreement on Tariffs and Trade), institutions (the United Nations, the World Bank, the International Monetary Fund, the U.S.-led network of military alliances like NATO) and much else.
But what about the wars since? Certain observers, especially of the neoconservative bent, characterized the Cold War as "World War Three," and the post-Cold War conflicts against terrorists, "rogue states" and other such parties "World War Four." Still, that said more about their eagerness to mobilize the public behind the most extravagant pursuit of those conflicts than it did the reality of those wars. If for the citizens of China, Korea, Indochina and other places the Cold War did not stay cold locally--indeed, millions dying in wars that were localized but as intense and brutal as any that had ever been fought--it was never a matter of full-blown war at the systemic level. And conflicts with terrorist groups, disparate states and the rest were far too diffuse a thing to really be considered a coherent conflict. It was also too far removed from the level of the system--as the great powers were not really opposing each other here--for the idea to stand up to scrutiny.
As a result World War Three remained for most observers the war yet unfought.
Of course, if you have read this little lesson in "IR" up to this point you were probably hoping that I would say something about what to make of the current conflict, rather than just treating the theoretical side of the matter, and I will not disappoint you here. Today's great power list includes, at a minimum, the U.S., Russia and China, with Japan and the major European powers (Germany, France, Britain, collectively, at least), and India, also having claims. The current war may be said to involve nearly all of these actors in the conflict in Ukraine to some degree--with Russia fighting a Ukraine whose backing by the U.S.-led, Europe-including NATO alliance is massive and escalating--while Russia has its partnership with China. Meanwhile the conflict of the U.S. and Japan and India with China is similarly intensifying.
It seems to me that NATO's war with Russia escalating to a significant commitment of forces to Ukraine's side in direct contact with their Russian counterparts would at least put the war into World War III territory. The conflict's "World War III" status would become absolutely unambiguous were the European conflict to become linked with China's conflicts with the U.S., Japan and other regional actors, either through China's support of a Russia fighting NATO, or an outbreak of fighting in East Asia itself, a hardly unprecedented development--world wars, after all, typically becoming world wars not on the first day, but as other actors initially outside them find themselves compelled to participate, and differing conflicts merge together in the process. (Initially Germany in northern Europe, Italy in the Mediterranean and Japan in East Asia pursued their imperial ambitions apart from each other. It was only after the fall of France in 1940 that Italy threw its lot in with Germany, and in 1941 that the German attack on the Soviet Union, Japan's attack on Western possessions in Asia and the western Pacific, and the subsequent round of declarations of war, tied the Asian and European conflicts together into one big conflict between the U.S.-British-Soviet-dominated Allies and the German-Italian-Japanese-led Axis.)
By contrast, in the absence of direct, large-scale hostilities between great powers the issue becomes much more ambiguous. Just how do we read NATO's present, not potential or even likely but actual-at-this-moment, provision of aid to Ukraine? How do we read China's backing of Russia now? That is what determines whether we continue to think of World War Three as a significant possibility--or whether, as Todd argues, the conflict has already begun. With even the most basic facts of the matter incomplete and much disputed, I have to admit that there seems to me room for that kind of argument here. However, there is no argument whatsoever about the fact that, if we are not already seeing World War III, we are closer to it than we have been at any time since the 1980s, and maybe even since 1945.
It seems to me fair to say that a world war means a war which meets two requirements:
1. The war is "systemic" in nature. That means that the principal actors in an international system are belligerents in a conflict with system-level stakes--that virtually all the great powers are fighting with the prize going to the victors the dominance of the system. This means that they have scope to realize major goals to which others might be opposed (like claim a larger sphere of influence), or even make the rules for the system (like how the international economy is to be managed).
2. That system in question is a world-system, rather than a merely regional one.
In considering these requirements take that that favorite example of scholars of International Relations, the Peloponnesian War fought among the ancient Greek city-states in the late fifth century B.C.. To the extent that the alliances led by Athens and Sparta pretty much encompassed that system, while the Persian Empire ruling pretty much the rest of what to the Greeks was the "Known World" eventually came into the conflict, one could call it a systemic conflict. But this clash in a portion of the eastern Mediterranean was a far cry from a planetary-level conflict, as were all the other conflicts of ancient and Medieval times (colossal as some of them, like the wars of Chinese unification and reunification, appear even by today's standards).
The possibility of a system on a world scale, and thus wars on a world scale, only really emerged with the spread of the Western colonial empires from the fifteenth century on. Already in 1478 Spain and Portugal were in a position to fight a sea battle in the distant Gulf of Guinea, and the possibility, and actuality, only grew from there with their conquests in the Indian Ocean and the Americas seeing European battles in those places in the sixteenth century. Indeed, a conflict like the Nine Years' War (1688-1697) saw the participants in a predominantly West European conflict (mainly the French, Dutch, English, Spanish and Holy Roman Empires, fighting over essentially local issues) battling each other as far afield as North and South America, the Caribbean, West Africa and India, and on the basis of the results, dividing and redividing the world.
Of course, to say that these powers fought battles around the world is a different thing from saying that these battles were fought in wars of the world--those distant clashes still typically highly localized episodes between powers whose positions were limited and by no means secure in those places, and relatively minor in the life of those faraway regions, certainly next to their own local conflicts. Indeed, as eighteenth century historian Jeremy Black argues in Rethinking Military History, even as inter-European clashes in Indian waters and on Indian soil became increasingly regular and significant "the Mughal conquest of the Sultanate of Delhi in the 1520s, the Persian invasion in 1739–40 . . . and that of the Afghans in the 1750s" were and remained for a long time after "more important . . . than European moves." Likewise, Black reports, the rulers of eighteenth century China were much more concerned with the Dzungars than the European powers with which they had already clashed over the Amur Valley and Formosa (both of which China recovered successfully in the seventeenth century). And so forth.
However, as the colonial empires went on getting bigger and more powerful and more secure, and technology enhanced not just their connectivity but their reach (with railroads, steamships, telegraphs) those conflicts grew not only in extent but in the intensity with which they were waged, such that one could think of the war that broke out among them in 1914 as truly a "world war." Involving as it did the Austro-Hungarian, Serbian, Russian, German, French, Belgian and British governments from its first three days on, this automatically embroiled polities controlling over half the world's territory and population. Shortly afterwards the Ottoman Empire, Italy, Japan, the United States and China, among many more, also joined in, with the result that nearly all of Eurasia, Africa, North America and Australasia involved. Of the major world regions only Latin America was left out to any significant extent, though it should be remembered that in 1917 the proposal of a German alliance with a Mexico in which U.S. troops were intervening (the "Zimmermann Telegram") played a part in drawing the U.S. into World War I, while shortly after the U.S. entry South America's largest country, Brazil, also officially entered the war on the side of the Allies. The result was that nearly all of the world's people lived under governments actively fighting in the war, while the war across semi-colonized territories like China before its entry into the conflict, and the fighting at sea, the economic dislocations the fighting entailed, and much else, meant that very little of humanity was untouched by the war in one way or another. Moreover, that war's consequences were vast, with several empires (not least the Austro-Hungarian, German, Ottoman, Russian) broken up, control of much of the world transferred from one power to another (as with the new empires Britain and France carved out of the old Ottoman territory), and efforts made to set up new arrangements for dealing with the world's problems (like the League of Nations) as Germany's aspiration to world power was defeated (to the point of Germany being shorn of its colonies, its homeland reduced in size, its economy burdened with reparations and its government disarmed) and Britain remained (officially, at least,)the leading power, while the U.S. became a significant factor in European affairs in unprecedented fashion. World War Two was, if anything, even more global than the first in its participants, and its impact on the world's life--ushering in the U.S.-led world order we have now, with its currency and trading arrangements (the U.S. dollar as the principal currency of international trade in the open system ushered in by the General Agreement on Tariffs and Trade), institutions (the United Nations, the World Bank, the International Monetary Fund, the U.S.-led network of military alliances like NATO) and much else.
But what about the wars since? Certain observers, especially those of a neoconservative bent, characterized the Cold War as "World War Three," and the post-Cold War conflicts against terrorists, "rogue states" and other such parties as "World War Four." Still, that said more about their eagerness to mobilize the public behind the most extravagant pursuit of those conflicts than it did about the reality of those wars. If for the citizens of China, Korea, Indochina and other places the Cold War did not stay cold locally--indeed, millions died in wars that were localized but as intense and brutal as any that had ever been fought--it was never a matter of full-blown war at the systemic level. And the conflicts with terrorist groups, disparate states and the rest were far too diffuse a thing to really be considered a coherent conflict. They were also too far removed from the level of the system--as the great powers were not really opposing each other here--for the idea to stand up to scrutiny.
As a result World War Three remained for most observers the war yet unfought.
Of course, if you have read this little lesson in "IR" up to this point you have probably been hoping that I would say something about what to make of the current conflict rather than just treat the theoretical side of the matter, and I will not disappoint you here. Today's great power list includes, at a minimum, the U.S., Russia and China, with Japan, the major European powers (Germany, France and Britain, collectively at least) and India also having claims. The current war may be said to involve nearly all of these actors to some degree in the conflict in Ukraine--with Russia fighting a Ukraine whose backing by the U.S.-led, Europe-including NATO alliance is massive and escalating, even as Russia has its partnership with China. Meanwhile the conflict of the U.S., Japan and India with China is similarly intensifying.
It seems to me that were NATO's war with Russia to escalate to a significant commitment of NATO forces to Ukraine's side, in direct contact with their Russian counterparts, that would at least put the war into World War III territory. The conflict's "World War III" status would become absolutely unambiguous were the European conflict to become linked with China's conflicts with the U.S., Japan and other regional actors, whether through China's support of a Russia fighting NATO or through an outbreak of fighting in East Asia itself--hardly an unprecedented pattern of development, world wars, after all, typically becoming world wars not on the first day, but as other actors initially outside them find themselves compelled to participate and differing conflicts merge together in the process. (Initially Germany in northern Europe, Italy in the Mediterranean and Japan in East Asia pursued their imperial ambitions apart from each other. It was only after the fall of France in 1940 that Italy threw its lot in with Germany, and in 1941 that the German attack on the Soviet Union, Japan's attack on Western possessions in Asia and the western Pacific, and the subsequent round of declarations of war, tied the Asian and European conflicts together into one big conflict between the U.S.-British-Soviet-dominated Allies and the German-Italian-Japanese-led Axis.)
By contrast, in the absence of direct, large-scale hostilities between great powers the issue becomes much more ambiguous. Just how do we read NATO's present--not potential or even likely, but actual-at-this-moment--provision of aid to Ukraine? How do we read China's backing of Russia now? That is what determines whether we continue to think of World War Three as a significant possibility--or whether, as Todd argues, the conflict has already begun. With even the most basic facts of the matter incomplete and much disputed, I have to admit that there seems to me room for that kind of argument here. However, there is no argument whatsoever about the fact that, if we are not already seeing World War III, we are closer to it than we have been at any time since the 1980s, and maybe even since 1945.
Thursday, February 16, 2023
What The Magnificent Ambersons Can (Also) Teach Us About Technological Change
There is little doubt at this point that the media has quite oversold the progress of automation in our era. (Exemplary is the misrepresentation of the Frey-Osborne study on automation, which caused a panic at the time; commentators, still repeating what the study did not actually say as if it did, now sneer at it as having got things wrong on those grounds.)
Yet there is the other end of the discourse, with its sneering dismissal of automation. This takes different forms--for instance, the blithe dismissal of the very idea of technological unemployment as if it were some logical impossibility by economists who "function like the caricature of the physicist whose every inquiry begins with 'imagine a perfectly spherical cow of uniform density on a frictionless plane'" (with the fact that this enables them to not worry about unemployment, and the remedies for it they find so deeply distasteful, not irrelevant to their prejudices).
However, what has interested me as of late is the way some react to the incomplete or imperfect implementation of automation in places with humans "backing up" the bots--filling in gaps, offering corrections, etc.--or simply the machine taking over parts of the work process as humans see to others. (For instance, one hears of the fast-food chains operating experimental outlets where the customer never deals face to face with a human, but humans are "in the back," preparing the food.)
One can plausibly see these situations as a matter of experimentation and refinement on the way to, at least some of the time, producing a more thoroughly automated process. But these commentators commonly react dismissively, pointing to such cases as evidence of some function being inherently "unautomatable." Frankly, they often do so in a heckling manner that reminds me of the idiots in The Magnificent Ambersons who, whenever they saw a car broken down by the side of the road, oafishly taunted its driver with the yell "Git a hoss!"
Well, those cars broke down less and less often, giving them less opportunity to yell "Git a hoss!" Instead the taunters found themselves having to "Git a car!" themselves. And so it may go here in many a case--with the taunting helping no one at all.
Wednesday, February 15, 2023
What if it's Actually the Other Way Around? What if it's the "Higher-Skill" Jobs That Are Getting Automated First?
The title of this item may seem counterintuitive--not only to those who adhere to the conventional wisdom, but to those who actually know something of the history of technology. The Industrial Revolution, after all, appeared to be about economizing on the use of "muscle" rather than "brain," in part through the mobilization of more brain workers to design, build and operate the machines that replaced the earlier workers--as with, for example, the carding, spinning and weaving machinery of the eighteenth century that revolutionized textile production. However, the appearance is deceptive. The machinery replaced not just the physical effort but the mental effort--the mental skills, the learning--that the human workers put into the carding, the spinning, the weaving.
The latter may well have been the more important, and the trend is all the more evident in this age of computer intelligence. As analysts of automation, down to Carl Benedikt Frey and Michael Osborne, have long observed, "perception and manipulation," "finger" and "manual" dexterity, and the ability to work in "cramped" spaces and "awkward" positions have been significant bottlenecks to automation--while we take utterly for granted computers' capacity to perform vast quantities of complex calculations with a speed and accuracy far, far beyond that of the most able human. The result has been that the armies of mathematicians who served as "human computers" (like those brought to public attention by the film Hidden Figures) have been easily and quietly replaced by electronic computers, in contrast with, for example, the janitors who keep the offices clean, and their less appreciated skills.
So it goes with the more recent wave of automation. The imminence of the self-driving car was oversold in the mid-'10s, to the point that in 2023 few expect to see such vehicles anytime soon. But there is great excitement over the capacity of chatbots like GPT-3 to, among much else, write code--so much so that there is real argument over whether coders will become obsolete in the manner of the human computers (with still more advanced and capable versions of the bot expected before the end of this year). If true, this will mean that, in line with the bottlenecks previously discussed, artificial intelligence will have "learned to code" before it has "learned to drive"--fitting the aforementioned pattern all too well.
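For readers curious what that "writing code" looks like in practice, a minimal sketch follows. It assumes the openai Python package as it existed in early 2023 (the v0.27-era interface), an API key supplied via an OPENAI_API_KEY environment variable, and the gpt-3.5-turbo model--all assumptions for illustration, not a claim about any particular deployment. It simply asks the model for a small function and prints whatever text comes back, which a human still has to read and test.

import os
import openai

# Assumes the early-2023 (v0.27-era) openai package and an OPENAI_API_KEY
# environment variable; purely illustrative.
openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{
        "role": "user",
        "content": "Write a Python function that returns the n-th Fibonacci number.",
    }],
)

# The reply is plain text; any code in it still has to be judged for
# correctness by a person (or another program) before it is used.
print(response["choices"][0]["message"]["content"])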
Granted, in all of these cases this elimination of "skilled" workers only went so far. If human computers were replaced, and coders might be replaced, this depended on persons with other skills conventionally regarded as higher still--the computer scientists who created the electronic computers (while, in the first case, the rocket scientists whom the human computers supported remain very much with us). Likewise it is one thing to replace coders, another to replace "higher level" software engineers.
Yet even allowing for all of this, the reality still complicates the conveniently hierarchical view so many take of these matters, enough that some rethinking of some widespread assumptions seems warranted. One is just what really should be thought of as constituting "high-level" skills--and the way in which our ideas about the intelligence required for particular forms of work reflect social prejudices (for instance, the tendency to denigrate those who have to move and "use their hands" as against persons in more stationary and less manual jobs). Another is the matter of which jobs will continue to be done longest by humans in an automating world--with the answer the exact opposite of the clichés. "Learn to code," the stereotypical, callous elitist sneers at truck drivers fearful they will be put out of a job--but it may well be that the pool of coding jobs will dry up before the pool of truck driving jobs does. Indeed, those really pessimistic about automated driving may expect that not just the coders but the software engineers will increasingly be put out of a job by improvements in AI even as humans go on driving those trucks. And of course, anyone who takes all this at all seriously should think long and hard about what it means for glib talk about sending more people to college and more STEM, STEM, STEM! as the pseudo-thinking person's answer to all the woes of the work force.