In discussing Japan's low fertility rate and aging population the media (certainly in the U.S., though so far as I can tell, elsewhere as well) has inclined to the crude and the simple-minded, the hot-button and the sensationalist--and, as per usual, overlooked important aspects of the matter.
Thus we have stories about Japan as a country of "forty year old virgins," and tales of young people abandoning the prospect of human contact in preference to "virtual" love. But we hear little of what those who look at the business and economics pages know very well--namely, that Japan's economy has been in a bad way for decades, with the most significant consequences for this matter.
After all, in a modern society where economic life is individualistic, setting up a household and having children is very expensive, and little help will be forthcoming from any source, the responsible thing to do is to attempt it only when one has, among much else, a reasonable expectation of a long-term income at least adequate to raise children "decently"--which is to say, at a middle-class standard. The prerequisite for that has been a "good job" offering sufficient security of tenure at a middle-class wage that one can expect to go on receiving it for a good long time to come. And that is exactly what has become very elusive in recent decades, as the economic growth engine that looked so impressive in the 1950s and 1960s stalled out, and as Fordism's vague promises of generalized middle-classness faded. Japan has been a signal case here: the fastest-growing of the large industrialized nations became the slowest-growing nearly overnight; by one calculation per capita Gross Domestic Product fell by half over the past generation; and the old notion of "lifetime employment" waned what can seem a lifetime ago.
Quite naturally those who would have been inclined to marry and have children (and of course, not everyone is so inclined) refrain, with any impression that this is the case reinforced by what we see overseas, not least in Italy and Germany, the two oldest of the major Western nations--alike in some respects, different in others. Italy comes closer than any other Group of Seven nation to Japan in its shift from brisk growth to stagnation (and, even by the more conventional numbers, economic contraction), in another spectacle of a modern country with modern attitudes toward these things seeing its birth rate fall. (Indeed, in every single year from 2013 forward Italy's fertility rate has been lower than Japan's, averaging 1.3 against Japan's 1.4 for 2013-2020, with the 2020 rate 1.2 against Japan's 1.3.)
Of course, Germany may not seem to fit the profile so neatly given its image as an economic success story. However, it is worth noting that, even apart from the qualified nature of its success (Germany remains a manufacturing power, but it is a long way from its "Wirtschaftswunder"-era dynamism), and the fact that its social model is moving in the same direction as everyone else's (with all that means for young people starting their lives), its figures vary significantly by region. In particular Germany's high average age obscures the cleavage between the older (and less prosperous) eastern regions and the more youthful (and more prosperous) western regions.
Alas, a media which makes a curse of the word "millennial," and sneers at the idea of working people wanting any security at all as "entitlement" on their part, has little interest in and less sympathy for such matters--while knowing full well that stories about them are less likely to do well in the "attention economy" than stories about "virtual girlfriends." This bodes poorly not only for our understanding of the matter in the recent past but for our ability to understand it in the future--in which the ability of young people to get along economically in the world may not be the only factor, but will nevertheless be a hugely important one, however much the opinion makers of today would like one to think otherwise.
Friday, September 16, 2022
Thursday, September 8, 2022
Has the Theory of Economic Long Waves Ceased to Be Relevant?
The economic theory of "long waves" holds that economic growth follows a 40-60 year cycle: the first half, an "upward" wave of 20-30 years, is a period of strong growth with recessions few and mild, followed by a "downward" wave that is the opposite, with growth weak and downturns frequent and severe for two to three decades, until it is followed in its turn by a new upward wave beginning the next cycle.
First suggested in the 1920s by the Soviet economist Nikolai Kondratiev (indeed, long waves are often called "Kondratiev waves" in his honor), the idea is controversial in outline and detail (e.g. just what causes the waves), but it nevertheless has numerous, noteworthy adherents across the spectrum of economic theory and ideology who have made considerable use of it in their work, from figures like Joseph Schumpeter on the right to a Michael Roberts on the left. This is, in part, because the theory seemed to be borne out by the events of the twentieth century. In hindsight the period from the turn of the century to World War I looks like an upward wave, and the period of the '20s, '30s and '40s a downward wave, with the big post-war boom of the '50s and '60s another upward wave--followed by yet another downward wave beginning no later than the '70s.
So far, so good--but the years since have been another matter. Assuming a downward wave beginning in the '70s we ought to have expected another upward wave by the '90s, and certainly by the early twenty-first century. Indeed, we might expect to have already run through a whole such wave and, just now, to find ourselves in, entering, or at least approaching another downward wave.
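For concreteness, here is a minimal sketch (in Python) of the phase arithmetic the theory implies. The 40-60 year range and the 1973 turning point come from the discussion here; the use of a single 50-year period is purely an illustrative assumption on my part.

```python
# A toy illustration of the phase arithmetic long-wave theory implies.
# The 40-60 year range and the 1973 turning point come from the text;
# the single 50-year period used here is purely an illustrative assumption.

def expected_phase(year, downturn_start=1973, period=50):
    """Phase the theory would predict for a given year, assuming a downward
    wave began at downturn_start and a full cycle lasts period years
    (half downward, half upward)."""
    offset = (year - downturn_start) % period
    return "downward" if offset < period / 2 else "upward"

for year in (1980, 2000, 2020):
    print(year, expected_phase(year))
# 1980 downward; 2000 and 2020 upward -- the long boom the theory
# predicts, and which, as discussed below, never actually arrived.
```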
As it happens the U.S. did have a boom in the late '90s. However, in contrast with the wide expectation that this boom was the beginning of something lasting and epochal (remember how Clinton was going to pay down the national debt with that exploding tax revenue?), the boom petered out fast--and so did the associated seeds of growth, like labor productivity growth, which pretty much fell into the toilet in the twenty-first century and stayed there. Meanwhile the same years were less than booming for the rest of the world--the Soviet bloc's output collapse only just bottoming out, Europe Eurosclerotic, Japan in its lost decade amid an Asia hard-hit by financial crisis, and the Third World generally struggling with the Commodity Depression, the aftereffects of the Volcker shock and debt crisis, and the new frustrations the decade brought (the "Asian" crisis helping tip Brazil into currency crisis and devaluation).
Of course, as the American boom waned the rest of the world did somewhat better--indeed, depending on which figures one consults, the 2002-2008 period saw some really impressive growth at the global level. But again this was short-lived, cut off by the 2007-2008 financial crisis, from which the world never really recovered before being kicked while it was down by pandemic, recession and war. (The numbers have been lousy however measured, but if one uses the Consumer Price Index rather than chained-dollar-based deflators to adjust the "current" figures for inflation then it seems we saw economic collapse in a large part of the world, partially obscured by China's still doing fairly well--though the Chinese miracle was slowing down too.)
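To make the deflator point concrete, here is a minimal sketch of how the choice of price index changes measured real growth. Every number in it is invented purely for illustration; none is an actual statistic.

```python
# Why the choice of price index matters when turning "current" (nominal)
# output figures into real ones. All numbers below are invented for
# illustration; they are not actual statistics.

nominal = {2007: 100.0, 2019: 130.0}        # hypothetical nominal GDP index

# Two hypothetical price series for the same economy: a CPI rising faster
# than the chained-dollar GDP deflator over the period.
cpi      = {2007: 100.0, 2019: 140.0}
deflator = {2007: 100.0, 2019: 125.0}

def real_growth(nominal_series, price_series, start, end):
    """Cumulative real growth from start to end, deflating nominal values
    by the given price index (base year = start)."""
    real_start = nominal_series[start] / price_series[start]
    real_end   = nominal_series[end] / price_series[end]
    return real_end / real_start - 1.0

print(f"Deflator-adjusted: {real_growth(nominal, deflator, 2007, 2019):+.1%}")  # +4.0%
print(f"CPI-adjusted:      {real_growth(nominal, cpi, 2007, 2019):+.1%}")       # -7.1%
# The same nominal series shows modest growth under one index and outright
# contraction under the other -- the crux of the parenthetical above.
```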
The result is that as of the early 2020s, almost a half century after the downturn (commonly dated to 1973), there simply has been no long boom to speak of. Of course, some analysts remain optimistic, with Swiss financial giant UBS recently suggesting that the latter part of the decade may mean better times, helped by business investment in digital technologies (made so firms could keep operating during the pandemic, but perhaps working out to lasting efficiency gains), public investment in infrastructure and R&D, and a green energy boom. Perhaps. Yet it has seemed to me that there has been more hype than substance in the talk of an automation boom (indeed, business investment seems to have mainly been about short-term crisis management--shoring up supply chains, stocking up on inventory--while firms' success in "digitizing" remains arguable); government action remains a long way from really boom-starting levels (the Inflation Reduction Act, only part of which is devoted to green investment, devotes some $400 billion to such matters over a decade, a comparative drop in the bucket); and while I remain optimistic about the potential of renewable energy there is room for doubt that the investment we actually get in it will be anywhere near enough to make for a long upward movement.
In short, far from finding myself bullish about the prospect of a new long wave, I find myself remembering that the theory was a conclusion drawn from a very small sample (these cycles are not generally traced further back than the late eighteenth century), which, especially after the experience of the last half century, can leave us the more doubtful that there was ever much to the theory to begin with. However, I also find myself considering another possibility: that for that period of history such a cycle really was operative--and that it has since been broken, perhaps permanently, along with the predictive value it once seemed to possess.
Tuesday, September 6, 2022
The Vision of Japan as the Future: A Reflection
Back in the '80s it was common for Americans to think of Japan as "the future"--the country on the leading edge of technological development and business practice, the industrial world-beater that was emerging as pace-setter, model and maybe even hegemon.
A few years later all that faded. Japan's economic boom was revealed as substantially a real estate-and-stock bubble that had been as much a function of American weakness as Japanese strength (as America's exploding Reagan-era trade deficit filled the country's bank vaults with dollars, and American devaluation translated into a massive strengthening of the yen); Japan's supremacy in areas like chip-making proved fragile, and its prospects for leaping ahead of the rest of the world in others (as through breakthroughs in fifth-generation computing, or room-temperature superconductors) proved illusory; and the country went from being the fastest-growing to the most stagnant of the major industrial economies. The fading was reinforced by the illusions and delusions of America's late '90s tech boom, which shifted America's dialogue from hand-wringing over decline to "irrational exuberance" (at least for a short but critical period after the debut of Windows 95).
Yet in hindsight it can seem that Japan never really did stop being the image of the future. It was just that observers of conventional mind failed to recognize it, because the future was not what people thought it was at the time. They pictured technological dynamism and economic boom--but the future, since that time, has really been technological and economic stagnation, with Japan's "lost decade" turned "lost decades" turned "lost generation" matched by the world's own "lost generation" these past many years. And the same goes for that stagnation's effects, like social withdrawal--Americans, certainly, seeming to notice the phenomenon of the "hikikomori" in Japan long before they noticed it at home.
Thus has it also gone with Japan's demography--the country's people less often marrying and having children, such that even by the standards of a world on the whole going through the "demographic transition" the country's situation has been extreme. According to the Central Intelligence Agency's World Factbook, tiny and ultra-rich Monaco apart, Japan is the oldest country on Earth, with a median age of almost 49 years and only 12 percent of its population aged 0-14. Still, others are not so far behind. According to the very same source dozens of other countries, including every major advanced industrial country but the U.S., have a median age of over forty (and the U.S. is not far behind, with a median age of 39), and similarly dwindling numbers of youth (the 0-14 share likewise 12 percent in South Korea, 13 percent in Italy and 14 percent in Germany, with 15 percent the Euro area average).
Considering the last it seems fitting that the trend was already evident at the very peak of that preoccupation with Japan as industrial paragon: 1989, the year of the "1.57 shock" (when the country recorded a Total Fertility Rate of 1.57--at the time regarded as a shockingly low number, though the government would probably be ecstatic if it were that high today). The result is that those interested in the difficulties of an aging society are looking at Japan, wondering how it will deal with those difficulties as they manifest there first--with what the country does here likely to inform others' thinking about how to cope with contemporary reality as much as it did back when business "experts" seemed transfixed by "Japan Inc." as the epitome of industrial competence.
Thursday, June 30, 2022
A Generation On: Clifford Stoll's 1995 Essay on the Internet
Clifford Stoll's 1995 Newsweek piece "Why the Web Won't Be Nirvana" has been the butt of many a joke over the years, but not because of its title. Had Stoll limited himself to merely arguing what his title claims we might well remember him as having been clearer-eyed than his contemporaries. Had he somewhat more ambitiously argued that the advent of this technology would not in itself deliver nirvana, or even anything like utopia, we might have accorded him yet greater plaudits. And had he, in more nuanced fashion, argued that some of the much-hyped developments might not come for a long time, if ever, he would also have been right, as we are all too aware looking at exactly some of those things of which he was so dismissive, like telecommuting, or the substitution of online for in-person education, or some radical advance for democracy.
However, he was dismissive of the whole thing, not only in the very near term but, it could seem, in any time frame meaningful to people of the 1990s, and on very particular grounds that seem to me more telling than the prediction itself. While paying some heed to the limits of the technology as it stood at the time (noting the sheer messiness of what was online, or the awkwardness of reading a CD-ROM on a '90s-era desktop), he did not rest his case on the limits of the technology as it was then, and as it was likely to remain for some time, even though he could very easily have done so. (What we are in 2022 vaguely talking about as the "Metaverse" was, at the time, widely portrayed as imminent amid the then-insane hyping of Virtual Reality--while what we really had was pay-by-the-hour dial-up, in a time in which Amazon had scarcely been founded and Google, Facebook and Netflix were far from realization.) Nor did Stoll acknowledge the hard facts of economics, politics and power that would, a generation on, see even those bosses who have made the biggest fortunes in the history of the world out of technological hype broadcast to the whole world their extreme hostility to the very idea of telecommuting, or make the Internet a weapon in the arsenal of Authority against Dissent as much as or more than the reverse. (That was not the kind of thing one was likely to get in Newsweek then any more than now.)
Rather, what Stoll based his argument on was the need for "human contact," which he was sure the Internet would fail to provide. The result was that where his predictions were correct he was far off the mark as to the reasons why (those matters of economics, politics, power), and totally wrong about other points, like his dismissal of online retail and the possibility that it might threaten the business of brick-and-mortar stores, or the viability of online publishing. The truth is that when it comes to mundane tasks like buying cornflakes and underwear, convenience and cheapness count for infinitely more than "human contact" with the hassled, time- and cash-strapped great majority of us--while where the performance of such tasks is concerned human contact is, to put it mildly, overrated. Indeed, it is often a thing many, not all of them introverts, would take some trouble to avoid. (Do you really love encountering pushy salespersons? Long checkout lines where you encounter more rude people? Sales clerks of highly variable competence and personability? For any and all of whom dealing with you may not exactly be the highlight of their own day, one might add?) Indeed, looking at a college classroom in recent years one sees two of his predictions belied at once: while Stoll may indeed be right that "[a] network chat line is a limp substitute for meeting friends over coffee," the average college student much prefers that "limp substitute" to chatting with their neighbors, let alone attending to the instructor right there in the room with them, whom large numbers of them happily replace with an online equivalent whenever this becomes practical.
Thus does it go with other "entertainments." Stoll may well be right that "no interactive multimedia display comes close to the excitement of a live concert," but how often do most people get to go to those? In the meantime the multimedia display has something to commend it against the other substitutes (like the Walkman of Stoll's day). And this is even more the case with his remark that no one would "prefer cybersex to the real thing." After all, the "real thing" isn't so easy for many to come by (even when they aren't coping with pandemic and economic collapse), while even for those for whom it might be an option it seems that not merely cybersex but "love with the virtual" is competitive enough with the real kind to make many a social critic wag their tongue (with, I suspect, what is treated as a Japanese phenomenon today, like the "hikikomori," likely to prove far from unique to that country in the years ahead).
Far more than Stoll, Edward Castronova and Jane McGonigal seem to have been on the right track when writing about how poorly our workaday world comes off next to the freedoms, stimulation and satisfaction of virtuality, especially when we consider what that reality is like not for the elite who generally get to make a living offering their opinions in public, but for the vast majority of the population on the lower rungs of the social hierarchy, facing a deeply unequal, sneering world which every day and in every possible way tells them "I don't care about your problems." Indeed, while a certain sort of person will smugly dismiss any remark about how the world is changing with a brazen a priori confidence that things are always pretty much the same, it seems far from implausible that things are getting worse that way (it's hard to argue with a thing like falling life expectancy!), while there is reason to think that the virtual is only getting more alluring, with people actually wanting it more, not less, as it becomes more familiar to them--a familiar friend rather than something they know only from the Luddite nightmares of so much bottom-feeding sci-fi. In fact, it does not seem too extreme to suspect that many already have as little to do with the real, offline world as they can--dealing with it only out of unavoidable physical necessity, and on terms that do nothing to make it more attractive, which only underlines the superiority of the virtual in life as they have lived it.
Wednesday, June 29, 2022
The Pandemic and Automation: Where Are We Now?
In the years preceding the pandemic there was enormous hype about automation, particularly in the wake of the 2013 Frey-Osborne study The Future of Employment. Following the pandemic the drive to automate was supposedly in overdrive.
However, a close reading of even the few items about the matter that serve up actual examples of such automation (like the installation of a voice recognition-equipped system for taking customers' orders as they pass through some fast food outlet's drive-thru lane) reveals that they are clearer on intentions and possibilities than on actual developments--like polls telling us that "50% of employers are expecting to accelerate the automation of some roles in their companies" (never mind how much employment those employers account for, how serious their expectations are, or what "accelerate" and "some" actually mean here). Meanwhile, when we look at discussion of actualities rather than possibilities, what is rather more prominent is the discontent of the humans. We read of how workers in jobs that have them dealing with the general public face-to-face are burned-out and fed-up, not of how bosses are replacing those workers with new devices--a decade after robot waiters and the like were supposed to be on the verge of becoming commonplace. We read that industrial robot orders are up--but (as, perhaps, we note that the actual number of robots ordered is not really so staggering) we read far more of supposed "labor shortages" than of automation filling the gaps. We also know that, as seen from the standpoint of the whole economy, productive investment--without which no one is automating anything--remains depressed compared with what it was pre-crisis (and remember, the world never really got over the Great Recession), while it does not seem terribly likely to get much better very soon, with that media darling and icon of techno-hype Elon Musk, even as he promises a humanoid Teslabot by September, publicly raving about a recession just around the corner and preemptively slashing his work force in anticipation--not in the expectation of employing fewer humans, just fewer humans with actual salaries (those Teslabots do not seem to be part of the story, go figure).
Why do we see such a disparity between the expectations and the reality? A major reason, I think, is that those who vaguely anticipated some colossal rush to automate the economy imagined vast numbers of ready, or nearly ready, systems able to do the job--a natural result of the press's tendency to treat "innovations" at Technology Readiness Level 1 as if they were at Level 9, with the truth coming out when push comes to shove, as it so clearly has amid the crisis: the requisite means are not nearly so far along as some would have had us believe. Those observers also underestimated, just as government and the media have generally done, just how disruptive the pandemic would be--how long the pandemic and its direct disruptions would last, to say nothing of the indirect ones, and how much all this would matter. In line with the familiar prejudices of the media, lockdowns, strikes and the war in Ukraine get plenty of time and space as sources of economic troubles--but critics of central bank monetary policies get very, very little, with one result that the upsurge in inflation took so many "experts" by surprise. And that inflation, and the tightening of credit that has inevitably followed it, however belated and gradual compared with the talk of a latter-day Volcker shock it may be, are hardly the kind of thing that encourages investors. Nor are the unceasing supply chain problems. (If the country can't even keep itself in baby formula, how can it keep itself in the inputs required for a drastic expansion of revolutionary automation?) The result is that those of us watching this scene would do well to take reports of some drastic increase in the rate of automation with a measure of skepticism.
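For reference, the scale invoked above is NASA's nine-level Technology Readiness Level scheme, paraphrased here from the standard definitions; the little lookup below is just one way of making the gap between Level 1 and Level 9 concrete.

```python
# NASA's Technology Readiness Level scale (paraphrased), plus a crude
# helper to make the press's "Level 1 reported as Level 9" habit concrete.

TRL = {
    1: "basic principles observed and reported",
    2: "technology concept and/or application formulated",
    3: "proof of concept demonstrated analytically or experimentally",
    4: "component validated in a laboratory environment",
    5: "component validated in a relevant environment",
    6: "system/subsystem prototype demonstrated in a relevant environment",
    7: "system prototype demonstrated in an operational environment",
    8: "actual system completed and qualified through test and demonstration",
    9: "actual system proven through successful operations",
}

def readiness_gap(reported: int, actual: int) -> str:
    """Contrast where a technology is reported to stand with where it stands."""
    return (f"reported as if at TRL {reported} ({TRL[reported]}), "
            f"actually at TRL {actual} ({TRL[actual]})")

print(readiness_gap(9, 1))
```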
Tuesday, June 28, 2022
Telecommuting: What We Imagined Once, and Where We Actually Are Now
In his 1980 classic of futurology The Third Wave Alvin Toffler remarked that an era of general "transportation crisis" was upon the world, in which "mass transit systems [were] strained to the breaking point, roads and highways clogged, parking spaces rare, pollution a serious problem, strikes and breakdowns almost routine, and costs skyrocketing." Toffler noted, too, the savings that a turn from in-person commuting to telecommuting might achieve, from the lower energy expenditure of a computer terminal as against a private car (or even mass transit), to the chance to scale down physical facilities as people worked from home, permitting reductions in real estate, utility, tax and other expenditures. Indeed, it seemed to him that sheer market forces would go a long way toward doing the trick, while pressure from environmentalists to bring down the ecological footprint of business activity, and government's seeing in a shift to telecommuting potential benefits in areas from the oil import-beleaguered trade balance to the collapsing nuclear family, would easily nudge it further along.
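A minimal back-of-envelope sketch of Toffler's terminal-versus-car comparison, with every figure an invented placeholder rather than a measured value:

```python
# Back-of-envelope version of Toffler's terminal-versus-car comparison.
# Every figure below is an invented placeholder, not a measured value.

KWH_PER_CAR_MILE = 1.0   # hypothetical energy cost of driving one mile
COMMUTE_MILES    = 30.0  # hypothetical daily round trip
TERMINAL_KW      = 0.2   # hypothetical draw of a home terminal and modem
WORKDAY_HOURS    = 8.0

commute_kwh  = KWH_PER_CAR_MILE * COMMUTE_MILES
terminal_kwh = TERMINAL_KW * WORKDAY_HOURS

print(f"daily commute energy:  {commute_kwh:.1f} kWh")
print(f"daily terminal energy: {terminal_kwh:.1f} kWh")
print(f"ratio: {commute_kwh / terminal_kwh:.0f}x")
# With these placeholder figures the commute costs roughly 19 times the
# energy of the terminal -- the shape, if not the substance, of his claim.
```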
Of course, four decades later it would seem that Toffler could not have been more wrong on this point--with his being so wrong the more striking given that the transportation crisis, the energy crisis, he wrote of did not get better, but only worse and worse; while the proliferation of computing and Internet connections, and the improvements in their performance and price, went far, far beyond anything Toffler anticipated in the near term. The reason is that, whatever those concerned for public issues like the environment, the trade balance or anything else thought about the matter--and certainly whatever employees thought--employers didn't want it, as anyone who understood economic history had to see they wouldn't, because of the simple matter of control. (The Industrial Revolution's shift from the "putting-out" system to the factory, Taylorist time-and-motion study, Fordism--through it all the key to higher productivity and profit has lain through ever-more intricate division and direction of labor conveniently gathered together under management's eye.)
Ultimately that opposition mattered more than anything else. It was only the shock of the recent pandemic that put the matter as high on the agenda as it has been these last two years, with the scrambling to implement telecommuting in the wake of lockdown only underlining how little serious effort government and business had made in all those decades toward figuring out how to actually make it work at scale. And now, with government and business discarding such measures as they took to constrain the pandemic, there is a drive on the part of employers to go back to the old ways--with, ironically, those very business leaders most apt to be celebrated as technological superheroes by the makers of the conventional wisdom frequently in the vanguard--all of them determined to see that the experiment in telecommuting remains just that, with, of course, the full blessing of government and the media, as outlets like The Atlantic publish articles with obnoxiously wheedling titles like "Admit It, You Miss Your Commute."
Of course, a good many workers have displayed less enthusiasm for this course than their bosses--fully two-thirds of remote workers declaring that they do not want to return to the office, with shedding the trip to and from the workplace that so many experience as the worst part of their day specifically cited among the biggest benefits of staying home. (No, contrary to what the folks at The Atlantic would have the impressionable believe, they did not miss their commute.) Meanwhile, if the strikes shocking elites the world over are anything to go by (even Britain, homeland of that icon of union-crushing Margaret Thatcher, is seeing the kind of strikes that the Thomas Friedmans of the world complacently thought safely relegated to the historic past), workers--shocked, stressed, burned-out and all-around less complacent after the pandemic--are asserting themselves in a way they haven't in decades. The result is that while for over a generation employers have seemed to hold all the cards, and never let anyone forget it, one ought not to rush to the conclusion that they will get their way yet again.
Wednesday, June 22, 2022
On the Word "Startup"
When people speak of "startups" what they mean is that someone is "starting up" a business.
But in choosing to speak not of new businesses, or new companies, but of "startups," they make a far grander claim than the establishment of a new enterprise that will, hopefully, produce something of use to others, a living for its employees and a return for its investors.
Instead they promise that here is a technological and commercial revolution!
And I have to admit myself ever more cynical about that heavily used word and its users.
Because, alongside the success stories, there are the innumerable also-rans, many of which never had a reason to be anything but an also-ran. Because so often we have a startup without an actual product or business plan. Because, even when the technology isn't an overhyped fantasy (as it so often is), the reality is that we tend to end up with a single big winner (a Microsoft, an Amazon, a Google, a Facebook) gobbling up the market, driving the rest out of business or swallowing them up for comparative pennies. Founders may fantasize that their little firm is the one that will be that winner--but all but the one who actually has the winner are wrong (while even the one who is right can hardly know it at the outset), making what the stupid may call "optimism" or "confidence" really just presumption on a staggering scale. And because so often in back of what is necessarily a misrepresentation are not just illusions and delusions and stupidity, but a good deal of bad faith, and outright criminality. (With Theranos Elizabeth Holmes hoped to be the next Steve Jobs. Instead, if Theranos is remembered at all it will be remembered alongside Enron--about which Holmes ought to know something, her father, whose influence she traded on all her ultra-privileged life, having been a Vice-President there.)
And as if all that were not enough I am annoyed, too, by the broader understanding of the world tied up with the word--the muddy thinking and outright lies. About, for example, the actual rate of technological change, which has been wildly exaggerated by so-called "experts" throughout my lifetime to a credulous public and credulous investors, who have so often bet big and lost big as a result. About the notion that technological change does not involve Big Science, established firms, government labs and government subsidies; does not involve vast, diffuse global efforts over very long periods of time; but is just nerd-magic done overnight by garage tinkerers aided by venture capitalists. And of course, there is what all this justifies and glorifies--and where it has left us: a 2022 far, far different, and far, far sadder, than the world we would have got had the Silicon Valley hucksters come anywhere near to delivering on the promises of what their "startups" would bring.
What Barry Levinson's Envy Has to Teach Us About Technology
Back in the mid-'00s the "Frat Pack" were the kings of Hollywood comedy, with hits like Old School (the film that gave them their name), Dodgeball, Anchorman and Wedding Crashers. However, alongside these hits there were quite a few less successful films, among them the Barry Levinson-directed Envy, which costarred Ben Stiller and Jack Black.
The film's story revolves around the Jack Black character's making a fortune with a (let us be decorous here) spray that simplifies the task of cleaning up after one's pet. Early in the film, before Black's character got rich, he endlessly talked about his "inventions"--which did not actually exist, even in blueprint form, just as that spray did not when he was talking about it. As Stiller's character explained to him, what he was describing was "not an invention" but rather "an idea."
It's a short, simple exchange--but not a meaningless one, the distinction Stiller's character makes being quite important. An idea has not necessarily been proven--one reason why one cannot patent an idea, only an invention, which is a far more developed thing. Alas, tech journalism too little recognizes the distinction, telling us about technologies that may not have moved past the idea stage as if they were already realities, and doing even worse with technologies only scarcely more advanced. To cite but one example: not long ago a scientist put forth a concept for an unprecedentedly swift space vehicle, which was all well and good. What many of those journalists failed to understand, or at least properly acknowledge, was that the whole thing was premised on the craft's being equipped with a TOKAMAK FUSION REACTOR--a technology at best far from being developed to the point where we could fit one delivering the required performance into such a spacecraft. But they wrote about the thing as if this fusion-powered spacecraft were already being trucked out to the launch pad for its first flight.
Of course, that does not obviate the importance of ideas. The process of invention has to start somewhere, after all--and considering the matter I find myself recalling how the inventor Hugo Gernsback so believed in the value of ideas in themselves that he saw in their airing one of the great uses of science fiction when he launched the first science fiction magazine, Amazing Stories, back in 1926. But all the same, the distinction is an important one, especially when hopes and money start getting involved. And nearly a century after Gernsback made his case it seems to me that we are far less short on ideas for solutions to the world's problems than on the actual inventions that will do the job.
The Pseudo-Environmentalism of Useful Idiots
For a very long time mainstream commentators on the matter of climate change, even when their own inclinations have been toward ecological concern, have indulged the pretension that there was actually a "debate" among climate scientists over whether anthropogenic climate change was a reality, and whether it was severe enough to demand redress. All of this was false, the appearance of controversy having been manufactured by opponents of action--often by interests that were fully aware the problem was real long before the issue came to public attention.
Those interests continue to do this today--and commentators continue to respectfully call their denial of scientific facts "skepticism"--but less often and less credibly than before, such that, as Michael Mann has remarked, those opposing discussion and redress of the issue are playing a more complex game. Two aspects of that game seem to me to merit particular attention. One is the idea that we should emphasize the consumption choices of individuals (personal "carbon footprint") rather than large-scale action and the programs necessary to bring it about (like the portfolios of utility companies). The other is that the situation has got so bad that there is not much to be done about it now--making calls for action pointless.
Making this issue a matter of individual choices--individual choices on the part of people who mostly can't even pay their bills!--is a strategy that can't work, and was never intended to work. Indeed, assigning all the responsibility to those who have the least power (for a start, consumers can only buy what they are offered, within the slight means most of them possess)--railing at them for their alleged crimes while ignoring far, far worse on the part of the powerful, before whom the scolds bow and scrape (by all means, don't say a word about what the CEOs of Big Coal, Oil and Gas are doing, but scream at working people for eating a burger)--was intended to make working people think of talk about climate change as an attack on their own meager living standards, already battered by decades of economic stagnation, austerity and falling incomes. Defeatism is worse still, turning people's attention away from solving the problem altogether--whether through the pernicious sniveling about "grief" (to which the sort who take a Jonathan Franzen seriously so readily incline); through fantasies of "adaptation" (as with the notion that anyone who lives anywhere near a body of water or the tropics can simply relocate somewhere more comfortable, with the ease of a billionaire deciding which home they'd like to fly to on their private jet today); or through tuning the issue out entirely, as anyone with the cranial capacity to comprehend how depressing a world where the defeat has already happened must be will be tempted to do (if only for the sake of their own psychological survival).
In both cases the promulgators of such strategies can say "Mission accomplished!"--again, with the help of people who may have had perfectly good intentions but were simply too unsophisticated to avoid becoming "useful idiots" to those pushing the very agenda they think they are fighting.
If one is at all serious about writing about an issue like climate change, a little self-awareness and a little knowledge are called for. And if I may make a suggestion--if you speak or write about these matters you should take a good look at yourself, and think about whether you are making any positive contribution at all. If you find that you don't make the cut, either take a break and study up, or get out of the way of those who can do a better job.
Friday, June 17, 2022
Revisiting George and Meredith Friedman's The Future of War in 2022
When George and Meredith Friedman's The Future of War: Power, Technology and American World Dominance in the 21st Century appeared in 1996 the book, while certainly getting its fair share of attention, did so in spite of being distinctly unfashionable. After all, the authors' concern in the book was traditional, interstate, great power war--at a moment when U.S. strength relative to any plausible opponent, and the expectation that globalization and other changes would soon make the nation-state and its traditional security agenda less relevant to the life of the world ("We no longer fight wars against countries, we fight them against individuals!" many a fashionable "analyst" enthused), made that form of conflict seem not only remote, but likely to become only more so over time.
Of course, even at the time many were skeptical of this "conventional wisdom"--for various reasons. The Marxist left, for example, with its stress on capitalism's contradictions and its theories of imperialism (Luxemburg, Bukharin, Lenin), never bought into all this. The Friedmans, however, came from the opposite end of the political spectrum--and stood on a rather doctrinaire, "life never changes" insistence on the continuing validity of old-fashioned International Relations 101 "billiard ball"-model-of-the-international-system realpolitik. In doing so the Friedmans did not venture guesses as to who (the U.S. apart) would be the belligerents in the armed conflicts this realpolitik-based vision of international relations anticipated as virtually certain, or the specific objects for which they could be expected to fight, or any of the other things that might be concluded from such premises, like when and where such wars could be expected to break out--perhaps chastened by a colossal recent error of their own in this line. (Just a few years earlier, at the peak of American Japanophobia, George and Meredith had warned darkly of The Coming War With Japan.) Instead what the Friedmans emphasized was the long-term evolution of the means for fighting such wars--specifically the shift from massed explosive power and the land, sea and air-based platforms that deliver it (tanks, gun-armed vessels, planes) to precision-guided munitions, highly technologized "Starship Troopers"-style infantry, and eventually space platforms ("battlestars"), making senile and eventually obsolete the systems that dominated the twentieth-century battlefield.
As yet we remain far from possessing the technologies central to the Friedmans' vision. And certainly George's more specific political predictions have since been unimpressive. (In his book The Next 100 Years he forecast that China would collapse in the 2010s, and Russia not long after--this without anything like the stress of the war the country is now fighting, which he also failed to predict, anticipating instead a Paris-Berlin-Moscow axis.) However, if there is much that he has clearly got wrong already, he seems to have been sounder than his more fashionable colleagues in realizing that the illusions of the '90s were only that, illusions--a fact whose fuller implications many are only now beginning to grasp.