Over the years I have had many an occasion to consider the differences in the political discourses of the U.S. and Britain. One of the most significant has been the greater attentiveness of British news-watchers to economic and socioeconomic issues than that of their American counterparts--on average they are more interested, better informed, and more willing and able to discuss such matters substantively.
The difference exists in spite of a great many similarities. Britain, too, has its "centrism," which in a prior day undertook reform, but from an essentially conservative position with essentially conservative ends--fearful and resentful of the left while conciliatory to the right, culminating in a rightward shift over time and the withering of discussion, debate and electoral options regarding these matters (witness the trajectory of the Labour Party). Britain, too, has its "culture wars," with both sides in them quite deliberately deploying the politics of identity to divert the public from more material issues. Britain, too, has its "pundits" telling the public that social class and class differences are nonexistent or irrelevant or minor, that any such thing as an Establishment is a leftist fantasy, that "neoliberalism," too, is "not a thing." All of this reflects the reality that in Britain, too, the media ill serves the public where these matters are concerned (and just about every other matter, too).
Of course, one may also argue that if all this is the case in Britain it is less thoroughly so than in the U.S., that centrism has not had such a tight grip on the more leftward of its parties (whom would one call the American Aneurin Bevan?), that in Britain the culture wars have not quite gone so far in blocking other issues out of the public consciousness, that much that is conventional wisdom in America is not quite that in Britain, in part because for all its failings the country's media does offer somewhat more leeway for the sorts of marginalized views and voices attentive to such things. Still, as the trajectory of British discourse and policy in recent years makes clear (and as I imagine British leftists in particular would hasten to point out), one should not make too much of these qualifications. Rather, three things seem to me to have made the difference:
1. Britain's governance and public policy formation in these areas has so often been so much more centralized than in the U.S. Many an issue decided at the local or state level in America (local tax rates, much to do with education and health services, etc.), which people in various parts of the country decide in different ways, from different starting points and with different emotional charges, is in Britain more completely a matter of national policy (with Thatcherism, in fact, strongly identified with centralization for the sake of implementing its program more fully). The result is that such issues are objects of national rather than local debate, with all the additional attention this brings them.
2. Britain's policy in the relevant areas has shifted much further left and much further right over the years than it did in the U.S. This is not to deny that in British life the "center" of the political spectrum may even now be regarded as lying leftward of the American center. However, the public sector, the welfare state and organized labor were all built up to far greater heights in Britain than was ever seriously considered in the U.S. And whatever their limitations in practice (and these should not be overlooked), Britain had free college and free-at-point-of-service health care, while the U.S. never came close to anything like that. Meanwhile, circa 1980 three of every five British workers were union members, as compared with just one in five American workers, while British unions were unencumbered by anything remotely like Taft-Hartley or the top-heavy bureaucracy of American unions, and were vastly more politically conscious and militant. Naturally the dismantling of all that, even if it still left Britain with more apparatus of this kind than the U.S. ever had, entailed a far bigger and more wrenching shift in economic and social life than what the U.S. has seen in the same period, with that much more controversy--reflected in the reality that, while Ronald Reagan has been a divisive figure in American history, in the American mainstream one simply does not see the expression of the kind of bitterness toward him that one sees toward Thatcher in Britain. All of this, again, makes it harder to slight the topic.
3. In Britain the reality of social class has simply been too ostentatious for too long to ignore. Where in the U.S. the national mythology is overwhelmingly devoted to the idea of the self-made individual (to the point that children of extreme socioeconomic privilege constantly pass themselves off as such, with a lickspittle press always enabling them), a monarchy remains the center of British public life, the legislature's upper house remains a House of Lords, and even for the native-born, old-stock Briton accent is scarcely less a marker of class origin than it was when George Bernard Shaw wrote Pygmalion. Even the difference in the usage of the term "middle class" appears telling. The attempts by British leaders to make the kind of vague use of the term that Americans do--to portray their nation, rich and poor alike, as somehow a middle class nation--seem to have had rather less success than across the Pond. Rather, in British usage the word still seems to have a more exclusive (and meaningful) sense, with middle classness having a higher floor--and in the view of most a distinct ceiling too, many scoffing at David Cameron's calling himself "middle class."
Together these three factors--that centralization of political life, the extremity of the distance policy moved along the political spectrum across the past century, and the greater difficulty of dismissing the role of class--make it far, far harder to avoid acknowledging the herd of thoroughly material elephants in the room that make so much difference in the large and the small of individual lives.
Thursday, May 19, 2022
Wednesday, May 18, 2022
The Necessity of Political Language
It would seem reasonably uncontroversial that our ability to discuss a subject in a clear and rigorous way depends on our having a vocabulary adequate to permit that, not least by furnishing us with a satisfactorily extensive lexicon of terms with precise, coherent meanings for reference. However, many seem to see "political language" as serving no purpose whatsoever, and indeed as something to be attacked on sight.
Why is this the case? Let us first of all admit that a good deal of language is, from the outset, muddled at the level of conception, with thinkers coining words and phrases that ultimately fail to illuminate, or which they may even deliberately intend to obscure, to confuse, to manipulate. While we are at it, let us also get out of the way the reality that terms which do not originate that way can end up being used that way. (Alan Sokal, you may remember, got pretty far stringing together favored buzzwords of people in certain categories of study in the humanities and social sciences in totally nonsensical fashion for the sake of exposing the game. Hermeneutics of Quantum Gravity indeed! It's the sort of phrase that a stupid person might come up with when trying to imagine how the intelligent speak.)
Yet all this is very far from the whole issue, and to focus on it can be misleading. After all, these are problems not just with political language but with all language, despite which we generally manage to get along. But many play them up where politics is concerned for the most dubious of reasons. One is laziness, of course. Dismissal of the value of a body of knowledge is easier than actually acquiring it before we make our judgments. ("We'll never need math anyway," says many a frustrated student. Or grammar. Or spelling. Or anything else for which there seems to be an app. They are likely to find out otherwise.) Even those pursuing a bit of intellectual distinction can take the lazy route themselves, being all too familiar with the sorts of cheap maneuvers that impress the simple-minded--for example, dismissing existing categories of thought as if they have somehow transcended them, saying "Look at me and how much smarter I am than everyone else because I have seen through their bugbears! Look at me thinking outside the box!"--when in reality they have not done any thinking at all, when they have not even looked at the box or its contents, let alone managed to think outside them (and likely not even realized that "thinking outside the box" has degenerated into yet another atrocious corporate cliché whose utterance strongly suggests that the speaker is not doing what they presume to be doing).
However, other factors seem to me more consequential, not least the prevalence of a postmodernist intellectual orthodoxy that takes a very dim view of the capacities of human reason. This skepticism goes only so far in the realm of the physical sciences--one can't argue (very much) with the accomplishments of physicists, for example. But its adherents have more scope in the social realm, where one not only gets away with, but is lauded as displaying the greatest profundity for, doing what would be the physics equivalent of saying "All that matter and motion and energy stuff is probably incomprehensible, and looking for patterns in it or explanations for it is pointless and pernicious. Best not do it and stick to as superficial a reading of things as we can."
This orthodoxy extends to a dim view of human language's descriptive powers, which has its adherents dismissing all language as "language games" and words--especially words with heavy political meanings--as therefore meaningless, often in a tone suggestive of this being indisputably obvious to all. That even amid all the superficiality, misuse and abuse of terms, words like "conservative," "liberal," "socialist," "capitalist" (or "neoconservative," "neoliberal," "fascist," "centrist") can still denote real, meaningful and important concepts or sets of concepts, and that even the shallower, diverging or outright erroneous usages may reflect and evoke and reinforce those patterns in telling and important ways, is something they steadfastly refuse to acknowledge--not least because these cheap and shabby evasions may serve their purposes, which lie not in furthering a conversation, but in undermining a conversation they do not want to see happen at all.
Indeed, blanket attacks on political language are often a shabby cover for attacks on someone else's political language, for the sake of promoting one's own no less political nomenclature. I recall, for example, running into a recent discussion of centrism on Twitter. When the Tweeter in question raised a particular analysis of where centrism sits within the political spectrum, one of the first people to respond sneeringly dismissed the left-to-right political spectrum as meaningless and incoherent and declared that people should think in terms of statism and anti-statism. What he suggested, of course, was our narrowing the political dialogue to a single issue along a single axis (most people at least let us have two!), apparently divorced from any intellectual premises whatsoever, in an all too blatant attempt to shut down the conversation that Tweeter was trying to start and to have one more congenial to him. As the person in question was a professional renewable-energy basher with endorsements for his book from certain high-profile figures, his biases here were all too obvious. Simply put, he wanted to make it a matter of "libertarians" as the champions of "freedom" (he didn't define either term, of course, but anyone can guess what he had in mind) against everyone else.
Whatever one makes of his views, his unfortunate intervention (which smacked of that online discourse policing--that heckling--that I disdain regardless of who does it--and which can only seem at odds with any pretension to respect for freedom of speech) only underlined how we never get away from our reliance on a political vocabulary, and that rather than dismissing it in that lazy and pretentious way as so many do, what we ought to do is work to make that vocabulary useful.
Tuesday, May 17, 2022
In the 2020s the 1920s Have Become the 1930s
I remember how, back about the turn of the century, Thomas P.M. Barnett emerged as a national security counterpart to Thomas Friedman--one who could be characterized as devoting himself to explaining just how, exactly, "McDonnell Douglas" would back up "McDonald's."
Barnett's conformity to the globalization-singing conventional wisdom of the day made him the sort of fashionable Public Intellectual who got written up in places like Esquire (which magazine's "foreign-policy guru" he subsequently became).
I was rather less impressed than the folks at Esquire. Still, Barnett was astute enough to acknowledge that the whole thing could unravel, citing Admiral William Flanagan in his book Blueprint for Action about the possibility that "the 1990s might be a replay of the 1920s," raising "the question . . . What would it take for the 2000s to turn as sour as the 1930s?"
The analogy seems to me fair enough. Like the 1920s, the 1990s were a period after the end of a major international conflict which it was hoped would never be followed by another like it. After all, those optimistic about the trend of things (at least, in the politically orthodox way) imagined that conflict's end as auguring the arrival of a more orderly, peaceful--and prosperous--world, with many a parallel between the two eras quite striking. On both occasions the U.S. had emerged from the conflict as victor, hyperpowered arbiter of the world's fate, and, in the "American way," the pointer to everyone else's future--amid a financial boom and euphoria over a supposedly epochal revolution in productivity and consumerism bound up with new technology, and immense self-satisfaction about freer, "liberated" lifestyles--all the while ignoring anything that gave the lie to those illusions, dismissing the financial crises and the international crises as mere bumps on the road which they insisted were quite manageable by the deified Overseers of the Global Economy, and the stirrings of radicalism at home and abroad (the country certainly had its "status politics," its "culture wars") as much ado about nothing on the wrong side of the end of history.
Considering it, Barnett--who, it must be noted again, was fashionable because he was conventional--was on the whole optimistic that the challenges could remain manageable in the long term. (Hence that Blueprint for Action.) Those taking a less sanguine view of these developments thought otherwise, and they have since proven correct: just as the illusions of the '20s died, so did those of the '90s.
When we look back at the '20s the tendency is to think of them as having come to an end in 1929, with "the Great Crash." Of course, the end of the mood we associate with the decade was not so obviously tidy. But it does seem that the illusions of the '90s may have been a longer time dying than those of the '20s, dying only a bit at a time, with the way things played out enabling the denials to last longer. One may recall, for example, the rush to declare the financial crisis that broke out in 2007-2008 past--such that even an Adam Tooze, when buckling down to study the event properly, was himself surprised to conclude in a book published a decade later that it had never gone away, as it still has not, simply merging with other crises, like the COVID-19 pandemic and its own attached economic crisis (also not behind us, even if some pretend it is), into something bigger and worse and scarier. (How big and bad and scary? Well, according to one calculation, even before the pandemic economic growth rates had either virtually flatlined or turned significantly negative for most of the planet--which makes all the backlash against neoliberalism--the votes for Trump, Britain's exit from the EU, and all the rest--that much less surprising.) Meanwhile, if any doubts had remained after a decade of intensifying and increasingly militarized conflict among the great powers, the war in Ukraine has made it very, very clear that the actuality of such things--open, large-scale, sustained interstate warfare between large nation-states in the middle of Europe, and escalating confrontation between NATO and Russia--is a significant and worsening part of our present reality.
Looking at the news I do not get the impression that very many have properly processed the fact yet. But the neo-'20s mood that characterized the '90s, and lingered in varying ways and to varying degrees long after the years on the calendar ceased to read 199-, seems ever more remote these days, any indication otherwise ever more superficial.
Thursday, May 5, 2022
Did Anyone Actually Read Paul Kennedy's The Rise and Fall of Great Powers?
I have recently remarked on what makes for a nonfiction bestseller generally--which, of course, leaves little space for anything that could be called "history." Of course, we do see history reach a wider audience--but only within that demand the public makes for the affirmative and the entertaining. Thus it is what Michael Parenti called gentlemen's history--history by, of and for the comfortable, who are supposed to feel comfortable during and after reading it; history which is conservative and "patriotic" (in the sense of loyalty to those in power, rather than to their country's well-being) and, in line with all that, self-congratulatory (from the standpoint of the elite in question).
Meanwhile, in its tending to be Great Man-centered it tends toward the personal and the narrative--to, indeed, being biography rather than history. (As A.J.P. Taylor remarked, the two genres are actually very different--in the former the individual is everything and society nothing; in the latter, the individual nothing and society everything.) It also tends, even while presenting its figures in a heroic light, toward the gossipy. (Taylor remarked, too, that a "glamorous sex life" was a prerequisite for a successful biography.)
As Jeremy Black demonstrates, all of this translates over to military history, which is dominated by the biography-memoir-operational account--by the Great Captain subgenre of the Great Man genre, in which such Captains are presented as the dominating figures of the Decisive Battles of History, the same battles over and over and over again (with Britain's portion of the Napoleonic Wars, the U.S. Civil War, and the portions of the two world wars those countries experienced pretty much it for the more popular market in Britain and the U.S.).
One may add that, even in comparison with much other history, it tends especially heavily to the conservative and patriotic--to the hero-worship of generals, nationalistic flag-waving and the rest.
All of this was much on my mind when considering the reception of Paul Kennedy's The Rise and Fall of Great Powers. Certainly a work of history, and very reasonably readable as a work of military history, it stayed on the New York Times hardcover nonfiction bestseller list for 34 weeks--in spite of its being a very different book indeed. Far from offering personal narrative, in it Kennedy presents an academic thesis resting on a detailed examination of five hundred years of Western and world history, where the "characters" are not individuals but entire nations and empires, whose development and clashing, ascent and descent, are construed not as the deeds of so-called Great Men but as the working-out of the hard material facts of geography, technology and demographics, of industries and institutions. Of battles, campaigns and wars there are plenty, but little of tactics and strategy and even less of generalship, with what really mattered being the way resources, and the matching of resources to objects, told in the crunch.
Covering so much territory even in a seven hundred page volume, of course, means that Kennedy treats any one bit in only so much detail (as is the more evident if one compares it to, for example, his earlier, Britain-focused treatment of the same theme in The Rise and Fall of British Naval Mastery, which, by the way, I recommend highly to anyone interested in the subject). Still, the quantitative data alone is, by the standard of popular works, immense, as attested by the inclusion of over fifty charts and tables, with the academic character of the work underlined by the 83 pages of notes and 38 pages of bibliography appended to the over five-hundred-page main text. Kennedy writes clearly and well, but it is an undeniably data-heavy, analytically oriented work, with no attempt to enliven the proceedings with what an editor might call "color."
And of course, it was anything but self-congratulatory in the sense discussed here.
Considering Kennedy's book I find myself also considering another major--and similarly unstereotypical--bestseller of 1988, Stephen Hawking's A Brief History of Time. Hawking's book was much shorter (256 pages to the 677 pages of Kennedy's book), and while intellectual hierarchy-addicted morons such as Hollywood writes for take it as a given that physics is the most demanding field of intellectual endeavor, the reality is that even by pop science standards it seemed to me "easy," while, I might add, Hawking's tone was sprightly. He clearly meant to produce a book that a broad audience could get something out of, and in my view did so. Kennedy most certainly did not set out to do the same. The result is that, if Hawking's book is, as I have seen it called, the most widely sold unread book in history, I would imagine that very few bothered to read Kennedy's book all the way through--an opinion that Kennedy himself seems to share. He has publicly remarked--joked?--that he didn't "think many people read more than the final chapter on the US and the USSR"--and I would imagine that many more simply knew the alleged contents of the chapter secondhand.
Wednesday, May 4, 2022
Emmanuel Todd, China and the Graying of the World
In a recent interview sociologist and demographer Emmanuel Todd, discussing the matter of China's rise, argued that the country's far-below-replacement fertility rate (which Todd puts at 1.3 children per woman) makes unlikely the visions of the country as hegemon, for the simple reason that its labor force is bound to contract sharply, with massive implications for its already slowing economy and its national power.
Considering this I find myself thinking of three counterarguments:
1. The 1.3 Total Fertility Rate (TFR) for China was registered in the wake of the pandemic, with its associated economic and other stresses. Before that it was up at about 1.7--a significant difference, such that rebound is hardly out of the question.
2. Even if one takes the 1.3 TFR as a "new normal" for China the rate in question is not only evident across its neighborhood, but actually more advanced in many neighboring countries. (Japan's TFR was scarcely above that before the pandemic, just 1.36, while South Korea's slipped below 1 in 2018 and was at 0.92 in 2019, according to World Bank figures.)
3. Even if the drop were to go further in China than elsewhere, China, with a population of 1.4 billion, would, even after a much more severe demographic contraction than in neighboring states (a scenario hardly in the cards), remain a colossus relative to those states (Japan today having scarcely an eleventh of China's population, South Korea one twenty-eighth).
Still, China's contraction is coming at a point at which it is rather poorer than neighbors like Japan and South Korea (with a per capita Gross Domestic Product of $10,000 a year, versus $40,000 for Japan, $30,000 for South Korea)--and already seeing its economic growth slow sharply (those legendary 10 percent a year rates a thing of the past, with the 2012-2019 average more like 7 percent and still falling). The result is that the demands of an aging population could weigh that much more heavily on its resources.
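To put the scale point in concrete terms, the back-of-the-envelope arithmetic behind it can be sketched in a few lines of Python. This is only an illustration using the round figures cited above--the population and per capita GDP numbers are this post's approximations, not precise statistics:

populations_m = {"China": 1400, "Japan": 126, "South Korea": 51}            # population, millions (approx.)
gdp_per_capita = {"China": 10_000, "Japan": 40_000, "South Korea": 30_000}  # per capita GDP, USD/year (approx.)

for country, pop_m in populations_m.items():
    # aggregate GDP in trillions of dollars = population x per capita GDP
    aggregate_gdp_tn = pop_m * 1e6 * gdp_per_capita[country] / 1e12
    note = ""
    if country != "China":
        note = f" (~1/{populations_m['China'] / pop_m:.0f} of China's population)"
    print(f"{country}: aggregate GDP roughly ${aggregate_gdp_tn:.1f} trillion{note}")

Run with these figures, the sketch gives China an aggregate economy on the order of $14 trillion, against roughly $5 trillion for Japan and $1.5 trillion for South Korea--the point being that even at a quarter of Japan's per capita output China's sheer population keeps it a colossus, even as that lower per capita income leaves it with thinner resources per dependent as it ages.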
All the same, how much the fact will matter ultimately depends on how societies handle the matter of their aging populations. One can picture a scenario in which modern medicine succeeds in alleviating the debilitating effects of getting older, permitting older persons to need less care. One can also picture a scenario in which rising economic productivity more than makes up for the decline of the labor supply and the rise in the dependency ratio (perhaps by lowering the cost of living). In either case, or one combining the benefits of both, the demographic transition may turn out to be managed easily enough, in China and elsewhere. Yet one can picture less happy scenarios as well--and I am sorry to say, rather easily in light of the disappointments of recent decades on all these scores. But even in that eventuality I would not be too quick to envision the melodramatic collapse scenarios making the rounds of the headlines yet again in recent months.
Tuesday, May 3, 2022
What Ever Became of the Information Age?
You may remember having heard the term "information age"--but it is entirely possible you have only a vague notion of what it meant. This may be because it has been a long time since you heard it last, but also because the term is slippery, having many usages.
Like the terms "atomic age," "jet age," "space age" "information age" can mean an era in which a revolutionary technology has arrived on the scene--and while "information technology" is not really new (writing, and even spoken language, is describable as an "information technology") there is no question that the significance of the electronic computer and its associated communications systems, in their various forms, represented something different from what came before. And indeed the information age came to pass in this sense.
Like the term "Industrial Age" "Information Age" can also denote a shift in the particular, fundamental conditions of work and consumption. The industrial age saw the decline of the rural, agrarian, peasant way of life as the norm as a revolutionary, inanimate energy-powered machine-based form of mass manufacturing became the predominant condition of our existence (employing a quarter of the American labor force at mid-century, while overwhelmingly accounting for the rise in material output and living standards). Likewise the information age held out the prospect of a great increase in the work effort devoted to, in one way or another, producing, processing and communicating information--as the volume of information being produced, processed and communicated exploded. And this, too, did come to pass.
However, the term had other meanings. Of these the one that was most exciting--because it was the one that could really, really make it matter in a way that would merit speaking of A New Age--was the idea that information itself, which has always been substitutable for other economic inputs like land and capital and labor, and substituted for them (this was how the Industrial Age happened, after all, with the technical know-how to exploit those energy sources and build all the other machines eventually enabling massive labor substitution), would become so radically substitutable for everything else that we would in this respect altogether transcend the smokestack, raw material-processing, secondary sector-centered Industrial Age. Thus, if the supply of some good ran short, information-age INNOVATION! would promptly turn scarcity into abundance, with what was promised for nanotechnology exemplary (the radical new materials like carbon nanotubes that would be stronger and lighter and better than so much else, the molecular-scale assemblers that, working atom by atom, would waste not and leave us wanting not). Increasingly suspending the bad old laws of "the dismal science," this would explode growth, even as it liberated growth from reliance on natural resources and the "limits to growth" they imposed, solving the problems of both material scarcity and our impact on the natural environment--socially uplifting and ecological at once. Indeed, thinkers came to speak of literally everything in terms of "information," of our living in a world not of matter and energy but of information that we could manipulate as we do lines of computer code if only we knew how--as they were confident we soon would, down to our own minds and bodies (most notoriously in the mind uploading visions of Ray Kurzweil and other Singularitarians).
In the process the word "information" itself came to seem fetishistic, magical, not only in the ruminations of so-called pundits mouthing the fashionable notions of the time, but at the level of popular culture--such that in an episode of Seinfeld Jerry's neighbor, the postal worker Newman, wanting to remind Jerry that he was a man not to be trifled with, told him in a rather menacing tone that "when you control the mail, you control information."
The line (which has become an Internet meme) seemed exceedingly contemporary to me at the time--and since, as distinctly '90s as any line can get--precisely because, as I should hope is obvious to you, the information age in this grander sense never came to pass. Far from our seeing magical feats of productivity-raising, abundance-creating INNOVATION!, productivity growth collapsed--proving a fraction of what it had been in the heyday of the "Old Economy" at which those lionizers of the information age sneered. Meanwhile we were painfully reminded time and again that at our actually existing technological level economic growth remains a slave to the availability and throughput of natural resources, with the cheap commodities of the '90s giving way to exploding commodity prices in the '00s that precipitated a riot-causing food-and-fuel crisis all over the world. If it is indeed the case that the world is all "just information," to go by where we are in 2022 (in which year we face another painful reminder of our reliance on natural resources as the war in Ukraine precipitates yet another food-and-fuel crisis) the day when we can manipulate matter like microcode remains far off.
Unsurprisingly the buzzwords of more recent years have been more modest. The term one is more likely to hear now is the "Fourth Industrial Revolution"--the expectation that the advances in automation widely projected will be as transformative as the actually existing information age may plausibly be said to have been--but not some transcendent leap beyond material reality.
I do not know for a fact that a Fourth Industrial Revolution is really at hand--but I do know that, being a rather less radical vision than those nano-assembler-based notions of the '90s, the thought that it may be so bespeaks how even our techno-hype has fallen into line with an era of lowered expectations.
Like the terms "atomic age," "jet age," "space age" "information age" can mean an era in which a revolutionary technology has arrived on the scene--and while "information technology" is not really new (writing, and even spoken language, is describable as an "information technology") there is no question that the significance of the electronic computer and its associated communications systems, in their various forms, represented something different from what came before. And indeed the information age came to pass in this sense.
Like the term "Industrial Age" "Information Age" can also denote a shift in the particular, fundamental conditions of work and consumption. The industrial age saw the decline of the rural, agrarian, peasant way of life as the norm as a revolutionary, inanimate energy-powered machine-based form of mass manufacturing became the predominant condition of our existence (employing a quarter of the American labor force at mid-century, while overwhelmingly accounting for the rise in material output and living standards). Likewise the information age held out the prospect of a great increase in the work effort devoted to, in one way or another, producing, processing and communicating information--as the volume of information being produced, processed and communicated exploded. And this, too, did come to pass.
However, the term had other meanings. Of these the one that was most exciting--because it was the one that could really, really make it matter in a way that would merit speaking of A New Age--was the idea that information itself, which has always been substitutable for other economic inputs like land and capital and labor, and substituted for them (this was how the Industrial Age happened, after all, the technical know-how to exploit those energy sources and build all the other machines enabling eventually massive labor substitution), would become so much radically substitutable for everything else that we would in this respect altogether transcend the smokestack, raw material-processing, secondary sector-centered Industrial Age. Thus, if the supply of some good ran short, information-age INNOVATION! would promptly turn scarcity into abundance, with what was promised for nanotechnology exemplary (the radical new materials like carbon nanotubes that would be stronger and lighter and better than so much else, the molecular-scale assemblers that, working atom by atom, would waste not and leave us wanting not). Increasingly suspending the bad old laws of "the dismal science," this would explode growth, even as it liberated growth from reliance on natural resources and the "limits to growth" they imposed, solving the problem of both material scarcity and our impact on the natural environment--socially uplifting and ecological at once. Indeed, thinkers came to speak of literally everything in terms of "information," of our living in a world not of matter and energy but information that we could manipulate as we did lines of computer code if only we knew how, as they were confident we would soon know how, down to our own minds and bodies (most notoriously in the mind uploading visions of Ray Kurzweil and other Singularitarians).
In the process the word "information" itself came to seem fetishistic, magical, not only in the ruminations of so-called pundits mouthing the fashionable notions of the time, but at the level of popular culture--such that in an episode of Seinfeld in which Jerry's neighbor, the postal worker Newman, wanting to remind Jerry that he was a man not to be trifled with, told him in a rather menacing tone that "When you control the mail, you control information."
The line (which has become an Internet meme) seemed exceedingly contemporary to me at the time--and since, as distinctly '90s as any line can get, precisely because, as I should hope is obvious to you, the information age in this grander sense never came to pass. Far from our seeing magical feats of productivity-raising, abundance-creating INNOVATION!, productivity growth collapsed--proving a fraction of what it had been in the heyday of the "Old Economy" at which those lionizers of the information age sneered. Meanwhile we were painfully reminded time and again that at our actually existing technological level economic growth remains a slave to the availability and throughput of natural resources, with the cheap commodities of the '90s giving way to exploding commodity prices in the '00s that precipitated a riot-causing food-and-fuel crisis all over the world. If it is indeed the case that the world is all "just information," to go by where we are in 2022 (in which year we face another painful reminder of our reliance on natural resource as the war in Ukraine precipitates yet another food-and-fuel crisis) the day when we can manipulate matter like microcode remains far off.
Unsurprisingly the buzzwords of more recent years have been more modest. The term one is more likely to hear now is the "Fourth Industrial Revolution"--the expectation that the advances in automation widely projected will be as transformative as the actually existing information age may plausibly be said to have been--but not some transcendent leap beyond material reality.
I do not know for a fact that a Fourth Industrial Revolution is really at hand--but I do know that, being a rather less radical vision than those nano-assembler-based notions of the '90s, the thought that it may be so bespeaks how even our techno-hype has fallen into line with an era of lowered expectations.
Tuesday, April 26, 2022
Checking in With Emmanuel Todd
Over the years I have often found Emmanuel Todd an original, provocative thinker. In The Final Fall he displayed considerable insight into the weaknesses of the Soviet Union in its later years, enough so as to accurately predict important aspects of its final dissolution (not least, the way the reform process required to redress the country's economic stagnation would unleash centrifugal forces, tearing away first the Warsaw Pact satellites, and then the non-Russian republics).
However, Todd has also proven fairly wide of the mark on a number of important occasions, not least in his analysis of the "fall" of the American Empire. In that book he claimed that an ever more deindustrialized and debt- and bubble-reliant U.S. economy would soon be revealed as an Enron-like house of cards, leading observers to downgrade America's GNP in a manner comparable to the downgrading of the Soviet GNP at the time of that country's collapse. Moreover, he predicted that the "American fall" he described--a matter of the country's means not only being recognized as smaller than advertised but of America's having to "live within" those smaller means (no longer able to run its colossal trade deficits seemingly consequence-free)--was to be brought much nearer than would otherwise be the case not by American action, but by Europe's coming into its own. Todd specifically projected Europe coming to include Britain as a full-fledged member of the European project, and embracing Russia as well--the former bringing to the assembly its position as a global financial center, the latter its vast population, natural resources and military assets--with the result the end of European reliance on and subordination to the United States, and the end of the special privileges of "the dollar." Capping this off was Todd's prediction that the end of American predominance would mean the end of neoliberalism, with the world returning to a Keynesian economics that would facilitate growth and development around the world, while Europe's charting a course apart from American neoconservatism would conduce to stability and progress at home and abroad.
Of course, absolutely none of that happened. No such downgrading of the U.S. economy's weight ever occurred. Meanwhile, far from Britain and Russia entering the fold to make Europe the world's indisputable greatest power, Britain exited the European Union entirely (with a closer relationship with the U.S. much on the minds of the advocates of that course), while, in sharp contrast with Todd's expectation, Europe and Russia grew apart rather than closer. And for what it is worth, European elites' connections with the U.S. were if anything affirmed by the 2007-2008 economic crisis (as Adam Tooze notes, it was a trans-Atlantic banking crisis rather than an American banking crisis, and only the U.S. had the sheer scale to deliver the bailout), while those same elites proved themselves second to none in their attraction to neoliberalism (even if their publics made the implementation of the program slower than they would have liked), and fairly inclined to neoconservatism as well (displaying the same kind of interventionism from Mali to Syria and beyond)--and all that to such a degree that the English-language press stopped sneering at and started praising the continent's governments.
All the same, even when wrong Todd made a sufficiently interesting case to leave us something to think about, rather more so than innumerable Public Intellectuals with infinitely higher profiles. And indeed, as he was someone who had at least occasionally got a good deal of mainstream notice, I wondered why we did not hear of him more consistently. I initially supposed that this was a matter of Anglophone insularity, but it seemed he was not terribly present in the French press either--and I was surprised to find an interview with him in Japan's Mainichi Shimbun in which he informed them that
he does not respond to interviews in France, where the media does not permit levelheaded debate. But because Japan is a safety zone for him, he continues, he does interviews for the Japanese media.
I can't say that I'm terribly surprised by his assessment of the French media. However, I suspect that Japan's being a "safety zone" for him is more a function of the "hot buttons" he addresses at home having rather less emotive effect there, while a homegrown counterpart to Todd would probably find his country's media as inhospitable as Todd does France's.
Monday, April 11, 2022
The Return of Space-Based Solar Power to the Conversation?
Back in the 1970s a great deal was said of the prospect of space-based solar power--of massive arrays of photovoltaic solar panels placed in orbit which would transmit the electricity they generated back down to Earth, with Gerard K. O'Neill famously offering a particularly detailed proposal of the type in that '70s-era space development classic The High Frontier. (The tired sneer of the renewables-bashers is that the sun does not shine all the time. But the sun really does shine all the time in space, permitting a much more consistent and greater output from solar panels situated in orbit than on Earth.)
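To put rough numbers on that advantage, here is a minimal back-of-the-envelope sketch in Python comparing annual output per kilowatt of panel capacity on the ground and in orbit. The capacity factors are illustrative assumptions of my own--roughly 20 percent for a reasonably sunny terrestrial site, and nearly 100 percent for a satellite in geostationary orbit, which is eclipsed only briefly around the equinoxes--rather than figures from any particular study.

```python
# Back-of-the-envelope comparison of annual output per kilowatt of panel
# capacity on the ground versus in geostationary orbit. The capacity
# factors below are illustrative assumptions, not measured figures.

HOURS_PER_YEAR = 8760

ground_capacity_factor = 0.20   # assumed: a reasonably sunny terrestrial site
orbital_capacity_factor = 0.99  # assumed: GEO satellites see only brief
                                # eclipses around the equinoxes

def annual_kwh_per_kw(capacity_factor: float) -> float:
    """Annual energy (kWh) produced per kW of installed panel capacity."""
    return capacity_factor * HOURS_PER_YEAR

ground = annual_kwh_per_kw(ground_capacity_factor)
orbit = annual_kwh_per_kw(orbital_capacity_factor)
print(f"Ground: {ground:,.0f} kWh per kW-year")
print(f"Orbit:  {orbit:,.0f} kWh per kW-year (~{orbit / ground:.1f}x the ground figure)")
```

On these assumptions a panel in orbit delivers several times the annual energy of the same panel on the ground--which is the whole appeal of the idea, before one gets to the question of what it costs to put the panel up there.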
Of course, no such project ever materialized. There were many reasons for that, among them the unswerving commitment of business to fossil fuels, and government commitment to business' reading of its interests (which, to the lament of those concerned for climate change, endures almost unaltered). But there was also the reality that a crucial part of such plans--given the sheer amount of infrastructure that had to be constructed in space--was bringing down the very high cost of space launch. Key to this vision generally, and O'Neill's vision in particular, was the expectation that the space shuttle--which was, as the name indicates, expected to indeed be a shuttle, with a rapid turnaround time providing very regular Earth-to-orbit transit--would produce a drastic fall in space launch costs, with three to four flights a month thought plausible.
Alas, rather than three or four flights a month the shuttles we got in practice could at best manage three or four flights a year--while as the fate of the Challenger and Columbia tragically showed, the risk of their failing to return safe and sound from a mission was well over one percent. As might be expected, the space shuttle was anything but a "shuttle," and while the cost estimates vary greatly, absolutely no one regards it as having cut the price of space launch the way its proponents had hoped. The result was that any attempt to utilize space-based solar power on any significant scale was prohibitively expensive in the circumstances.
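The gap is easy to see with a little arithmetic. The sketch below uses the program's actual record as I understand it (roughly 135 missions over some thirty years, with two orbiters lost) against the hoped-for cadence cited above; treat the figures as approximate.

```python
# Rough arithmetic behind the shuttle comparison. The program figures are
# approximate (about 135 missions flown over roughly 30 years, two vehicles
# lost); the "hoped-for" cadence is the three to four flights a month
# mentioned in the post.

missions_flown = 135          # approximate total shuttle missions, 1981-2011
vehicles_lost = 2             # Challenger (1986) and Columbia (2003)
program_years = 30

loss_rate = vehicles_lost / missions_flown
actual_per_year = missions_flown / program_years
hoped_per_year = 3.5 * 12     # midpoint of "three to four flights a month"

print(f"Loss rate per mission:  {loss_rate:.1%}")          # ~1.5%, well over 1%
print(f"Actual flight rate:     {actual_per_year:.1f}/yr")  # a handful a year
print(f"Hoped-for flight rate:  {hoped_per_year:.0f}/yr")   # an order of magnitude more
```

An order-of-magnitude shortfall in flight rate, with a roughly one-in-seventy chance of losing the vehicle each time, is simply not the transportation system the space solar schemes were premised on.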
Still, the idea never altogether went away, and has received renewed attention in the wake of a British government proposal to pursue such a project. Plausibly also contributing to this attention are the claims by sympathetic analysts that SpaceX has succeeded in achieving lower space launch costs (not nearly so low as O'Neill had banked on--five times O'Neill's figures, in fact, $2500 a pound or so to low Earth orbit as against the $500 or so O'Neill had in mind--but still a considerable improvement); while photovoltaic solar panels have become an ever cheaper way of generating electricity (indeed, the cheapest ever), as well as thinner and lighter, with all that implies for the possibility of designing lighter, more compact and therefore more cost-effective space-based arrays to cut down on cargo size and launch cost.1
I am generally sympathetic both to space development and to renewable energy, but I also have to admit my doubts in regard to this particular combination of them--in part because every gain in the efficiency of solar panels that makes electricity production from space-based solar cheaper and more efficient also makes terrestrially-based solar cheaper and more efficient, minus the immense launch costs and the difficulties posed by the continued lack of convenient, regular physical access. (Earth orbit is a crowded, dangerous place--and a massive investment in such a project seems problematic at best if we do not have the capacity to effect repairs in the case of an accident or a collision with a meteoroid or piece of space debris.) Barring a much more drastic fall in launch costs than even those most sympathetic to SpaceX have claimed, or the advent of the kind of robotics capability that would make humans completely unnecessary to the construction, maintenance and repair of such an infrastructure, or preferably both, the idea seems to me impractical. Indeed, for the time being the safer course seems to me to be developing solar power on Earth, with RethinkX's "Clean Energy Super Power" concept a more compelling approach to the problems posed by the intermittency of solar-generated electricity--and one deserving of far more attention than it has received to date.1
1. O'Neill's 1976 book estimated that the cargo variant of the space shuttle on which he was counting would get cargo up to low Earth orbit at the price of $110 a pound. Adjusted for inflation using the Consumer Price Index $110 in 1976 would be the equivalent of about $520 in 2021.
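For the curious, the footnote's inflation adjustment and the cost comparison above work out as follows in a few lines of Python. The CPI values are approximate annual averages, and the $2,500 figure is the cost claimed by SpaceX's sympathizers as cited in the post, not a figure I am vouching for.

```python
# The footnote's inflation adjustment and the post's launch-cost comparison,
# worked through. CPI values are approximate annual averages (CPI-U); the
# $2,500/lb figure is the claimed SpaceX cost cited above.

CPI_1976 = 56.9    # approximate annual average CPI-U for 1976
CPI_2021 = 271.0   # approximate annual average CPI-U for 2021

oneill_1976_per_lb = 110                      # O'Neill's assumed shuttle cargo cost
oneill_2021_per_lb = oneill_1976_per_lb * CPI_2021 / CPI_1976
claimed_spacex_per_lb = 2500                  # figure claimed by sympathetic analysts

print(f"O'Neill's $110/lb (1976) ~= ${oneill_2021_per_lb:,.0f}/lb in 2021 dollars")
print(f"Claimed current cost: ${claimed_spacex_per_lb:,}/lb "
      f"(~{claimed_spacex_per_lb / oneill_2021_per_lb:.0f}x O'Neill's assumption)")
```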
Saturday, April 9, 2022
The Fortieth Anniversary of the Fifth Generation Computer Systems Initiative--and the Road Ahead for Artificial Intelligence
This month (April 2022) marks the fortieth anniversary of the Japanese government's Ministry of International Trade and Industry's announcement of its Fifth Generation Computer Systems initiative. These computers (said to represent another generation beyond first-generation vacuum tubes, second-generation transistors, third-generation integrated circuits, and fourth-generation very large-scale integrated circuits because of their use of parallel processing and logic programming) were, as Edward Feigenbaum and Pamela McCorduck wrote in Creative Computing, supposed to be "able to converse with humans in natural language and understand speech and pictures," and "learn, associate, make inferences, make decisions, and otherwise behave in ways we have always considered the exclusive province of human reason."
Coming from the very institution credited with being the "brains" behind the post-war Japanese economic miracle, just as "Japan, Inc." was approaching the peak of its global prominence and power--not least on the basis of Japan's industrial excellence in the computing field--this claim, which meant nothing short of the long-awaited revolution in artificial intelligence being practically at hand, was taken very seriously indeed. In fact the U.S. and British governments, under administrations (those of Ronald Reagan and Margaret Thatcher, respectively) which were hardly fans of MITI-style involvement with private industry, answered Japan's challenge with initiatives of their own.
The race was on!
Alas, it proved a race to nowhere. "'Fifth Generation' Became Japan's Lost Generation" sneered the title of a 1992 article in The New York Times, which went so far as to suggest that American computer scientists had cynically overstated the prospects of the Japanese government attaining its stated goal in order to squeeze a bit more research funding out of their own government. While one may argue about the reasons for this, and their implications, the indisputable, bottom-line fact is that computers with the capabilities in question, based on those particular technologies or any other, never happened. Indeed, four decades after the announcement of the initiative, after astronomical increases in computing power, decades of additional study of human and machine intelligence, and the extraordinary opportunities for training such intelligences provided by broadband Internet, we continue to struggle to give computers real functionality along the lines that were supposed to be imminent four decades ago (the ability to converse in natural language, understand speech and pictures, make human-like decisions, etc.)--so much so that the burst of excitement we saw in the '10s about the possibility that we were "almost there" has already waned amid a great lowering of expectations.
In spite of the briskness of developments in personal computing over the past generation--in the performance, compactness and cheapness of the devices, the speed and ubiquity of Internet service, and the uses to which these capabilities have been put--it can seem that in other ways the field has been stagnant for a long time. Those first four generations of computing arrived within the space of four decades, between the 1940s and 1970s. Since the 1970s we have, even while doing remarkable things with the basic technology, remained in the fourth generation for twice as long as it took us to go from generations zero to three. And in the face of this discouraging fact one may think that we always will be. But I think that goes too far. If in 2022 we remain well short of the target announced in 1982, we do seem to be getting closer, if with what can feel like painful slowness, and I would expect us to go on doing so--though there seems plenty of room for argument about how quickly we will accomplish that.
For whatever it may be worth, my suspicion (based on how neural nets, after disappointing in the '90s, delivered surprising progress in the '10s once married to faster computers and the Internet) is that the crux of the problem is hardware--that our succeeding, or failing, to build sufficiently powerful computers is going to be the single most important factor in whether we build computers capable of human-like intelligence, because ultimately they must have the capacity to simulate it. This would seem to simplify the issue in some respects, given the steadiness of the growth in computing power over time, but it is, of course, uncertain just how powerful a computer has to be to do the job, while continued progress in the area faces significant hurdles. Post-silicon computer architectures have been slow to emerge, with the development of the carbon-based chips that had looked like the logical successor running "behind schedule," while more exotic possibilities like quantum computing, if potentially far more revolutionary and looking more dynamic, remain a long way from being ready for either really cutting-edge or everyday use. Still, the incentive and resources to keep forging ahead are undeniably there--while it may well be that, after all the prior disappointments, we have less far to go here than we may think.
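To illustrate just how open that "how powerful" question is, here is a toy sketch--purely illustrative, not drawn from any particular study--of a crude synapse-counting estimate of the compute needed to simulate a brain in real time. The synapse count is a commonly cited order-of-magnitude figure; the firing rates and operations-per-event are assumptions I have made up for the exercise, and varying them within plausible ranges swings the answer by several orders of magnitude, which is roughly the spread one finds among published estimates as well.

```python
# A toy illustration of why "how powerful a computer has to be" is so
# uncertain: a crude synapse-counting estimate of the compute needed to
# simulate a brain in real time. The synapse count is a commonly cited
# order-of-magnitude figure; the firing rates and ops-per-event are
# assumptions chosen purely to show the spread.

synapses = 1e14            # commonly cited order-of-magnitude estimate

scenarios = {
    "low":  {"avg_firing_hz": 0.1,  "ops_per_event": 1},
    "mid":  {"avg_firing_hz": 1.0,  "ops_per_event": 10},
    "high": {"avg_firing_hz": 10.0, "ops_per_event": 100},
}

for name, s in scenarios.items():
    ops_per_second = synapses * s["avg_firing_hz"] * s["ops_per_event"]
    print(f"{name:>4}: ~{ops_per_second:.0e} operations/second")

# The answers span roughly 1e13 to 1e17 operations a second--several orders
# of magnitude apart--from nothing more than a change of plausible-seeming
# assumptions.
```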
Friday, April 8, 2022
Australia's Nuclear Sub Program: The Global Britain Angle
In considering the Australian decision to acquire nuclear submarines in a deal made with Britain and the U.S. my thoughts turned back to Britain's "tilt to the Indo-Pacific"--the British government's decision to focus British foreign policy, and reorient its military policy, on the region, in a break with the European emphasis that has prevailed since the 1960s.
In considering that move one fact of the situation I have repeatedly noted is that Britain's ability to project force into the region is relatively limited, especially as that region becomes more militarized--with Japan acquiring attack carriers and India a nuclear sub fleet, and Australia expanding its old force of diesel subs and frigates/destroyers into something much larger and more ambitious, reducing the "value" of what Britain can bring from so far away. (Already in the '60s the country's Far East forces, while vastly larger than anything Britain could really afford to station in the area, were inadequate to make being "east of Suez" worthwhile.)
However, Britain's capacity to provide technology that as yet few others can may be a handy supplement to such resources--especially where the resources are so sensitive. Apart from the U.S.' provision of technical support to Britain's nuclear submarine program, and Russian collaboration with India in the development of its own nuclear sub program (which has seen India lease working Russian vessels, in the '80s and again in this century), I cannot think of anything to compare at all with the new deal. Certainly one possible form of the deal that some have suggested (given Australia's lack of a nuclear industry)--Australia's purchase of nuclear subs outright, possibly from Britain--simply has no precedent.
It is also no isolated action. Indeed, it may be useful to think of how some proponents of a post-Brexit Britain have suggested stronger ties to the Commonwealth--in this case, a relatively large piece of the Commonwealth in the crucial Indo-Pacific arena--as a replacement for its continental connections, with the sub deal a building block for a broader partnership with Australia that would strengthen Britain's local influence. Such an approach seems the more plausible given that Indo-Pacific-minded Britain has already turned to a collaboration--if a rather less sensitive and controversial one--with Japan to produce a sixth-generation fighter.
Meanwhile, even as they strengthen Britain's military connections with nations in East Asia, such deals can be seen as conducing to the strength of the British military-industrial base, which remains a key strategic asset for the country, more important than many appreciate. Like Russia, Britain is a nation which has suffered considerable deindustrialization but remains possessed of a disproportionately large and advanced military-industrial complex--not least because, even as British policy from Thatcher forward proved ready to sacrifice the country's manufacturing base for the sake of the bigger neoliberal program, the defense-industrial portion of the sector continued to get government support (with Thatcher herself making a personal lobbying effort to clinch the infamous "deal of the century" with the Saudis back in '88), with the result that the complex's political and economic importance is also disproportionate. As the cost and complexity of weaponry continue to grow, exports become only more important as a way of keeping such a base viable--while what remains of Britain's manufacturing is that much more dependent on it.
Selling Australia critical technology--and perhaps even its own versions of the Astute-class submarine--might not balance the country's payments by itself. However, it also does not evoke the derisive laughter that the "tea and biscuits" plan did.