Thursday, May 19, 2022

American and British Political Discourse: A Note

Over the years I have had many an occasion to consider the differences in the political discourses of the U.S. and Britain. One of the most significant has been British news-watchers' greater attentiveness to economic and socioeconomic issues than their American counterparts'--on average they are more interested, better-informed, and more willing and able to discuss such matters substantively.

The difference exists in spite of a great many similarities. Britain, too, has its "centrism," which in a prior day undertook reform, but from an essentially conservative position with essentially conservative ends--fearful and resentful of the left while conciliatory to the right, culminating in a rightward shift over time and the withering of discussion, debate, electoral options regarding these matters (witness the trajectory of the Labour Party). Britain, too, has its "culture wars," with both sides in them quite deliberately deploying the politics of identity to divert the public from more material issues. Britain, too, has its "pundits" telling the public social class and class differences are nonexistent or irrelevant or minor, that any such thing as an Establishment is a leftist fantasy, that "neoliberalism" too is "not a thing." All of this reflects the reality that in Britain, too, the media ill serves the public where these matters are concerned (and just about every other, too).

Of course, one may also argue that if all this is the case in Britain it is less thoroughly so than in the U.S.: that centrism has not had such a tight grip on the more leftward of its parties (whom would one call the American Aneurin Bevan?); that in Britain the culture wars have not quite gone so far in blocking other issues out of the public consciousness; and that much that is conventional wisdom in America is not quite that in Britain, in part because for all its failings the country's media does offer somewhat more leeway for the sorts of marginalized views and voices attentive to such things. Still, as the trajectory of British discourse and policy in recent years makes clear (and as I imagine British leftists in particular would hasten to point out), one should not make too much of these qualifications. Rather, three things seem to me to have made the difference:

1. Britain's governance and public policy formation in these areas has so often been so much more centralized than in the U.S. This makes many an issue decided at the local or state level in the U.S. (local tax rates, much to do with education and health services, etc.), which people in various parts of the country decide in different ways, from different starting points and with different emotional charges, more completely a matter of national policy in Britain (with Thatcherism, in fact, strongly identified with centralization for the sake of implementing its program more fully). The result is that such issues are objects of national rather than local debate, with all the additional attention this brings them.

2. Britain's policy in the relevant areas has shifted much further left and much further right than American policy has over the course of the years. This is not to deny that in British life the "center" of the political spectrum may even now be regarded as lying leftward of the American center. However, the public sector, the welfare state, and organized labor were all built up to far greater heights in Britain than was ever seriously considered in the U.S. And whatever their limitations have been in practice (and these should not be overlooked), Britain had free college and free-at-point-of-service health care, and the U.S. never came close to anything like that. Meanwhile, circa 1980 three of every five British workers were members of a labor union, as compared with just one in five American workers, while British unions were unencumbered by anything remotely like Taft-Hartley or the bureaucratic top-heaviness of American unions, and were vastly more politically conscious and militant. Naturally the dismantling of all that, even if it still left Britain with more apparatus of this kind than the U.S. ever had, entailed a far bigger and more wrenching shift in economic and social life than what the U.S. has seen in the same period, with that much more controversy--reflected in the reality that, while Ronald Reagan has been a divisive figure in American history, in the American mainstream one simply does not see the expression of the kind of bitterness toward him that one sees in regard to Thatcher in Britain. All of this, again, makes it harder to slight the topic.

3. In Britain the reality of social class has simply been too ostentatious for too long to ignore. Where in the U.S. the national mythology is overwhelmingly devoted to the idea of the self-made individual (to the point that children of extreme socioeconomic privilege constantly pass themselves off as such, with a lickspittle press always enabling them), a monarchy remains the center of British public life, the legislature's upper house remains a House of Lords, and even for the native-born, old-stock Briton accent is scarcely less a marker of class origin than it was when George Bernard Shaw wrote Pygmalion. Even the difference in the usage of the term "middle class" appears telling. The attempts by British leaders to make the kind of vague use of the term that Americans do--to portray their nation, rich and poor alike, as somehow a middle-class nation--seem to have had rather less success than across the Pond. Rather, in British usage the term still retains a more exclusive (and meaningful) sense, with middle classness having a higher floor--and in the view of most a distinct ceiling too, many scoffing at David Cameron's calling himself "middle class."

Together these three factors--the centralization of political life, the extremity of the distance policy has moved along the political spectrum across the past century, and the greater difficulty of dismissing the role of class--make it far, far harder to avoid acknowledging the herd of thoroughly material elephants in the room that make so much difference in the large and the small of individual lives.

Wednesday, May 18, 2022

The Necessity of Political Language

It would seem reasonably uncontroversial that our ability to discuss a subject in a clear and rigorous way depends on our having a vocabulary adequate to permit that, not least by furnishing us with a satisfactorily extensive lexicon of terms with precise, coherent meanings for reference. However, many seem to see "political language" as serving no purpose whatsoever, and indeed something to be attacked on sight.

Why is this the case? Let us first of all admit that a good deal of language is, from the outset, muddled at the level of conception, with thinkers coining words and phrases that ultimately fail to illuminate, or which they may even deliberately intend to obscure, to confuse, to manipulate. While we are at it let us also get out of the way the reality that terms which do not originate that way can nonetheless end up being used that way. (Alan Sokal, you may remember, got pretty far stringing together favored buzzwords of people in certain categories of study in the humanities and social sciences in totally nonsensical fashion for the sake of exposing the game. A hermeneutics of quantum gravity indeed! It's the sort of phrase that a stupid person might come up with when trying to imagine how the intelligent speak.)

Yet all this is very far from the whole issue, and to focus on it can be misleading. After all, these are problems not just with political language, but with all language, despite which we generally manage to get along. But many play them up where politics is concerned for the most dubious of reasons. One is laziness, of course. Dismissal of the value of a body of knowledge is easier than actually acquiring it before we make our judgments. ("We'll never need math anyway," says many a frustrated student. Or grammar. Or spelling. Or anything else for which there seems to be an app. They are likely to find out otherwise.) Even those pursuing a bit of intellectual distinction can take the lazy route themselves, being all too familiar with the sorts of cheap maneuvers that impress the simple-minded--for example, dismissing existing categories of thought as if they have somehow transcended them--saying "Look at me and how much smarter I am than everyone else because I have seen through their bugbears! Look at me thinking outside the box!"--when in reality they have not done any thinking at all. When they have not even looked at the box or its contents, let alone managed to think outside them (and likely not even realized that "thinking outside the box" has degenerated into yet another atrocious corporate cliché whose utterance strongly suggests that the speaker is not doing what they presume to be doing).

However, other factors seem to me more consequential, not least the prevalence of a postmodernist intellectual orthodoxy that takes a very dim view of the capacities of human reason. This skepticism goes only so far in the realm of the physical sciences--one can't argue (very much) with the accomplishments of physicists, for example. But its adherents have more scope in the social realm, where one not only gets away with, but is lauded as displaying the greatest profundity for, doing the physics equivalent of saying "All that matter and motion and energy stuff is probably incomprehensible, and looking for patterns in it or explanations for it is pointless and pernicious. Best not do it and stick to as superficial a reading of things as we can."

This orthodoxy extends to a dim view of human language's descriptive powers, leading its adherents to dismiss all language as "language games" and words--especially words with heavy political meanings--as therefore meaningless, often in a tone suggesting this is indisputably obvious to all. That even amid all the superficiality, misuse and abuse of terms, words like "conservative," "liberal," "socialist," "capitalist" (or "neoconservative," "neoliberal," "fascist," "centrist") can still denote real, meaningful and important concepts or sets of concepts, and that even the shallower, diverging or outright erroneous usages may reflect and evoke and reinforce those patterns in telling and important ways, is something they steadfastly refuse to acknowledge--not least because these cheap and shabby evasions may serve their purposes, which lie not in furthering a conversation but in undermining a conversation they do not want to see happen at all.

Indeed, blanket attacks on political language are often a shabby cover for attacks on someone else's political language, for the sake of promoting one's own no less political nomenclature. I recall, for example, running into a recent discussion of centrism on Twitter. When the Tweeter in question raised a particular analysis of where centrism sits within the political spectrum one of the first people to respond sneeringly dismissed the left-to-right political spectrum as meaningless and incoherent and declared that people should think in terms of statism and anti-statism. What he suggested, of course, was our narrowing the political dialogue to a single issue along a single axis (most people at least let us have two!) apparently divorced from any intellectual premises whatsoever in an all too blatant attempt to shut down the conversation that Tweeter was trying to start and have one more congenial to him. As the person in question was a professional renewable-energy basher with endorsements for his book from certain high-profile figures his biases here were all too obvious. Simply put, he wanted to make it a matter of "libertarians" as the champions of "freedom" (he didn't define either term, of course, but anyone can guess what he had in mind) against everyone else.

Whatever one makes of his views, his unfortunate intervention (which smacked of that online discourse policing--that heckling--that I disdain regardless of who does it--and which can only seem at odds with any pretension to respect for freedom of speech) only underlined how we never get away from our reliance on a political vocabulary, and that rather than dismissing it in that lazy and pretentious way as so many do, what we ought to do is work to make that vocabulary useful.

Tuesday, May 17, 2022

In the 2020s the 1920s Have Become the 1930s

I remember how, back about the turn of the century, Thomas P.M. Barnett emerged as a national security counterpart to Thomas Friedman, one who could be characterized as devoting himself to explaining just how, exactly, "McDonnell Douglas" would back up "McDonald's."

Barnett's conformity to the globalization-singing conventional wisdom of the day made him the sort of fashionable Public Intellectual who got written up in places like Esquire (which magazine's "foreign-policy guru" he subsequently became).

I was rather less impressed than the folks at Esquire. Still, Barnett was astute enough to acknowledge that the whole thing could unravel, citing, in his book Blueprint for Action, Admiral William Flanagan on the possibility that "the 1990s might be a replay of the 1920s," raising "the question . . . What would it take for the 2000s to turn as sour as the 1930s?"

The analogy seems to me fair enough. Like the 1920s, the 1990s were a period after the end of a major international conflict which it was hoped would never be followed by another like it. Those optimistic about the trend of things (at least, in the politically orthodox way) imagined that conflict's end as auguring the arrival of a more orderly, peaceful--and prosperous--world, with many a parallel quite striking. On both occasions the U.S. had emerged from the conflict as victor, hyperpowered arbiter of the world's fate, and, in the "American way," the pointer to everyone else's future--amid a financial boom, euphoria over a supposedly epochal revolution in productivity and consumerism bound up with new technology, and immense self-satisfaction about freer, "liberated" lifestyles. All the while the optimists ignored anything that gave the lie to their illusions, dismissing the financial and international crises as mere bumps on the road, quite manageable by deified Overseers of the Global Economy, and the stirrings of radicalism at home and abroad (the country certainly had its "status politics," its "culture wars") as a much-ado-about-nothing on the wrong side of the end of history.

Considering it, Barnett--who, it must be noted again, was fashionable because he was conventional--was on the whole optimistic that the challenges could remain so manageable in the long term. (Hence, that "Blueprint for Action.") Those taking a less sanguine view of these developments thought otherwise, and they have since proven correct: just as the illusions of the '20s died, so did those of the '90s.

When we look back at the '20s the tendency is to think of them as having come to an end in 1929, with "the Great Crash." Of course, the end of the mood we associate with the decade was not so tidy. But the illusions of the '90s do seem to have been a longer time dying than those of the '20s, dying only a bit at a time, with the way things played out enabling the denials to last longer. One may recall, for example, the rush to declare the financial crisis that broke out in 2007-2008 over--such that even an Adam Tooze, when buckling down to study the event properly, was himself surprised to conclude in a book published a decade later that it had never gone away. As it still has not, simply merging with other crises, like the COVID-19 pandemic and its own attached economic crisis (also not behind us, even if some pretend it is), into something bigger and worse and scarier. (How big and bad and scary? Well, according to one calculation, even before the pandemic economic growth rates had either virtually flatlined or turned significantly negative for most of the planet--which makes all the backlash against neoliberalism--the votes for Trump, Britain's exit from the EU, and all the rest--that much less surprising.) Meanwhile, if any doubts had remained after a decade of intensifying and increasingly militarized conflict among the great powers, the war in Ukraine has made it very, very clear that open, large-scale, sustained interstate warfare between large nation-states in the middle of Europe, and escalating confrontation between NATO and Russia, are a significant and worsening part of our present reality.

Looking at the news I do not get the impression that very many have properly processed the fact yet. But the neo-'20s mood that characterized the '90s, and lingered in varying ways and to varying degrees long after the years on the calendar ceased to read 199-, seems ever more remote these days, any indication otherwise ever more superficial.

Thursday, May 5, 2022

Did Anyone Actually Read Paul Kennedy's The Rise and Fall of the Great Powers?

I have recently remarked on what makes for a nonfiction bestseller generally--which, of course, leaves little space for anything that could be called "history." We do see history reach a wider audience--but only within that demand the public makes for the affirmative and the entertaining. Thus it is what Michael Parenti called gentlemen's history--history by, of and for the comfortable, who are supposed to feel comfortable during and after reading it; history which is conservative and "patriotic" (in the sense of loyalty to those in power, rather than to their country's well-being) and, in line with all that, self-congratulatory (from the standpoint of the elite in question).

Meanwhile, in its tending to be Great Man-centered it tends toward the personal and the narrative--toward, indeed, being biography rather than history. (As A.J.P. Taylor remarked, the two genres are actually very different--in the former the individual is everything and society nothing; in the latter, the individual nothing and society everything.) It also tends, even while presenting its figures in a heroic light, toward the gossipy. (Taylor remarked, too, that a "glamorous sex life" was a prerequisite for a successful biography.)

As Jeremy Black demonstrates, all of this translates over to military history, which is dominated by the biography-memoir-operational account--by the Great Captain subgenre of the Great Man genre, in which such Captains are presented as the dominating figures of the Decisive Battles of History, the same battles over and over and over again (with Britain's portion of the Napoleonic Wars, the U.S. Civil War, and the portions of the two world wars those countries experienced pretty much it for the more popular market in Britain and the U.S.).

One may add that, even in comparison with much other history, it tends especially heavily to the conservative and patriotic--to the hero-worship of generals, nationalistic flag-waving and the rest.

All of this was much on my mind when considering the reception of Paul Kennedy's The Rise and Fall of the Great Powers. Certainly a work of history, and very reasonably readable as a work of military history, it stayed on the New York Times hardcover nonfiction bestseller list for 34 weeks--in spite of its being a very different book indeed. Far from offering personal narrative, in it Kennedy presents an academic thesis resting on a detailed examination of five hundred years of Western and world history, where the "characters" are not individuals but entire nations and empires, whose development and clashing, ascent and descent, are construed not as the deeds of so-called Great Men but as the working-out of the hard material facts of geography, technology, demographics, of industries and institutions. Of battles, campaigns and wars there are plenty, but little of tactics and strategy and even less of generalship, with what really mattered being the way resources, and the matching of resources to objects, told in the crunch.

Covering so much territory even in a seven hundred-page volume, of course, means that Kennedy treats any one bit in only so much detail (as is all the more evident if one compares it to, for example, his earlier, Britain-focused treatment of the same theme in The Rise and Fall of British Naval Mastery, which I recommend highly to anyone interested in the subject, by the way). Still, the quantitative data alone is, by the standard of popular works, immense, as testified by the inclusion of over fifty charts and tables, with the academic character of the work underlined by the 83 pages of notes and 38 pages of bibliography appended to the main text of over five hundred pages. Kennedy writes clearly and well, but it is an undeniably data-heavy, analytically oriented work, with no attempt to enliven the proceedings with what an editor might call "color."

And of course, it was anything but self-congratulatory in the sense discussed here.

Considering Kennedy's book I find myself also considering another major--and similarly unstereotypical--bestseller of 1988, Stephen Hawking's A Brief History of Time. Hawking's book was much shorter (256 pages to the 677 of Kennedy's), and while intellectual hierarchy-addicted morons such as those Hollywood writes for take it as a given that physics is the most demanding field of intellectual endeavor, the reality is that even by pop science standards it seemed to me "easy," while, I might add, Hawking's tone was sprightly. He clearly meant to produce a book that a broad audience could get something out of, and in my view did so. Kennedy most certainly did not. The result is that, if Hawking's book is, as I have seen it called, the most widely-selling unread book in history, I would imagine that very few bothered to read Kennedy's book all the way through--an opinion that Kennedy himself seems to share. He has publicly remarked--joked?--that he didn't "think many people read more than the final chapter on the US and the USSR"--and I would imagine that many more simply knew the alleged contents of the chapter secondhand.

Wednesday, May 4, 2022

Emmanuel Todd, China and the Graying of the World

In a recent interview sociologist and demographer Emmanuel Todd, discussing the matter of China's rise, argued that the country's far-below-replacement fertility rate (which Todd says is 1.3 children per woman) makes unlikely the visions of the country as hegemon, for the simple reason that its labor force is bound to contract sharply, with massive implications for its already slowing economy, and its national power.

Considering this I find myself thinking of three counterarguments:

1. The 1.3 Total Fertility Rate (TFR) for China was registered in the wake of the pandemic, with its associated economic and other stresses. Before that it was up at about 1.7--a significant difference, such that rebound is hardly out of the question.

2. Even if one takes the 1.3 TFR as a "new normal" for China, the decline in question is not only evident across its neighborhood but actually more advanced in many neighboring countries. (Japan's TFR was scarcely above that before the pandemic, just 1.36, while South Korea's slipped below 1 in 2018 and was at 0.92 in 2019, according to World Bank figures.)

3. Even if the drop were to go further in China than elsewhere, China, with a population of 1.4 billion, would remain, even after a much more drastic demographic contraction than in neighboring states (a scenario hardly in the cards), a colossus relative to the other states (Japan today having scarcely an eleventh of China's population, South Korea one-twenty-eighth).
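For what it is worth, the ratios in point 3 are easy to check. The sketch below uses approximate 2021 World Bank population figures (the specific numbers are my assumption, not taken from the post); with them the ratios come out to roughly eleven and twenty-seven, in line with the post's rounding.

```python
# Rough check of the population ratios cited in point 3, using assumed
# approximate 2021 World Bank population figures.
china = 1_412_000_000
japan = 125_700_000
south_korea = 51_700_000

print(round(china / japan, 1))        # → 11.2 (Japan: roughly an eleventh)
print(round(china / south_korea, 1))  # → 27.3 (South Korea: roughly a twenty-seventh)
```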

Still, China's contraction is coming at a point at which it is rather poorer than neighbors like Japan and South Korea (with a per capita Gross Domestic Product of $10,000 a year, versus $40,000 for Japan, $30,000 for South Korea)--and already seeing its economic growth slow sharply (those legendary 10 percent a year rates a thing of the past, with the 2012-2019 average more like 7 percent and still falling). The result is that the demands of an aging population could weigh that much more heavily on its resources.

All the same, how much the fact will matter ultimately depends on how societies handle the matter of their aging populations. One can picture a scenario in which modern medicine succeeds in alleviating the debilitating effects of getting older, permitting older persons to need less care. One can also picture a scenario in which rising economic productivity more than makes up for the decline of the labor supply and the rise in the dependency ratio (perhaps by lowering the cost of living). In either case, or one combining the benefits of both, the demographic transition may turn out to be managed easily enough, in China and elsewhere. Yet one can picture less happy scenarios as well--and I am sorry to say, rather easily in light of the disappointments of recent decades on all these scores. But even in that eventuality I would not be too quick to envision the melodramatic collapse scenarios making the rounds of the headlines yet again in recent months.

Tuesday, May 3, 2022

What Ever Became of the Information Age?

You may remember having heard the term "information age"--but it is entirely possible you have only a vague notion of what it meant. This may be because it has been a long time since you heard it last, but also because the term is slippery, having many usages.

Like the terms "atomic age," "jet age" and "space age," "information age" can mean an era in which a revolutionary technology has arrived on the scene--and while "information technology" is not really new (writing, and even spoken language, are describable as "information technologies"), there is no question that the electronic computer and its associated communications systems, in their various forms, represented something different from what came before. And indeed the information age came to pass in this sense.

Like the term "Industrial Age," "Information Age" can also denote a shift in the fundamental conditions of work and consumption. The industrial age saw the decline of the rural, agrarian, peasant way of life as the norm, as a revolutionary, inanimate energy-powered, machine-based form of mass manufacturing became the predominant condition of our existence (employing a quarter of the American labor force at mid-century, while overwhelmingly accounting for the rise in material output and living standards). Likewise the information age held out the prospect of a great increase in the work effort devoted to, in one way or another, producing, processing and communicating information--as the volume of information being produced, processed and communicated exploded. And this, too, came to pass.

However, the term had other meanings. Of these the most exciting--because it was the one that could really, really make it matter in a way that would merit speaking of A New Age--was the idea that information, which has always been substitutable for other economic inputs like land and capital and labor, and substituted for them (this was how the Industrial Age happened, after all, technical know-how enabling the exploitation of new energy sources and the building of machines that eventually substituted massively for labor), would become so radically substitutable for everything else that we would altogether transcend the smokestack, raw material-processing, secondary sector-centered Industrial Age. Thus, if the supply of some good ran short, information-age INNOVATION! would promptly turn scarcity into abundance, with what was promised for nanotechnology exemplary (the radical new materials like carbon nanotubes that would be stronger and lighter and better than so much else, the molecular-scale assemblers that, working atom by atom, would waste not and leave us wanting not). Increasingly suspending the bad old laws of "the dismal science," this would explode growth, even as it liberated growth from reliance on natural resources and the "limits to growth" they imposed, solving the problems of both material scarcity and our impact on the natural environment--socially uplifting and ecological at once. Indeed, thinkers came to speak of literally everything in terms of "information," of our living in a world not of matter and energy but of information that we could manipulate as we do lines of computer code if only we knew how--as they were confident we soon would, down to our own minds and bodies (most notoriously in the mind uploading visions of Ray Kurzweil and other Singularitarians).

In the process the word "information" itself came to seem fetishistic, magical, not only in the ruminations of so-called pundits mouthing the fashionable notions of the time, but at the level of popular culture--such that in an episode of Seinfeld Jerry's neighbor, the postal worker Newman, wanting to remind Jerry that he was a man not to be trifled with, told him in a rather menacing tone that "when you control the mail, you control information."

The line (which has become an Internet meme) seemed exceedingly contemporary to me at the time--and since, as distinctly '90s as any line can get, precisely because, as I should hope is obvious to you, the information age in this grander sense never came to pass. Far from our seeing magical feats of productivity-raising, abundance-creating INNOVATION!, productivity growth collapsed--proving a fraction of what it had been in the heyday of the "Old Economy" at which the lionizers of the information age sneered. Meanwhile we were painfully reminded time and again that at our actually existing technological level economic growth remains a slave to the availability and throughput of natural resources, with the cheap commodities of the '90s giving way to exploding commodity prices in the '00s that precipitated a riot-causing food-and-fuel crisis all over the world. If it is indeed the case that the world is all "just information," then to go by where we are in 2022 (in which year we face another painful reminder of our reliance on natural resources as the war in Ukraine precipitates yet another food-and-fuel crisis), the day when we can manipulate matter like microcode remains far off.

Unsurprisingly the buzzwords of more recent years have been more modest. The term one is more likely to hear now is the "Fourth Industrial Revolution"--the expectation that the advances in automation widely projected will be as transformative as the actually existing information age may plausibly be said to have been--but not some transcendent leap beyond material reality.

I do not know for a fact that a Fourth Industrial Revolution is really at hand--but I do know that, being a rather less radical vision than those nano-assembler-based notions of the '90s, the thought that it may be so bespeaks how even our techno-hype has fallen into line with an era of lowered expectations.
