During the stock market boom of the late twentieth century the capitalization of the U.S. stock market grew more than 150 percent between 1994 and 1999 alone--growth wildly disproportionate to that of the underlying economy, even in that period of historically brisk expansion. (Where the stock market capitalization-to-GDP ratio stood at a mere 71 percent in 1994, it was over 153 percent in 1999--more than twice as high--even as U.S. Gross Domestic Product grew by better than a fifth after adjustment for inflation.)
It was the conventional wisdom among at least a significant section of financial "experts" that this was some wonderful new normal, with the surge in asset values somehow set to continue for a good long time to come--these experts apparently trying to outbid each other for the public's attention with ever-higher predictions of where the Dow Jones average would go within the next several years. (Dow Jones 36,000! Dow 40,000! Even Dow 100,000!)
Unless one assumes the market had somehow been released from the laws of economic gravity, this was a big bet on just how well the "real" economy was going to do in those years of euphoria over computing, the Internet and related technologies, and the possibilities some claimed to see in them. One sees how much so when one thinks about what it would have meant for the economy to live up to the investor expectations implied in those stock prices--if, for example, the stock market's capitalization had grown at that rate for the next two decades, and the real economy had fallen no further behind the growth of the stock market's capitalization than it was in 1999.
That would have meant a roughly 20 percent a year real economic growth rate for the next two decades, and a nearly forty-fold expansion of the U.S. economy, producing a U.S. GDP of some $700 trillion in today's dollars. Alas, U.S. GDP in 2019 was, in today's terms, more like $24 trillion--a mere thirtieth that sum. And long before the disparity could grow so stark the bubble went bust, just a few months into 2000.
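For those who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python. The capitalization ratios and GDP growth figures are those cited above; the 1999 GDP level expressed in today's dollars is my own rough assumption for illustration.

```python
# Back-of-the-envelope check of the boom-era arithmetic (figures from the text,
# except the 1999 GDP level in today's dollars, which is a rough assumption).

cap_to_gdp_1994 = 0.71        # market capitalization as a share of GDP, 1994
cap_to_gdp_1999 = 1.53        # the same ratio in 1999
real_gdp_growth = 1.20        # real GDP up by "better than a fifth," 1994-1999

# Implied growth in market capitalization over those five years:
cap_multiple = (cap_to_gdp_1999 / cap_to_gdp_1994) * real_gdp_growth
print(f"Market cap multiple, 1994-1999: {cap_multiple:.2f}x")   # ~2.6x, i.e. >150% growth

# Annualized, that is roughly 20 percent a year:
annual_rate = cap_multiple ** (1 / 5) - 1
print(f"Annualized rate: {annual_rate:.1%}")                    # ~21%

# Two decades of 20 percent a year growth, with GDP keeping pace:
expansion = 1.20 ** 20
print(f"Expansion over 20 years: {expansion:.0f}-fold")         # ~38-fold

gdp_1999_todays_dollars = 17e12   # ASSUMPTION: ~$9.6T in 1999, very roughly $17T today
implied_gdp = gdp_1999_todays_dollars * expansion
print(f"Implied GDP: ${implied_gdp / 1e12:,.0f} trillion")      # on the order of the ~$700T cited above
```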
Looking back it is all but impossible to picture what those decades of 20 percent a year growth would have looked like--what it would have meant for the country to somehow end up thirty times richer than it is today. In fact, it does not seem an exaggeration to characterize the situation as one of financial and economic singularity--which brings to mind that other Singularity that Ray Kurzweil said so much about in 1999. Something like that technological Singularity would seem the only way in which such a financial boom could have proven a winning bet--such that we can speak of Wall Street's behavior giving the impression that Kurzweil's Singularity really was imminent.
Tuesday, June 7, 2022
What Role Might Superconductors Play in the Energy Transition?
Superconductivity has been in the news quite a bit these past couple of years, in large part because of a major breakthrough in 2020--namely the observation of room-temperature superconductivity for the first time in history. Of course, this occurrence was in a lab, under extremely specific and difficult circumstances (with the material put under pressure equal to over two thousand times the pressure at the bottom of the Mariana Trench). Still, even if usable only in very special circumstances, the fact remains that room-temperature superconductivity is a proven physical reality, and a great many are watching the progress in this field toward superconducting materials that can work in everyday conditions with interest.
A major reason has been the pursuit of a more efficient electric grid. Of particular importance is the current density superconducting materials can carry relative to those presently in use. As a result generators using superconducting coils produce larger and stronger magnetic fields, extracting more power from a given amount of current--with one result that lighter, more compact generators can deliver the same power as heavier, larger units. When made of a superconducting material, wires of a given width transmit up to five times as much electricity as their copper equivalents, and do so with far less loss over long distances. And the storage of electricity in systems using superconducting materials likewise diminishes the problem of losses, yielding additional efficiencies.
All of this can permit a more efficient exploitation of any energy source, but seems especially helpful in compensating for the intermittency of renewables that has, thus far, slowed the improvement of their cost advantage over fossil fuels and nuclear. Practical experiments have already demonstrated the possibility of squeezing more power out of windmills of a given size when they are equipped with superconducting magnets. Superconducting materials' potential for lowering the cost of long-distance power transmission enables them to better connect sun- and wind-rich areas with others where demand may outstrip what is reliably available at hand, or simply to provide a convenient back-up if demand goes up or local power generation goes down. (Renewables-bashers love to sneer that the sun doesn't always shine and the wind doesn't always blow, but at any given time the sun is probably shining and the wind blowing somewhere, and superconductivity goes a long way to making transmission across those distances cost-effective.) Meanwhile, in contrast with fossil fuel-based power generation, renewables in particular would benefit from superconductors' usefulness in storing electricity itself. (Indeed, superconductor-equipped storage is already being used on a small scale to even out grid fluctuations--while an argument has been made for the plausibility of equipping windmills and photovoltaic banks with their own superconducting storage units.)
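To give a rough sense of the transmission argument, here is a minimal sketch in Python comparing resistive line losses on a conventional line against a superconducting link. All the numbers are illustrative assumptions (line length, voltage, resistance per kilometer, and the flat cryogenic overhead), not engineering data.

```python
# Illustrative comparison of long-distance transmission losses (assumed figures).

power_delivered = 1e9        # 1 GW carried by the line
line_voltage = 500e3         # 500 kV (high voltage keeps current, and I^2*R loss, down)
current = power_delivered / line_voltage          # ~2,000 A

# Conventional line: assume ~0.01 ohm per km for a heavy conductor, over 1,000 km.
resistance = 0.01 * 1000                          # 10 ohms total
resistive_loss = current**2 * resistance          # I^2 * R
print(f"Conventional line loss: {resistive_loss / 1e6:.0f} MW "
      f"({resistive_loss / power_delivered:.1%} of the power carried)")  # ~40 MW, ~4%

# Superconducting line: zero resistance, so the real cost is refrigeration.
# ASSUMPTION: cryogenic overhead equal to a flat 1 percent of the power carried.
cryo_overhead = 0.01 * power_delivered
print(f"Superconducting line overhead (assumed): {cryo_overhead / 1e6:.0f} MW (1.0%)")
```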
Altogether such possibilities mean that, even if superconductors get much less attention than other technologies, progress in this area may yet play an important role in the energy transition—and warrant that much more interest on the part of observers looking to make it work, especially if they have the long run in mind.
Monday, June 6, 2022
What Ever Happened to Superconductors?
Cold fusion and fifth-generation computers were among those technologies that in the 1980s were supposed to be on the verge of changing everything--but over three decades on have amounted to pretty much nothing.
In the same years one also heard a great deal about superconductors, specifically materials which, under appropriate conditions, cease to resist the passage of electrical current, so that it can flow absolutely without loss--becoming, as the name indicates, super conductors. That implies the possibility of enormous efficiencies in a very great deal of what we do with electricity--which can seem just about everything, with the list getting longer all the time.
In considering the publicity afforded the concept in the 1980s one should note that the concept was not new even then. The phenomenon of superconductivity was first observed way, way back in 1911. However, prior to the '80s the known superconductors only worked at extremely low, near-absolute zero temperatures--which meant that they required enormous amounts of energy for refrigeration (especially with electricity passing through them and heating them up). This, of course, left them with little practical use--while achieving better than that was thought not only an engineering difficulty but a theoretical impossibility. What made superconductors seem newly relevant was the discovery of a ceramic (lanthanum barium copper oxide) that could work as a superconductor at relatively high temperature, soon followed by related ceramics working at higher temperatures still. (I stress relatively, because the '80s-era discoveries meant superconductors operating at up to about 90 Kelvin--roughly three hundred degrees below zero for those of us using the Fahrenheit scale.)
That may not seem very promising, but it did arouse expectations about the rate of progress in the field (there were fantasies that "superconductor supremacy" was going to very soon mean world economic supremacy)--which soon proved rather exaggerated. Still, the research effort continued, and happily, so did progress, with new materials achieving superconductivity at higher and higher temperatures until, two years ago, physicists actually achieved it at "room temperature" (in fact, at 58 degrees Fahrenheit, the average temperature in Bergen, Norway, in July and August)--the result that garnered such attention back in 2020.
What has been less prominent in the coverage aimed at a non-specialist audience has been the specific circumstances of that achievement. The superconductor in question (a mix of hydrogen, carbon and sulfur) worked because it was under a pressure of 270 gigapascals--a figure more often mentioned than explained. Those unfamiliar with that unit of measurement should know that it is equivalent to well over 2.6 million times sea level atmospheric pressure, or the pressure under about 16,000 miles of water--which is to say, more than two thousand times the submarine hull-squashing pressure at the bottom of the Mariana Trench.
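For anyone who wants to verify those conversions, a few lines of Python suffice (the seawater density and Mariana Trench figures used here are round reference approximations):

```python
# Converting 270 gigapascals into more familiar terms.

pressure = 270e9                 # 270 GPa, in pascals
atmosphere = 101_325             # one standard atmosphere, in pascals
print(f"{pressure / atmosphere / 1e6:.2f} million atmospheres")      # ~2.66 million

# Depth of seawater exerting that pressure (density ~1025 kg/m^3, g ~9.81 m/s^2):
depth_m = pressure / (1025 * 9.81)
print(f"~{depth_m / 1609:,.0f} miles of water")                      # ~16,700 miles

# The bottom of the Mariana Trench sits at roughly 110 megapascals:
print(f"~{pressure / 110e6:,.0f} times Mariana Trench pressure")     # ~2,450x
```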
As this shows, researchers in the field have traded one set of extreme conditions (cold) for another (pressure)--so much so that those who imagined from the press reports that commercially useful room-temperature superconductors were imminent may find this a damp squib, as is so often the case when one looks more closely at pop science stories that make us think a technology at Technology Readiness Level 1 is already up at Level 9. But all the same, it is undeniably a breakthrough, proving that room-temperature superconductivity is, at least, possible, and perhaps yielding insights into how it might be achieved in less extreme conditions--while, for what it is worth, work has begun on making those superconductors work at lower pressures.
Moreover, it would be a mistake to think that this means superconductors have amounted to as little as those other technologies previously mentioned have done to date. If without much fanfare, superconductors have already entered a wide variety of practical, everyday uses, with the most significant, perhaps, being Magnetic Resonance Imaging (MRI) machines: seventy percent of those installed worldwide use superconducting magnets to enable more rapid and comprehensive scanning of the patient. And in that we have a reminder of something else, namely that even short of room-temperature superconductivity the technology is being put to practical use, with another breakthrough previously thought an impossibility--a superconductor through which electricity flows in only one direction--opening the door to the use of the technology in computing, potentially producing microprocessors hundreds of times faster than those operating today. Of course, the refrigeration requirements make our seeing this in consumer devices anytime soon implausible--but the head of the research team which made the breakthrough has himself argued for its possible applicability to server farms and supercomputers. If true, this could well prove revolutionary enough in itself.
Are Those "Spreading Awareness" About Climate Change Aware of What Kind of "Awareness" They Are Spreading?
While thinking about the problem of climate change in recent years I have found myself increasingly concerned with the consequences of so many commentators relentlessly promulgating the bleakest possible view of the situation. These think, or at least give the impression of thinking, that they are "promoting awareness" and somehow contributing to resolving the problem. In fact many, maybe most, are simply promoting defeatism and despair.
Why do they do what they do?
I suspect that they don't understand, or don't want to understand, how politics really works--how and why things do and do not get done. Unable to give the public reasons for hope, they put all their energy into exercising the other option for moving it: fear. In spite of the ample evidence that the public already knows all about the problem, and has long been anxious for something to be done about it--so anxious that it is literally sick over it--and time and again elects politicians who promise to do something about it (even if those "leaders" break every promise), these commentators tell themselves that there must not be enough fear out there, and keep doing the same thing over and over again expecting a positive result. Encouraging them in this terribly problematic course is the evident, enormous self-satisfaction persons of weak and unserious mind derive from inflicting disaster porn-riddled jeremiads on the public.
Naturally they never think of the possibility that they have exhausted the usefulness of fear, perhaps a very long time ago, and that continuing to use fear, at least in the manner they have been doing, has become counterproductive; that past a certain point fear can simply make people shut down rather than act; that rather than screaming alarums they now face the more difficult yet totally indispensable task of explaining frankly and seriously why society has so miserably failed to meet the problem, of thinking seriously and frankly about how it can stop failing, and of lending their voices to whatever proposals might redress the issue; and that if they are not up to the task (as persons of such caliber generally are not) they are only getting in the way of those who might be.
Friday, June 3, 2022
Centrism: A Primer
We hear the word "centrist" tossed about a lot--but little about what it really means.
If you want a fuller explanation, supporting everything said here in great detail, to the point of having twenty-five pages of single-spaced endnotes attached, you can go here.
If you want the short version, just keep reading.
Simply put, centrism--certainly in the sense in which we use the term in the U.S.--isn't just middle-of-the-roadness, even if it overlaps with middle-of-the-roadness much of the time, or at least seems to do so. Rather this outlook can more usefully be characterized as classical conservatism updated for a society where liberal institutions have replaced those of the Old Regime. In line with that conservatism centrists take a dark view of human nature, and are pessimistic about the ability of human beings to rationally understand, direct, "engineer" society and its course. They are especially doubtful about the wisdom and goodness of the "common" man or woman--their ability to understand the issues, and to act rationally when they enter onto the political stage. This leaves them comparatively fearful of and hostile to societal change, especially when that change comes "from below." Instead they favor leadership by an elite able to use its trained judgment, for which they regard no substitute as existing.
However, the twentieth century is also not the eighteenth. As stated previously the feudal-agrarian world of the classical conservative has given way to a capitalist and democratic society, which is the form of life they are stuck with, and stuck with defending. All this being the case, if no lovers of 1789, it is 1917 that haunts them, and against which they define themselves. Thus they accept the fact of a democracy with universal suffrage and liberal rights like freedom of speech--but believe that democracy can only safely operate on very specific lines, keeping its politics "civil" and "pluralist."
What does this mean? It means that people check "ideology"--structured views of what the world is like, how it works, how to operate in it--at the door when they enter into the political arena. They do not raise the matter of how society is structured, who has advantage and who does not, what is right or wrong (much of which they regard as beside the point because of the uncertainties of social life in light of their epistemological doubts, and because they hold that in a liberal society power is so diffuse among voters and consumers that no one really has power over anyone else, for example, corporations against workers or consumers). Instead the practitioner of a centrist politics thinks of society as a plurality of interests, which they assume to all be equally legitimate so long as they abide by those rules in regard to ideology. These interests, within this arena, compete for support and negotiate among themselves in a process advised and guided by experts regarded as objectively treating of value-free facts, for the sake of preventing societal conflicts from escalating to a society-destabilizing degree--or, put more positively, the maintenance of "consensus."
Of course, all that said centrism has tended to embrace particular positions over time. In the mid-twentieth century centrism was for a defensive, containment-oriented anti-Communism in foreign policy, for the New Deal at home (if not necessarily enthusiastic about extending it), for the civil rights movement (in its moderate form), as against a right represented by figures like Barry Goldwater which took a still harder line against Communism (not containment, but "rollback"), sought a return to the nineteenth century with respect to government involvement in the economy, and opposed the civil rights movement as an infringement on the rights of lower levels of government (and not necessarily just on those grounds). Later in the century the ascent of the right (identifiable with Ronald Reagan, who succeeded where Goldwater failed) and other factors (the end of the Cold War, globalization, etc.) saw centrism move a long way to the right on key issues, becoming more like the neoconservative right in its foreign policy, and trading in the New Deal for neoliberalism. Its record in regard to the country's cultural conflicts seems a different thing. Still, it shifted away from a leftishly universalist civil rights movement in favor of a very different identity politics (which the right and many others characterize as "left" but which is readable as very much of the right in its premises, more Maistre than Martin Luther King).
Looking back it seems to me that this version of the center had its heyday in the '90s, when Bill Clinton's administration solidly established the Democratic Party's identification with it in office, governing as it did along these lines, while for the time being the prospect of great power conflict appeared on the wane in a world where Lexuses mattered more than olive trees, and it seemed to many (whether viewing the fact positively or not) that "political correctness" was inexorably in the ascendant. Since then this political vision has faced far more challenge, exemplified by the country's polarization through the twenty-first century--by the contested election of 2000, by the Iraq War and the general expansion of U.S. military involvement in the Middle East, by a succession of economic crises (the tech boom's going bust in 2000, the inflationary energy crisis of the '00s, the Great Recession), by the more recent pandemic, by the escalation of culture war and identity politics, and so much else. In the face of it all thus far the center has generally stuck to its turn-of-the-century positions (partied like it's 1999, so to speak), with the Democratic Party's leadership and officials doing so even as their electoral base has shifted leftward, but it may well be that in the face of the multitude of conflicting pressures centrism will adapt yet again.
Thursday, June 2, 2022
A Note on What Ideology Means
When we speak seriously of ideology--of liberalism, conservatism and so forth--we are speaking of a philosophy which addresses fundamental questions about the human condition, and on the basis of the answers it offers to those questions, the problems of economic, political, cultural and social life. Arguably the most important of those questions are:
1. What can we know about the world, and especially the human, social world?
2. What are human beings like--individually and collectively, in society?
3. Given what we know about human beings, what should we consider to be society's goals?
4. If we think that society should be something other than what it is, can we change it for the better? Would the potential gain outweigh the risks?
5. How far can we rely on reason in changing our social arrangements--our economic system, our political system, our culture--for the better?
Conservatism, liberalism, and the rest, all have very specific answers to these questions, which determine their address of specific political questions. For now let us stick with conservatism and liberalism, in the classical sense of each of those terms, which retain some usefulness from this vantage point (even as much else may have changed).
Conservatism takes a dark view of human nature (think Thomas Hobbes), and is pessimistic about the applicability of reason to society. This leaves conservatives more concerned with keeping human badness in check than with, for example, achieving a society affording its members greater freedom, justice or equality, which generally seem to them unrealistic aspirations in the circumstances. Thus they think that the prospects of change for the better are very dim, while tending to regard the social arrangements that have emerged over time, "organically" in response to specific situations--what is often called "tradition," and where following tradition in doctrinaire fashion does not settle the matter, judgments by an elite respectful of tradition based on its own personal, practical experience--as likely to be superior to any human "plan." (As the foundational Joseph de Maistre argued in his Considerations on France, a person can grow a tree, but they cannot "make" a tree--and so it is with a society in the conservative's view.)
Liberalism takes a different view of these matters, seeing human nature as a broader, more flexible thing than conservatives give it credit for: what conservatives might conceive as a timeless, unchanging, unchangeable (and nasty) human nature, liberals regard as substantially formed by circumstances. (The liberal John Locke characterized the human mind as a blank slate at birth in his Essay Concerning Human Understanding.) They also have a higher opinion of the capacities of human reason--and therefore see room for better, much better, than we have been doing up to this point, and with that, much more scope existing for a freer, fairer world than history has known. Indeed, they may regard the exercise of reason for the sake of creating a better set of social arrangements as not merely desirable and possible, but obligatory, given that their starting point for thought about society is an individual they regard as having inalienable rights, not least to freedom. They may even regard such change as a practical necessity, for their reason tells them time and again that the world changes, and the "old ways" often fail to meet the new demands it throws up. (Consider, for instance, the interrelated matters of nationalism, militarism, war. The conservative does not see such things going away any time soon, but the liberal points to them as having ceased to be tolerable in an age of globalization, and of nuclear weapons.)
Of course, confronted with this tiny, tiny bit of philosophy 101 many snap "People don't use the word like that!" And certainly most people don't--in part because there has been some awkward shuffling of labels (conservatives having been forced to reckon with liberalism, liberalism having bifurcated into more conservative and more radical versions, etc., etc.) creating a fair amount of superficial confusion. However, more important than any such confusion is the fact that so few thought about the matter long enough to be confused by it; that very few of those who identify as "conservative," "liberal," or anything else have ever considered the questions discussed here at all, let alone in any great depth. All the same, there seem to me to be two rejoinders to their dismissive attitude:
1. Their using the terms in a shallow, unthinking, politically illiterate way does not make those who use the terms in the ways long established in political philosophy and political science somehow incorrect. (To suggest otherwise is more than saying "My ignorance is as good as your knowledge." It is saying "My ignorance is better than your knowledge.") If anything, there is a far better case for the matter being the other way.
2. The more casual, ill-informed usages often turn out to be more consistent with the deeper ones than people generally realize. While people reduce labels like "conservative" or "liberal" to responses to particular hot-button issues about which they may be speaking emotionally rather than intellectually, the conservative or liberal position on such issues tends to reflect the deeper philosophical assumptions just discussed here (even where the person in question never considered the issue on that level). Thus does it go with such a matter as gender (e.g. gender roles, gender identity, reproductive rights, sexual freedom), with the conservative inclining to the traditional practice, the liberal or radical seeing more scope and reason for change, on the basis of what they rationally judge to be fair and right, and in line with the demands of human rights, including freedom.
The result is that the vulgarian snapping "People don't use the word like that!" has probably done so plenty of times without even knowing it.
Wednesday, June 1, 2022
Has the Aircraft Carrier Had its Day?
Just this month I was surprised to see a piece in Vanity Fair titled "'Floating Pointlessness': Is This the End of the Age of the Aircraft Carrier?"
The question has been asked again and again for decades--indeed, since the end of the Second World War when the atom bomb and guided missiles cast doubt on the value of nearly every kind of weapons platform then in existence.
Still, the matter is rarely raised in such a forum as Vanity Fair--and that it is being raised there now reflects its increasing salience, for three reasons:
1. The extent to which states that had, due to economic constraints and the post-Cold War mood, limited their investment in long-range power projection systems are now pouring money into them. (Thus Japan has acquired four helicopter carriers, and set about converting two of them into F-35-carrying attack carriers. Meanwhile Germany seems to be taking an interest.)
2. The resurgence of great power conflict--which means that rather than carriers simply being used in environments and actions where the opposition had little capability to threaten them (as with every war in which the U.S. has used such carriers since 1945) there is a rising prospect of violent clashes between major navies which may possess significant means for threatening or even neutralizing carriers.
3. The advent of new anti-shipping weapons that may significantly increase the risk to carriers. These include land- and air-based Anti-Ship Ballistic Missiles and hypersonic cruise missiles, against which such ships may be without effective defense (with Russia and China at the forefront of the development). They also include quieter conventional submarines that may be quite able to slip past the most robust anti-submarine protection.
All of this, of course, would seem to be coming to a head in the sharp escalation of conflict between Russia and the West in the wake of open, full-blown interstate war between Russia and Ukraine--a war that may have no precise parallel since 1945 (even when one includes the break-up of Yugoslavia, which so much of the media seems to have totally forgotten about). This is all the more the case for two incidents during that conflict:
1. The first combat use of hypersonic missiles (even if it has been solely against land targets, so far, with questions raised about the weapons' accuracy).
2. The incident foregrounded in the Vanity Fair article, namely the sinking of the Russian cruiser Moskva--the single biggest warship loss since at least the 1982 Falklands conflict, and if one excludes the sinking of the Argentine cruiser General Belgrano in that conflict, the biggest since 1945. This is particularly significant because, at least to go by the version of the incident standard in the American press, the Ukrainian navy accomplished the feat with a pair of Neptune anti-ship missiles--relatively short-range, subsonic missiles far less threatening than any working hypersonic weapon--which nonetheless took out a large ship bristling with anti-air sensors and weapons (including ten surface-to-air missile launchers firing navalized versions of the SA-8 and SA-10 SAMs, and a half dozen close-in weapons systems). The result is that even in the absence of cutting-edge ASBMs, hypersonic cruise missiles and the like, large vessels already seem to be more vulnerable than has generally been acknowledged.
Considering the matter I find myself referring back to George Friedman's discussion of the issue in The Future of War, where he considered the viability of carriers and other such systems in the age of the guided missile. He wrote of them as senile rather than obsolescent--which is to say that these systems, ever more endangered, require ever more protection and so yield, in striking power and other ways, less return on investment, as seen when one considers the extent to which carrier air wings are devoted to fending off threats to the carrier rather than attacking, and the necessity of large numbers of very heavily equipped, sophisticated escorts to make carriers survivable in a hostile environment. In 2022 the carrier would seem to be that much further along that trajectory--albeit without any real substitute available. (As yet ship-launched cruise missiles, at least when conventionally armed, still fall far short of the striking power of a supercarrier's air wing.) The result is a reminder of just how much more unbelievably costly and dangerous modern war keeps on getting--so much so that Ivan Bloch, even more than Friedman, once more seems the better guide to where we have found ourselves in this regard.
Just don't expect anyone abiding by the conventional wisdom to take the lesson.
Thursday, May 19, 2022
American and British Political Discourse: A Note
Over the years I have had many an occasion to consider the differences in the political discourses of the U.S. and Britain. One of the most significant has been the greater attentiveness of British news-watchers to economic and socioeconomic matters than their American counterparts--on average they are more interested, better-informed, more willing and able to discuss them substantively.
The difference exists in spite of a great many similarities. Britain, too, has its "centrism," which in a prior day undertook reform, but from an essentially conservative position with essentially conservative ends--fearful and resentful of the left while conciliatory to the right, culminating in a rightward shift over time and the withering of discussion, debate and electoral options regarding these matters (witness the trajectory of the Labour Party). Britain, too, has its "culture wars," with both sides in them quite deliberately deploying the politics of identity to divert the public from more material issues. Britain, too, has its "pundits" telling the public that social class and class differences are nonexistent or irrelevant or minor, that any such thing as an Establishment is a leftist fantasy, that "neoliberalism" too is "not a thing." All of this reflects the reality that in Britain, too, the media ill serves the public where these matters are concerned (and just about every other matter, too).
Of course, one may also argue that if all this is the case in Britain it is less thoroughly so than in the U.S., that centrism has not had such a tight grip on the more leftward of its parties (whom would one call the American Aneurin Bevan?), that in Britain the culture wars have not quite gone so far in blocking other issues out of the public consciousness, that much that is conventional wisdom in America is not quite that in Britain, in part because for all its failings the country's media does offer somewhat more leeway for the sorts of marginalized views and voices attentive to such things. Still, as the trajectory of British discourse and policy in recent years makes clear (and as I imagine British leftists in particular would hasten to point out) one should not make too much of these qualifications. Rather three things seem to me to have made the difference:
1. Britain's governance and public policy formation in these areas has so often been so much more centralized than in the U.S. Many an issue decided at the local or state level in America (local tax rates, much to do with education and health services, etc.)--which people in various parts of the country decide in different ways, from different starting points and with different emotional charges--is more completely a matter of national policy in Britain (with Thatcherism, in fact, strongly identified with centralization for the sake of implementing its program more fully). The result is that such issues are objects of national rather than local debate, with all the additional attention this brings them.
2. Britain's policy in the relevant areas has shifted much further left and much further right over the years than it did in the U.S. This is not to deny that in British life the "center" of the political spectrum may even now be regarded as lying leftward of the American center. However, the public sector, the welfare state and organized labor were all built up to far greater heights in Britain than was ever seriously considered in the U.S. Whatever their limitations in practice (and these should not be overlooked), Britain had free college and free-at-point-of-service health care, and the U.S. never came close to anything like that. Meanwhile, circa 1980 three of every five British workers were union members, as compared with just one in five American workers, while British unions were unencumbered by anything remotely like Taft-Hartley or the top-heavy bureaucracy of American unions, and were vastly more politically conscious and militant. Naturally the dismantling of all that, even if it still left Britain with more apparatus of this kind than the U.S. ever had, entailed a far bigger and more wrenching shift in economic and social life than what the U.S. has seen in the same period, with that much more controversy as a result: while Ronald Reagan has been a divisive figure in American history, in the American mainstream one simply does not see the kind of bitterness expressed toward him that one sees in regard to Thatcher in Britain. All of this, again, makes it harder to slight the topic.
3. In Britain the reality of social class has simply been too ostentatious for too long to ignore. Where in the U.S. the national mythology is overwhelmingly devoted to the idea of the self-made individual (to the point that children of extreme socioeconomic privilege constantly pass themselves off as such, with a lickspittle press always enabling them), a monarchy remains the center of British public life, the legislature's upper house remains a House of Lords, and even for the native-born, old-stock Briton accent is scarcely less a marker of class origin than it was when George Bernard Shaw wrote Pygmalion. Even the difference in the usage of the term "middle class" appears telling. The attempts by British leaders to make the kind of vague use of the term that Americans do--to portray their nation, rich and poor alike, as somehow a middle class nation--seem to have had rather less success than across the Pond. Rather, in British usage the term still has a more exclusive (and meaningful) sense, with middle classness having a higher floor--and in the view of most a distinct ceiling too, many scoffing at David Cameron's calling himself "middle class."
Together these three factors--that centralization of political life, the extremity of the distance policy moved along the political spectrum across the past century, and the greater difficulty of dismissing the role of class--make it far, far harder to avoid acknowledging the herd of thoroughly material elephants in the room that make so much difference in the large and the small of individual lives.
Wednesday, May 18, 2022
The Necessity of Political Language
It would seem reasonably uncontroversial that our ability to discuss a subject in a clear and rigorous way depends on our having a vocabulary adequate to the task--not least one furnishing us with a satisfactorily extensive lexicon of terms with precise, coherent meanings. However, many seem to see "political language" as serving no purpose whatsoever, and indeed as something to be attacked on sight.
Why is this the case? Let us first of all admit that a good deal of language is, from the outset, muddled at the level of conception, with thinkers coining words and phrases that ultimately fail to illuminate, or which they may even deliberately intend to obscure, to confuse, to manipulate. While we are at it, let us also get out of the way the reality that a good many terms which do not originate that way can end up being used that way. (Alan Sokal, you may remember, got pretty far stringing together favored buzzwords of people in certain categories of study in the humanities and social sciences in totally nonsensical fashion for the sake of exposing the game. A "hermeneutics of quantum gravity" indeed! It is the sort of phrase a stupid person might come up with when trying to imagine how the intelligent speak.)
Yet all this is very far from the whole issue, and to focus on it can be misleading. After all, these are problems not just with political language but with all language, despite which we generally manage to get along. Many play them up where politics is concerned for the most dubious of reasons. One is laziness, of course. Dismissing the value of a body of knowledge is easier than actually acquiring it before we make our judgments. ("We'll never need math anyway," says many a frustrated student. Or grammar. Or spelling. Or anything else for which there seems to be an app. They are likely to find out otherwise.) Even those pursuing a bit of intellectual distinction can take the lazy route, being all too familiar with the sorts of cheap maneuvers that impress the simple-minded--for example, dismissing existing categories of thought as if they had somehow transcended them, saying "Look at me and how much smarter I am than everyone else because I have seen through their bugbears! Look at me thinking outside the box!"--when in reality they have not done any thinking at all; when they have not even looked at the box or its contents, let alone managed to think outside them (and have likely not even realized that "thinking outside the box" has degenerated into yet another atrocious corporate cliché whose utterance strongly suggests the speaker is not doing what they presume to be doing).
However, other factors seem to me more consequential, not least the prevalence of a postmodernist intellectual orthodoxy that takes a very dim view of the capacities of human reason. This orthodoxy goes only so far in the realm of the physical sciences--one can't argue (very much) with the accomplishments of physicists, for example. But its adherents have more scope in the social realm, where one not only gets away with, but is lauded as displaying the greatest profundity for, doing what would be the physics equivalent of saying "All that matter and motion and energy stuff is probably incomprehensible, and looking for patterns in it or explanations for it is pointless and pernicious. Best not do it, and stick to as superficial a reading of things as we can."
This orthodoxy extends to a dim view of human language's descriptive powers, which has its adherents dismissing all language as "language games" and words--especially words with heavy political meanings--as therefore meaningless, often in a tone suggesting this is indisputably obvious to all. That even amid all the superficiality, misuse and abuse of terms, words like "conservative," "liberal," "socialist," "capitalist" (or "neoconservative," "neoliberal," "fascist," "centrist") can still denote real, meaningful and important concepts or sets of concepts, and that even the shallower, divergent or outright erroneous usages may reflect, evoke and reinforce those patterns in telling and important ways, is something they steadfastly refuse to acknowledge--not least because these cheap and shabby evasions serve their purpose, which is not to further a conversation but to undermine one they do not want to see happen at all.
Indeed, blanket attacks on political language are often a shabby cover for attacks on someone else's political language, for the sake of promoting one's own no less political nomenclature. I recall, for example, running into a recent discussion of centrism on Twitter. When the Tweeter in question offered a particular analysis of where centrism sits within the political spectrum, one of the first people to respond sneeringly dismissed the left-to-right political spectrum as meaningless and incoherent, and declared that people should think in terms of statism and anti-statism. What he suggested, of course, was narrowing the political dialogue to a single issue along a single axis (most people at least let us have two!), apparently divorced from any intellectual premises whatsoever, in an all too blatant attempt to shut down the conversation that Tweeter was trying to start and have one more congenial to himself. As the person in question was a professional renewable-energy basher with endorsements for his book from certain high-profile figures, his biases here were all too obvious. Simply put, he wanted to make it a matter of "libertarians" as the champions of "freedom" (he didn't define either term, of course, but anyone can guess what he had in mind) against everyone else.
Whatever one makes of his views, his unfortunate intervention (which smacked of the online discourse policing--the heckling--that I disdain regardless of who does it, and which can only seem at odds with any pretension to respect for freedom of speech) only underlined how we never get away from our reliance on a political vocabulary--and that rather than dismissing it in the lazy and pretentious way so many do, what we ought to do is work to make that vocabulary useful.
Tuesday, May 17, 2022
In the 2020s the 1920s Have Become the 1930s
I remember how, back about the turn of the century, Thomas P.M. Barnett emerged as a national security counterpart to Thomas Friedman--one who could be characterized as devoting himself to explaining just how, exactly, "McDonnell Douglas" would back up "McDonald's."
Barnett's conformity to the globalization-boosting conventional wisdom of the day made him the sort of fashionable Public Intellectual who got written up in places like Esquire (which magazine's "foreign-policy guru" he subsequently became).
I was rather less impressed than the folks at Esquire. Still, Barnett was astute enough to acknowledge that the whole thing could unravel, citing in his book Blueprint for Action Admiral William Flanagan's suggestion that "the 1990s might be a replay of the 1920s," raising "the question . . . What would it take for the 2000s to turn as sour as the 1930s?"
The analogy seems to me fair enough. Like the 1920s, the 1990s were a period following the end of a major international conflict that it was hoped would never be followed by another like it, with those optimistic about the trend of things (at least in the politically orthodox way) imagining that conflict's end as auguring the arrival of a more orderly, peaceful--and prosperous--world. Many a parallel is quite striking. On both occasions the U.S. had emerged from the conflict as victor, as hyperpowered arbiter of the world's fate, and, in the "American way," as the pointer to everyone else's future--amid a financial boom, euphoria over a supposedly epochal revolution in productivity and consumerism bound up with new technology, and immense self-satisfaction about freer, "liberated" lifestyles. All the while Americans ignored anything that gave the lie to their illusions, dismissing the financial and international crises as mere bumps in the road, quite manageable by the deified Overseers of the Global Economy, and the stirrings of radicalism at home and abroad (the country certainly had its "status politics," its "culture wars") as much ado about nothing on the wrong side of the end of history.
Considering it, Barnett--who, it must be noted again, was fashionable because he was conventional--was on the whole optimistic that the challenges could remain manageable in the long term. (Hence that Blueprint for Action.) Those taking a less sanguine view of these developments thought otherwise, and they have since proven correct: just as the illusions of the '20s died, so did those of the '90s.
When we look back at the '20s the tendency is to think of them as having come to an end in 1929, with the Great Crash. Of course, the end of the mood we associate with the decade was not really so tidy. But it does seem that the illusions of the '90s were a longer time dying than those of the '20s, dying only a bit at a time, with the way things played out enabling the denials to last longer. One may recall, for example, the rush to declare the financial crisis that broke out in 2007-2008 over--such that even an Adam Tooze, when buckling down to study the event properly, was surprised to conclude in a book published a decade later that it had never gone away. It still has not, simply merging with other crises, like the COVID-19 pandemic and its attached economic crisis (also not behind us, even if some pretend it is), into something bigger and worse and scarier. (How big and bad and scary? By one calculation, even before the pandemic economic growth rates had either virtually flatlined or turned significantly negative for most of the planet--which makes the backlash against neoliberalism--the votes for Trump, Britain's exit from the EU, and all the rest--that much less surprising.) Meanwhile, if any doubts remained after a decade of intensifying and increasingly militarized conflict among the great powers, the war in Ukraine has made it very, very clear that open, large-scale, sustained interstate warfare in the middle of Europe, and escalating confrontation between NATO and Russia, are a significant and worsening part of our present reality.
Looking at the news I do not get the impression that very many have properly processed the fact yet. But the neo-'20s mood that characterized the '90s, and lingered in varying ways and to varying degrees long after the years on the calendar ceased to read 199-, seems ever more remote these days, any indication otherwise ever more superficial.