When George and Meredith Friedman's The Future of War: Power, Technology and American World Dominance in the 21st Century appeared in 1996, the book, while certainly getting its fair share of attention, did so in spite of being distinctly unfashionable. After all, the authors' concern in the book was traditional, interstate, great power war at a moment when that form of conflict seemed not only remote but likely to become only more so over time--because of U.S. strength relative to any plausible opponent, and the expectation that globalization and other changes would soon make the nation-state and its traditional security agenda less relevant to the life of the world ("We no longer fight wars against countries, we fight them against individuals!" many a fashionable "analyst" enthused).
Of course, even at the time many were skeptical of this "conventional wisdom"--for various reasons. The Marxist left, for example, with its stress on capitalism's contradictions and its theories of imperialism (Luxemburg, Bukharin, Lenin), never bought into any of this. The Friedmans, however, came from the opposite end of the political spectrum--standing on a rather doctrinaire, "life never changes" insistence on the continuing validity of old-fashioned International Relations 101 "billiard ball"-model-of-the-international-system realpolitik. In doing so the Friedmans did not venture guesses as to who the belligerents would be in the armed conflicts this realpolitik-based vision of international relations anticipated as virtually certain (the U.S. apart), or the specific objects for which they could be expected to fight, or any of the other things which might be concluded from such premises, like when and where such wars could be expected to break out--perhaps chastened by a colossal recent error of their own. (Just a few years earlier, at the peak of American Japanophobia, George and Meredith had warned darkly of The Coming War With Japan.) Instead what the Friedmans emphasized was the long-term evolution of the means for fighting such wars--specifically the shift from massed explosive power and the land, sea and air-based platforms which deliver it (tanks, gun-armed vessels, planes) to precision-guided munitions, highly technologized "Starship Troopers"-style infantry, and eventually space platforms ("battlestars"), making senile and eventually obsolete the systems that dominated the twentieth century battlefield.
As yet we remain far from possessing the technologies of the Friedmans' vision. And certainly George's more specific political predictions have since been unimpressive. (In his book The Next 100 Years he forecast that China would collapse in the 2010s--and Russia not long after, and that without anything like the stress of the current war the country is fighting, which he also failed to predict, instead anticipating a Paris-Berlin-Moscow axis.) However, if there is much that he has clearly got wrong already, he seems to have been sounder than his more fashionable colleagues in realizing that the illusions of the '90s were only that, illusions--a fact whose fuller implications many are only now beginning to grasp.
Friday, June 17, 2022
Wednesday, June 15, 2022
Will the Cost of Living Ever Go Down?
In the twentieth century the economic game was all about getting consumers to spend more. The reason was that by this point businesses were producing significantly more than consumers could afford to buy, or were inclined to buy even when they did have the money, raising the problem of how that product was to be moved.
The result was the advent of consumer culture, complete with the advertising that said you had to have the newest and latest--and priciest--or be shunned as a social outcast, even if you were prepared to suffer a great deal of personal inconvenience by making do with an "inferior" product (while the companies always made sure there was a newest and latest, even when this was entirely pointless). There was the built-in obsolescence that forced the consumer to buy the same thing over and over again (even before the new version came out). And, it might be added, there was some help from the prevailing politics, in which society didn't deal with problems and left individuals to cope as best they could--which is to say, as expensively as they could. (Urban life getting you down? Buy a house in the suburbs. And a car with which to get there.) In line with such a course central bankers made a point of making the borrowing that let consumers spend beyond their means (this, too, a key part of consumer culture) cheaper than it might otherwise have been, while government "stimulated" the economy fiscally as well, to keep it running smoothly (not least with those giant military-industrial complexes). And so on and so forth.
The economic growth model I describe here had its heyday in the post-war period--the boom years of the 1950s, the 1960s and into the 1970s, when people were consuming more, the economy was growing, and they made enough money out of it to consume still more than that. But before the 1970s were out it was clear that this model was no longer delivering the goods: growth was sputtering, profits were down--while inflation was up too, how about that? As a result the model shifted in a lot of ways. ("Cry havoc, and let slip the dogs of finance!" is what the policymakers of the day said, minus the Shakespeare, which I suspect was well above their heads. Politicians in my lifetime haven't been the most literate bunch.) But the part about getting people to consume more stayed--in spite of those pesky environmentalists who went about saying there were "limits to growth."
For all the sound and fury we never really saw anything like the glory days of the mid-century boom, economic growth-wise. What little growth there was didn't really trickle down the way the reformers promised it would. And things have got much, much worse since, especially after the Great Recession--while that limits to growth talk has never ceased to haunt the conversation. The result is that few are really content with how things are going in 2022--not least a generation showing signs of being really and truly fed up with the associated "rat race."
For all that, it hardly seems that any dramatic change is imminent, but one may wonder nonetheless--is it conceivable that instead of so many putting so much effort into making people spend more and more, we could turn that capacity for INNOVATION! of which we hear so much (but, it can often seem, see so little) toward enabling individuals to enjoy an acceptable standard of living while spending less, consuming less, relative to today? Certainly the folks at the RethinkX think tank see us as on track toward a world where people can have "First World" comfort at "Third World" prices (and orders of magnitude less ecological impact to boot). Yet, premised as their vision is on innovations like cellular agriculture, Transportation-as-a-Service, and the "printing" of food, clothing and shelter, it can seem as if its essentials are things that have been supposedly near at hand for so long that one suspects they will always remain so--while it is far from clear how well society would realize the potentials they allow even should the technologies themselves become a reality. Certainly the one part of their vision that has advanced most fully, the extreme cheapening of computing and digital communications, has not quite played out the way the optimists of a generation ago hoped--while, even more than before, we live in a society which, rather than solving a problem, tells people to just "live with it." Those problems include a historic upsurge of inflation. Yet it seems that this failure gives us all the more reason to think harder about what the conventional wisdom would have us write off as impossible.
Friday, June 10, 2022
Beyond Lithium-Ion Batteries
Everyone with any interest in renewable energy as part of the solution to the energy-climate crisis is probably sick of hearing it--the renewables-basher's sneer that "The sun doesn't shine and the wind doesn't blow all the time" (as if everyone didn't already know that!).
They are likely sicker still of the renewables-bashers' inflicting on them the conclusion derived from this--namely that renewables cannot and will never supply more than a fraction of our energy. So it's fossil fuels yesterday, fossil fuels today, fossil fuels tomorrow, fossil fuels forever! Sorry/not sorry, hippies! We wash our Sport Utility Vehicles with your tears!
Of course, this argument is a weak one that has been getting weaker all the time. After all, renewables have been getting cheaper--so much cheaper that even equipped with battery storage (also getting cheaper) they are becoming competitive with natural gas, even without special government favors being shown to renewables (even as natural gas, along with the rest of the fossil fuels sector, enjoys past, accumulated largesse, and continued favors, on an immense scale conveniently overlooked by those who whine endlessly about tax credits for solar and wind and the like). Indeed, the RethinkX think tank has made a fascinating case that, assuming these price drops continue a little longer, it will be the cost-effective thing to build renewables up to the level of "Clean Energy Super Power," with local surpluses of SWB (Solar-Wind-Battery)-based energy making electricity as cheap as bandwidth.
In fairness, battery storage is not without its difficulties. The lithium-ion batteries that remain the go-to type, after all, rely on rare minerals concentrated in a handful of conflict-ridden regions (like the lithium of Bolivia and Afghanistan, the cobalt of the Congo) where they are mined in conditions which are brutal for the workers and damaging to the local environment, while there is the additional ecological problem of what to do with the batteries at the end of their useful lives. And the renewables-bashers, of course, make the most of this, too (applying the same double standard to environmental effects and working conditions that they do to such matters as tax credits; after all, they forget just how many products the same can be said of, the fossil fuels to which they are so loyal included). However, besides being sanctimonious in the extreme, they are also wrong about these evils being necessary costs of any attempt to shift the world's energy base. It is far from being the case that lithium-ion constitutes the only option for electrochemical batteries. Quite a number of alternatives based on abundant, low-cost materials, capable of delivering the requisite power and energy density, are in development--with the obstacles falling away regularly enough (cobalt-free batteries not only exist but comprise a growing share of Tesla's production, while Samsung and Panasonic are moving beyond the stuff as well) that, in contrast with so many areas where tech writers hawk baseless optimism, there seem grounds here for the expectation of working technical solutions.
Meanwhile, electrochemical batteries are far from being the sole electricity storage option in every area. Notable here is gravity storage, which entails using surplus electricity to raise a weight to a given height, and then dropping it to release its potential energy when power is needed. Pumped hydroelectric is a well-established type of such storage, used since the nineteenth century in hydroelectric power operations--and its usability as a storage method in connection with solar and other non-hydro forms is likewise well-established. More novel, we are seeing increased interest in the use of towers like those being built by the Swiss firm Energy Vault (which raise and lower concrete blocks in similar fashion)--a method which is already becoming economically competitive. Such forms of storage may not get a car from A to B--but even in a world where the hopes set on more sustainably sourced batteries fall short of present expectations, their enhancement of the viability of a renewables-powered electric grid will in itself greatly reduce the problem of supplying the batteries keeping an electrified transport system running.
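For a sense of the scale involved, the recoverable energy of any gravity storage scheme is just the familiar E = mgh, discounted for round-trip losses. A quick back-of-the-envelope sketch in Python (the block mass, height and efficiency figures below are illustrative assumptions of mine, not any particular firm's specifications):

```python
# Rough energy yield of a gravity-storage unit: E = m * g * h,
# discounted by an assumed round-trip efficiency factor.
G = 9.81  # gravitational acceleration, m/s^2

def recoverable_kwh(mass_kg: float, height_m: float, efficiency: float = 0.85) -> float:
    """Potential energy of a raised mass, as usable kWh after losses."""
    joules = mass_kg * G * height_m * efficiency
    return joules / 3.6e6  # 1 kWh = 3.6 million joules

# An illustrative 35-tonne block raised 100 meters stores under 10 kWh--
# which is why such systems stack many blocks, and why pumped hydro
# moves millions of tonnes of water instead.
print(round(recoverable_kwh(35_000, 100), 2))
```

The modest per-block figure is the point: gravity storage wins on cheap, abundant mass rather than on energy density, which is also why it suits stationary grid duty rather than vehicles.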
Thursday, June 9, 2022
Why I'm Sick of Hearing the Term "Carbon Footprint"
Not long ago I argued that the time had long passed when simply screaming about climate change did any good--that those who wanted to make a positive contribution had to talk about solutions.
The piece, unsurprisingly, wasn't terribly popular so far as I can tell--and I was reminded by what response I got that where talking solutions is concerned people have a tendency to set the bar very low indeed, saying watery things like "The solutions are all around us" or "The solutions are in us" or "The solutions are in the choices we make."
Ah, but where around us and where in us and in which choices? That is what would elevate this past the level of banality--and the real bar that this discussion has to clear, and which so little of it does. And what does get specific makes matters still worse. As climate scientist Michael Mann made clear in an interview in Scientific American that I think ought to be required reading for anyone seriously interested in the climate crisis, the opponents of meaningful action on climate change have gone to great lengths to shift the discussion away from the actions of large and powerful entities like corporations and governments to individuals--not least via that invention of the folks at BP, "carbon footprint."
You likely recall the old adage that "With great power comes great responsibility" (most likely from watching Spider-Man, though it is no less valid for that). Reducing everything to personal carbon footprint--while ignoring the limited means and limited options of the vast majority of the planet (even in the First World U.S. a third of the public cannot meet a mere $400 emergency out of their own means)--is yet another case of the opposite: of assigning all the responsibility to those who have none of the power (or so close to it as makes no difference), and treating them like eco-criminals for failing to accomplish the impossible task set for them. The overwhelming evidence is that the public as a whole recognizes the problem--and wants something done about it, as it shows again and again not just in the polls but at the ballot box--yet this brand of pseudo-environmentalism is setting the effort back rather than advancing it precisely because of the ways in which it alienates the very public to which it presumes to appeal.
Wednesday, June 8, 2022
A Financial Singularity? The Stock Market Bubble of the 1990s
During the stock market boom of the late twentieth century the capitalization of the stock market grew more than 150 percent between 1994 and 1999 alone--a growth wildly disproportionate to the growth of the underlying economy, even in that period of historically brisk expansion. (Where the stock market capitalization-to-GDP ratio stood at a mere 71 percent in 1994 it was over 153 percent in 1999, more than twice as high, even while U.S. Gross Domestic Product grew by better than a fifth, even after adjustment for inflation.)
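As a quick sanity check, those figures hang together: if the cap-to-GDP ratio went from 71 to 153 percent while real GDP grew about a fifth, capitalization must have multiplied by roughly 2.6 in real terms--"more than 150 percent" growth. A sketch (using the approximate figures from the text):

```python
# Sanity-check the quoted figures: market cap growth implied by the
# cap-to-GDP ratios and the real GDP growth rate cited above.
ratio_1994 = 0.71   # market capitalization / GDP, 1994
ratio_1999 = 1.53   # market capitalization / GDP, 1999
gdp_growth = 1.20   # real GDP up "better than a fifth"

# Cap = ratio * GDP, so cap growth is the ratio change times GDP growth.
cap_growth = (ratio_1999 / ratio_1994) * gdp_growth
print(f"capitalization multiplied by about {cap_growth:.2f}x")  # ~2.6x
```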
It was the conventional wisdom among at least a significant section of financial "experts" that this was some wonderful new normal, with the surge in asset values somehow to continue for a good long time to come--the pundits apparently trying to outbid each other for the public's attention with ever-higher predictions of how high the Dow Jones average would go within the next several years. (Dow Jones 36,000! Dow 40,000! Even Dow 100,000!)
Unless one assumed the market had been released from the laws of economic gravity altogether, this was a big bet on just how well the "real" economy was going to do in the years ahead, amid euphoria over computing, the Internet and related technologies and the possibilities some claimed to see in them--and one sees just how big a bet when one thinks about what it would have meant for the economy to live up to the investor expectations implied in those stock prices. Suppose, for example, that the stock market's capitalization had grown at that rate for the next two decades, with the real economy falling no further behind the growth of the stock market's capitalization than it was in 1999.
That would have meant a roughly 20 percent a year real economic growth rate for the next two decades, and a nearly forty-fold expansion of the U.S. economy, producing a U.S. GDP of some $700 trillion in today's dollars. Alas, U.S. GDP in 2019 was, in today's terms, more like $24 trillion--a mere thirtieth of that sum. And long before the disparity could grow so stark the bubble went bust, just a few months into 2000.
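The arithmetic behind those figures is simple compounding--1.2 to the twentieth power is about 38, the "nearly forty-fold expansion." A sketch (the $18 trillion baseline is my own rough figure for 1999 U.S. GDP restated in today's dollars, implied by the numbers above rather than taken from any source):

```python
# Compound 20%-a-year real growth over two decades, then compare
# the implied GDP with the actual 2019 figure.
expansion = 1.20 ** 20               # ~38.3: "nearly forty-fold"
gdp_1999 = 18.0                      # 1999 GDP, $ trillions, today's dollars (assumed)
implied_gdp = gdp_1999 * expansion   # ~$690 trillion: "some $700 trillion"
actual_gdp_2019 = 24.0               # $ trillions, today's dollars

print(f"{expansion:.1f}-fold expansion, implying roughly "
      f"${implied_gdp:,.0f}T vs. the actual ${actual_gdp_2019:.0f}T")
```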
Looking back it is impossible to picture what those decades of 20 percent a year growth would have looked like, with the same going for their somehow producing a country thirty times richer than it is today. In fact, it does not seem an exaggeration to characterize the situation as one of financial and economic singularity--which brings to mind that other Singularity that Ray Kurzweil said so much about in 1999. Something like that technological Singularity would seem the only way in which such a financial boom could have proven a winning bet--such that it seems we can speak of Wall Street's behavior giving the impression that Kurzweil's Singularity really was imminent.
Tuesday, June 7, 2022
What Role Might Superconductors Play in the Energy Transition?
Superconductivity has been in the news quite a bit these past couple of years, in large part because of a major breakthrough in 2020--namely the observation of room-temperature superconductivity for the first time in history. Of course, this occurred in a lab, under extremely specific and difficult conditions (with the material put under pressure equal to over two thousand times that at the bottom of the Mariana Trench). Still, even if usable only in very special circumstances, the fact remains that room-temperature superconductivity is a proven physical reality, and a great many are watching the progress in this field toward superconducting materials that can work in everyday conditions with interest.
A major reason has been the pursuit of a more efficient electric grid. Of particular importance is the density of current superconducting materials can carry, relative to those presently in use. Generators using superconducting coils produce larger and stronger magnetic fields, extracting more power from a given amount of current--with one result being that lighter, more compact generators can deliver the same power as heavier, larger units. When made of a superconducting material, wires of a given width can transmit up to five times as much electricity as their copper equivalents, and do so with far less loss over long distances. And the storage of electricity in superconducting magnetic energy storage (SMES) systems likewise diminishes the problem of losses, yielding additional efficiencies.
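The transmission advantage comes down to Joule heating: resistive loss scales as I²R, and a superconductor's DC resistance is effectively zero. A toy comparison (the current, resistance-per-kilometer and line-length figures are illustrative assumptions, and the cryogenic cooling overhead a real superconducting line would incur is ignored here):

```python
# Resistive (I^2 * R) loss along a long transmission line,
# conventional conductor vs. superconductor.
def line_loss_mw(current_a: float, ohms_per_km: float, length_km: float) -> float:
    """Joule heating dissipated along the line, in megawatts."""
    return current_a ** 2 * (ohms_per_km * length_km) / 1e6

# 1,000 A over a 1,000 km line at an assumed 0.05 ohm/km:
copper_loss = line_loss_mw(current_a=1_000, ohms_per_km=0.05, length_km=1_000)
# The same line with zero DC resistance (cooling costs not modeled):
supercon_loss = line_loss_mw(current_a=1_000, ohms_per_km=0.0, length_km=1_000)

print(copper_loss, supercon_loss)  # 50.0 MW dissipated vs. 0.0
```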
All of this can permit a more efficient exploitation of any energy source, but seems especially helpful in compensating for the intermittency of renewables that has, thus far, slowed the improvement of their cost advantage over fossil fuels and nuclear. Practical experiments have already demonstrated the possibility of squeezing more power out of windmills equipped with superconducting magnets of given sizes. Superconducting materials' potential for lowering the cost of long-distance power transmission enables them to better connect sun- and wind-rich areas with others where demand may outweigh what is reliably available at hand, or simply provide a convenient back-up if demand goes up or local power generation goes down. (Renewables-bashers love to sneer that the sun doesn't always shine and the wind doesn't always blow, but at any given time the sun is probably shining and the wind blowing somewhere, and superconductivity goes a long way toward making transmission across those distances cost-effective.) Meanwhile, in contrast with fossil fuel-based power generation, renewables in particular would benefit from their usefulness in storing electricity itself. (Indeed, superconductor-equipped storage is already being used on a small scale to even out grid fluctuations--while an argument has been made for the plausibility of equipping windmills and photovoltaic banks with their own superconducting storage units.)
Altogether such possibilities mean that, even if superconductors get much less attention than other technologies, progress in this area may yet play an important role in the energy transition—and warrant that much more interest on the part of observers looking to make it work, especially if they have the long run in mind.
Monday, June 6, 2022
What Ever Happened to Superconductors?
Cold fusion and fifth-generation computers were among those technologies that in the 1980s were supposed to be on the verge of changing everything--but over three decades on have amounted to pretty much nothing.
In the same years one also heard a great deal about superconductors, specifically materials which, under appropriate conditions, cease to resist the passage of electrical current, so that it can flow absolutely without loss--becoming, as the name indicates, super conductors. That implies the possibility of enormous efficiencies in a very great deal of what we do with electricity--which these days can seem to be just about everything, with the list getting longer all the time.
In considering the publicity afforded the concept in the 1980s one should note that the concept was not new even then. The phenomenon of superconductivity was first observed way, way back in 1911. However, prior to the '80s the known superconductors only worked at extremely low, near-absolute zero temperatures--which meant that they required enormous amounts of energy for refrigeration (especially with electricity passing through them and heating them up). This, of course, left them with little practical use--while achieving better than that was thought not only an engineering difficulty but a theoretical impossibility. What made superconductors seem newly relevant was the discovery of a ceramic (lanthanum barium copper oxide) that could work as a superconductor at relatively high temperature, quickly followed by related ceramics working at higher temperatures still. (I stress relatively, because the '80s-era discoveries meant superconductors operating at up to about 90 Kelvin--which is about three hundred degrees below zero for those of us using the Fahrenheit scale.)
That may not seem very promising, but it did arouse expectations about the rate of progress in the field (there were fantasies that "superconductor supremacy" was going to very soon mean world economic supremacy)--which soon proved rather exaggerated. Still, the research effort continued, and happily, so did progress, with different materials achieving superconductivity at higher and higher temperatures until, two years ago, physicists actually achieved superconductivity at "room temperature" (in fact, at 58 degrees Fahrenheit, the average temperature in Bergen, Norway, in July and August)--a result that garnered significant attention back in 2020.
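The Kelvin-to-Fahrenheit arithmetic behind both figures is easy to verify. A minimal sketch (the 287.7 Kelvin value is the reported figure for the 2020 result, supplied here as an assumption rather than taken from the text above):

```python
def kelvin_to_fahrenheit(k: float) -> float:
    """Standard conversion: scale by 9/5, then shift past absolute zero."""
    return k * 9 / 5 - 459.67

# The '80s-era cuprate superconductors: 90 K works out to roughly -298 F,
# i.e. "about three hundred degrees below zero" on the Fahrenheit scale.
print(round(kelvin_to_fahrenheit(90)))     # -298

# The 2020 "room temperature" result, reported at about 287.7 K:
print(round(kelvin_to_fahrenheit(287.7)))  # 58
```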
What has been less widely noted in the coverage aimed at a non-specialist audience has been the specific circumstances of the achievement of that superconductivity. The superconductor in question (a mix of hydrogen, carbon and sulfur) worked because it was under a pressure of 270 gigapascals--a figure more often mentioned than explained. Those unfamiliar with that unit of measurement should know that it is equivalent to well over 2.6 million times sea level atmospheric pressure, or the pressure under about 16,000 miles of water--which is to say, more than two thousand times the submarine hull-squashing pressure at the bottom of the Mariana Trench.
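As a sanity check on those comparisons, here is a back-of-the-envelope script. The seawater density and the Mariana Trench figure (roughly 108.6 megapascals at the Challenger Deep) are standard approximate values supplied here as assumptions, not numbers from the text:

```python
# Rough unit conversions for the 270-gigapascal figure.
P_PA = 270e9              # the superconductor's pressure, in pascals
ATM_PA = 101_325          # one standard atmosphere, in pascals
RHO_SEAWATER = 1_025      # seawater density, kg/m^3 (approximate)
G = 9.81                  # gravitational acceleration, m/s^2
MARIANA_PA = 108.6e6      # pressure at the Challenger Deep (approximate)
METERS_PER_MILE = 1_609.34

atmospheres = P_PA / ATM_PA                    # ~2.66 million atmospheres
depth_miles = P_PA / (RHO_SEAWATER * G) / METERS_PER_MILE  # ~16,700 miles
mariana_ratio = P_PA / MARIANA_PA              # ~2,500x the trench bottom

print(f"{atmospheres:,.0f} atm; {depth_miles:,.0f} miles of water; "
      f"{mariana_ratio:,.0f}x Mariana Trench pressure")
```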
As this shows, researchers in the field have traded one set of extreme conditions (cold) for another (pressure)--so much so that those who imagined from the press reports that commercially useful room-temperature superconductors were imminent may find this a damp squib, as is so often the case when one looks more closely at pop science stories that make us think a technology at Technology Readiness Level 1 is already up at Level 9. But all the same, it is undeniably a breakthrough, proving that room-temperature superconductivity is, at least, possible, and perhaps yielding insights into how it might be achieved in less extreme conditions--while, for what it is worth, work has begun on making those superconductors work at lower pressures.
Moreover, it would be a mistake to think that this means superconductors have amounted to as little as those other technologies previously mentioned have done to date. If without much fanfare, superconductors have already entered a wide variety of practical, everyday uses, the most significant of these, perhaps, being Magnetic Resonance Imaging (MRI) machines: seventy percent of those installed worldwide use superconducting magnets to enable more rapid and comprehensive scanning of the patient. And in that we have a reminder of something else, namely that even short of room-temperature superconductivity the technology is being put to practical use, with another breakthrough previously thought an impossibility--a superconductor through which electricity flows in only one direction--opening the door to the use of the technology in computing to produce microprocessors hundreds of times faster than those operating today. Of course, the refrigeration requirements make our seeing this in consumer devices anytime soon implausible--but the head of the research team which made the breakthrough has himself argued for its possible applicability to server farms and supercomputers. If true, this could well prove revolutionary enough in itself.
Are Those "Spreading Awareness" About Climate Change Aware of What Kind of "Awareness" They Are Spreading?
While thinking about the problem of climate change in recent years I have found myself increasingly concerned with the consequences of so many commentators relentlessly promulgating the bleakest possible view of the situation. These think, or at least give the impression of thinking, that they are "promoting awareness" and somehow contributing to resolving the problem. In fact many, maybe most, are simply promoting defeatism and despair.
Why do they do what they do?
I suspect that they don't understand, or don't want to understand, how politics really works--how and why things do and do not get done. Unable to give the public reasons for hope, they put all their energy into exercising the other option for moving it: fear. In spite of the ample evidence that the public already knows all about the problem, and has long been anxious for something to be done about it--so anxious that it is literally sick over it--and time and again elects politicians who promise to do something about it (even if those "leaders" break every promise), these commentators tell themselves that there must not be enough fear out there, and keep doing the same thing over and over again expecting a positive result. Encouraging them in this terribly problematic course is the evident, enormous self-satisfaction persons of weak and unserious mind derive from inflicting disaster porn-riddled jeremiads on the public.
Naturally they never consider the possibility that they have exhausted the usefulness of fear, perhaps a very long time ago, and that continuing to use it, at least in the manner they have been doing, has become counterproductive; that past a certain point fear can simply make people shut down rather than act; that rather than screaming alarums they now face the more difficult yet totally indispensable task of explaining frankly and seriously why society has so miserably failed to meet the problem, thinking just as seriously about how it can stop failing, and lending their voices to whatever proposals might redress the issue; and that if they are not up to the task (as persons of such caliber generally are not) they are only getting in the way of those who might be.
Friday, June 3, 2022
Centrism: A Primer
We hear the word "centrist" tossed about a lot--but little about what it really means.
If you want a fuller explanation, supporting everything said here in great detail, to the point of having twenty-five pages of single-spaced endnotes attached, you can go here.
If you want the short version, just keep reading.
Simply put, centrism--certainly in the sense in which we use the term in the U.S.--isn't just middle-of-the-roadness, even if it overlaps with middle-of-the-roadness much of the time, or at least seems to do so. Rather, this outlook can more usefully be characterized as classical conservatism updated for a society where liberal institutions have replaced those of the Old Regime. In line with that conservatism centrists take a dark view of human nature, and are pessimistic about the ability of human beings to rationally understand, direct, "engineer" society and its course. They are especially doubtful about the wisdom and goodness of the "common" man or woman--their ability to understand the issues, and to act rationally when they enter onto the political stage. This leaves them comparatively fearful of and hostile to societal change, especially when that change comes "from below." Instead they favor leadership by an elite able to use its trained judgment, for which they regard no substitute as existing.
However, the twentieth century is also not the eighteenth. As stated previously the feudal-agrarian world of the classical conservative has given way to a capitalist and democratic society, which is the form of life they are stuck with, and stuck with defending. All this being the case, if no lovers of 1789, it is 1917 that haunts them, and against which they define themselves. Thus they accept the fact of a democracy with universal suffrage and liberal rights like freedom of speech--but believe that democracy can only safely operate on very specific lines, keeping its politics "civil" and "pluralist."
What does this mean? It means that people check "ideology"--structured views of what the world is like, how it works, how to operate in it--at the door when they enter into the political arena. They do not raise the matter of how society is structured, who has advantage and who does not, what is right or wrong (much of which they regard as beside the point because of the uncertainties of social life in light of their epistemological doubts, and because they hold that in a liberal society power is so diffuse among voters and consumers that no one really has power over anyone else, for example, corporations against workers or consumers). Instead the practitioner of a centrist politics thinks of society as a plurality of interests, which they assume to all be equally legitimate so long as they abide by those rules in regard to ideology. These interests, within this arena, compete for support and negotiate among themselves in a process advised and guided by experts regarded as objectively treating of value-free facts, for the sake of preventing societal conflicts from escalating to a society-destabilizing degree--or, put more positively, the maintenance of "consensus."
Of course, all that said centrism has tended to embrace particular positions over time. In the mid-twentieth century centrism was for a defensive, containment-oriented anti-Communism in foreign policy, for the New Deal at home (if not necessarily enthusiastic about extending it), for the civil rights movement (in its moderate form), as against a right represented by figures like Barry Goldwater which took a still harder line against Communism (not containment, but "rollback"), sought a return to the nineteenth century with respect to government involvement in the economy, and opposed the civil rights movement as an infringement on the rights of lower levels of government (and not necessarily just on those grounds). Later in the century the ascent of the right (identifiable with Ronald Reagan, who succeeded where Goldwater failed) and other factors (the end of the Cold War, globalization, etc.) saw centrism move a long way to the right on key issues, becoming more like the neoconservative right in its foreign policy, and trading in the New Deal for neoliberalism. Its record in regard to the country's cultural conflicts seems a different thing. Still, it shifted away from a leftishly universalist civil rights movement in favor of a very different identity politics (which the right and many others characterize as "left" but which is readable as very much of the right in its premises, more Maistre than Martin Luther King).
Looking back it seems to me that this version of the center had its heyday in the '90s, when Bill Clinton's administration solidly established the Democratic Party's identification with it in office, governing as it did along these lines, while for the time being the prospect of great power conflict appeared on the wane in a world where Lexuses mattered more than olive trees, and it seemed to many (whether viewing the fact positively or not) that "political correctness" was inexorably in the ascendant. Since then this political vision has faced far more challenge, exemplified by the country's polarization through the twenty-first century--by the contested election of 2000, by the Iraq War and the general expansion of U.S. military involvement in the Middle East, by a succession of economic crises (the tech boom's going bust in 2000, the inflationary energy crisis of the '00s, the Great Recession), by the more recent pandemic, by the escalation of culture war and identity politics, and so much else. In the face of it all thus far the center has generally stuck to its turn-of-the-century positions (partied like it's 1999, so to speak), with the Democratic Party's leadership and officials doing so even as their electoral base has shifted leftward, but it may well be that in the face of the multitude of conflicting pressures centrism will adapt yet again.
Thursday, June 2, 2022
A Note on What Ideology Means
When we speak seriously of ideology--of liberalism, conservatism and so forth--we are speaking of a philosophy which addresses fundamental questions about the human condition, and on the basis of the answers it offers to those questions, the problems of economic, political, cultural and social life. Arguably the most important of those questions are:
1. What can we know about the world, and especially the human, social world?
2. What are human beings like--individually and collectively, in society?
3. Given what we know about human beings, what should we consider to be society's goals?
4. If we think that society should be something other than what it is, can we change it for the better? Would the potential gain outweigh the risks?
5. How far can we rely on reason in changing our social arrangements--our economic system, our political system, our culture--for the better?
Conservatism, liberalism, and the rest all have very specific answers to these questions, which determine their approach to specific political questions. For now let us stick with conservatism and liberalism, in the classical sense of each of those terms, which retain some usefulness from this vantage point (even as much else may have changed).
Conservatism takes a dark view of human nature (think Thomas Hobbes), and is pessimistic about the applicability of reason to society. This leaves conservatives more concerned with keeping human badness in check than with, for example, achieving a society affording its members greater freedom, justice or equality, which generally seem to them unrealistic aspirations in the circumstances. Thus they think that the prospects of change for the better are very dim, while tending to regard the social arrangements that have emerged over time, "organically" in response to specific situations--what is often called "tradition," and where following tradition in doctrinaire fashion does not settle the matter, judgments by an elite respectful of tradition based on its own personal, practical experience--as likely to be superior to any human "plan." (As the foundational Joseph de Maistre argued in his Considerations on France, a person can grow a tree, but they cannot "make" a tree--and so it is with a society in the conservative's view.)
Liberalism takes a different view of these matters, seeing human nature as a broader, more flexible thing than conservatives give it credit for: what conservatives conceive as a timeless, unchanging, unchangeable (and nasty) human nature, liberals see as substantially formed by circumstances. (The liberal John Locke characterized the human mind as a blank slate at birth in his An Essay Concerning Human Understanding.) They also have a higher opinion of the capacities of human reason--and therefore see room for better, much better, than we have been doing up to this point, and with that, much more scope existing for a freer, fairer world than history has known. Indeed, they may regard the exercise of reason for the sake of creating a better set of social arrangements as not merely desirable and possible, but obligatory, given that their starting point for thought about society is an individual they regard as having inalienable rights, not least to freedom. They may even regard such change as a practical necessity, for their reason tells them time and again that the world changes, and the "old ways" often fail to meet the new demands it throws up. (Consider, for instance, the interrelated matters of nationalism, militarism, war. The conservative does not see such things going away any time soon, but the liberal points to them as having ceased to be tolerable in an age of globalization, and of nuclear weapons.)
Of course, confronted with this tiny, tiny bit of philosophy 101 many snap "People don't use the word like that!" And certainly most people don't--in part because there has been some awkward shuffling of labels (conservatives having been forced to reckon with liberalism, liberalism having bifurcated into more conservative and more radical versions, etc., etc.) creating a fair amount of superficial confusion. However, more important than any such confusion is the fact that so few thought about the matter long enough to be confused by it; that very few of those who identify as "conservative," "liberal," or anything else have ever considered the questions discussed here at all, let alone in any great depth. All the same, there seem to me to be two rejoinders to their dismissive attitude:
1. Their using the terms in a shallow, unthinking, politically illiterate way does not make those who use the terms in the ways long established in political philosophy and political science somehow incorrect. (To suggest otherwise is more than saying "My ignorance is as good as your knowledge." It is saying "My ignorance is better than your knowledge.") If anything, there is a far better case for the matter being the other way.
2. The more casual, ill-informed usages often turn out to be more consistent with the deeper ones than people generally realize. While people reduce labels like "conservative" or "liberal" to responses to particular hot-button issues about which they may be speaking emotionally rather than intellectually the conservative or liberal position tends to reflect those deeper philosophical assumptions just discussed here (even where the person in question never considered the issue on that level). Thus does it go with such a matter as gender (e.g. gender roles, gender identity, reproductive rights, sexual freedom), with the conservative inclining to the traditional practice, the liberal or radical seeing more scope and reason for change, on the basis of what they rationally judge to be fair and right, and in line with the demands of human rights, including freedom.
The result is that the vulgarian snapping "People don't use the word like that!" has probably done so plenty of times without even knowing it.
1. What can we know about the world, and especially the human, social world?
2. What are human beings like--individually and collectively, in society?
3. Given what we know about human beings, what should we consider to be society's goals?
4. If we think that society should be something other than what it is, can we change it for the better? Would the potential gain outweigh the risks?
5. How far can we rely on reason in changing our social arrangements--our economic system, our political system, our culture--for the better?
Conservatism, liberalism, and the rest all have very specific answers to these questions, which shape how they address specific political questions. For now let us stick with conservatism and liberalism, in the classical sense of each of those terms, which retain some usefulness from this vantage point (even as much else may have changed).
Conservatism takes a dark view of human nature (think Thomas Hobbes), and is pessimistic about the applicability of reason to society. This leaves conservatives more concerned with keeping human badness in check than with, for example, achieving a society affording its members greater freedom, justice or equality, which generally seem to them unrealistic aspirations in the circumstances. Thus they think the prospects of change for the better very dim, while tending to regard the social arrangements that have emerged over time, "organically," in response to specific situations--what is often called "tradition," supplemented, where following tradition does not settle the matter, by the judgments of an elite respectful of tradition and drawing on its own personal, practical experience--as likely to be superior to any human "plan." (As the foundational conservative thinker Joseph de Maistre argued in his Considerations on France, a person can grow a tree, but they cannot "make" a tree--and so it is with a society, in the conservative's view.)
Liberalism takes a different view of these matters, seeing human nature as a broader, more flexible thing than conservatives give it credit for being--what conservatives conceive as a timeless, unchanging, unchangeable (and nasty) human nature, liberals regard as substantially formed by circumstances. (The liberal John Locke characterized the human mind as a blank slate at birth in his An Essay Concerning Human Understanding.) Liberals also have a higher opinion of the capacities of human reason--and therefore see room for doing better, much better, than we have done up to this point, and with that, much more scope for a freer, fairer world than history has known. Indeed, they may regard the exercise of reason for the sake of creating a better set of social arrangements as not merely desirable and possible, but obligatory, given that their starting point for thinking about society is the individual, whom they regard as having inalienable rights, not least to freedom. They may even regard such change as a practical necessity, for their reason tells them time and again that the world changes, and that the "old ways" often fail to meet the new demands it throws up. (Consider, for instance, the interrelated matters of nationalism, militarism and war. The conservative does not see such things going away any time soon, but the liberal points to them as having ceased to be tolerable in an age of globalization, and of nuclear weapons.)
Of course, confronted with even this tiny bit of Philosophy 101, many snap, "People don't use the words like that!" And certainly most people don't--in part because there has been some awkward shuffling of labels over the years (conservatives having been forced to reckon with liberalism, liberalism having bifurcated into more conservative and more radical versions, and so on), creating a fair amount of superficial confusion. More important than any such confusion, however, is the fact that few have thought about the matter long enough to be confused by it; very few of those who identify as "conservative," "liberal," or anything else have ever considered the questions discussed here at all, let alone in any great depth. All the same, there seem to me to be two rejoinders to this dismissive attitude:
1. Their using the terms in a shallow, unthinking, politically illiterate way does not make those who use them in the ways long established in political philosophy and political science somehow incorrect. (To suggest otherwise is more than saying "My ignorance is as good as your knowledge"; it is saying "My ignorance is better than your knowledge.") If anything, the case runs far better the other way.
2. The more casual, ill-informed usages often turn out to be more consistent with the deeper ones than people generally realize. Where people reduce labels like "conservative" or "liberal" to responses to particular hot-button issues--issues about which they may be speaking emotionally rather than intellectually--the conservative or liberal position on those issues still tends to reflect the deeper philosophical assumptions just discussed (even where the person in question has never considered the issue on that level). So it goes with such a matter as gender (e.g. gender roles, gender identity, reproductive rights, sexual freedom), with the conservative inclining to the traditional practice, and the liberal or radical seeing more scope and reason for change on the basis of what they rationally judge to be fair and right, in line with the demands of human rights, freedom included.
The result is that the vulgarian who snaps "People don't use the word like that!" has probably used it in exactly that way plenty of times without even knowing it.