Boston: Houghton Mifflin, 1973, pp. 334.
In his 1967 book The New Industrial State, John Kenneth Galbraith argued that, far from being the market-disciplined firms of neoclassical economics, the large industrial corporations in crucial sectors like aerospace, defense, computers, chemicals and oil essentially operated like units within a planned economy.1 In Economics and the Public Purpose he put this "planning system" into the context of the larger economy, which also includes smaller, less complex enterprises that can be fairly described as still operating in a market system.2
As might be expected given the disparity in size and power between the economic units involved, the conditions of the enterprises of the "planning system" and those of the market system are very different, generally to the disadvantage of the latter. The large companies of the former are able to exercise considerable control over such crucial matters as the prices at which they buy inputs (through practices like vertical integration, or raising needed capital internally without turning to creditors) and sell their output (for instance, being able to tacitly avoid the more disruptive forms of competition, and pass cost increases along to customers when these are unavoidable). By contrast, companies in the market system cannot exercise such control, leaving them subject to the vicissitudes of competition, and they consistently derive smaller and less secure returns accordingly. The planning system also comes in ahead of the market system where public policy and public support are concerned (as seen when one compares, for instance, the aerospace sector and the construction industry).
The result is evident not just in the differing returns among the heads of these firms--the difference between the income of a Fortune 500 CEO and a small business owner, for instance--but among their workers, as a comparison of the lot of a unionized autoworker with that of an agricultural laborer makes clear. However, both these systems still come in ahead of the broader public interest, and the public sector representing it. One result is that the United States is richest in those things supplied by the planning system (like cars and consumer electronics), less well-supplied with the products of the market system (like housing), and most poorly supplied with those goods which the private sector does not provide and does not value (like public transport options), making for the mix of private affluence and public squalor that Galbraith first addressed in 1958's The Affluent Society.
After presenting this comprehensive picture of the economy Galbraith devotes the last third of his book to laying out a "theory of reform." This begins with the correction of the conventional wisdom on economics (dominated by neoclassical pieties which defend the status quo rather than explaining how things actually work) and reform of the workings of the legislature (with the elimination of Congress's seniority system, a vehicle of the planning system's influence on policy, particularly important).3 However, he also considers an ambitious range of specific economic measures to the end of redressing the imbalance in development and power between the planning and market systems (Galbraith terms this "the new socialism"), and between both of these and the public sector more generally.
The market system would be aided by the encouragement of small business (and its employees) to organize, the extension and increase of the minimum wage, and the conclusion of international agreements to protect their prices and income, while government would make greater efforts to support their "educational, capital and technological" needs--in the interest of fairness as well as the improvement of output in these comparatively neglected areas. Where the planning system is concerned, differentials in compensation inside companies would become an issue in collective bargaining, while the tax system would be rendered more progressive, in part through the closure of loopholes, but also the abandonment of the distinction made among types of income (as between salary and capital gains, for instance).4 Affirmative action and a guaranteed minimum income scheme would be crucial to the extension of the effort to eliminate inequality to all; environmental regulation would implement the principle of what would today be called sustainable development; and the government would assume responsibility for services the private sector does not perform well or at all (like health care, urban housing and local transport).
Galbraith rounds out this program with a number of additional measures, including the nationalization of certain corporations--those in the defense sector (given their relationship to the Federal government, in many respects already public, but in need of being reined in, especially given the arms race to which they contribute), and even large corporations of any type where shareholder power has disappeared due to the diffusion of stock (the case for the superiority of private to public enterprise, after all, depending on the control of a self-interested owner within a free market, now vanished as a practical matter).5 There would also be a national authority charged with coordinating the planning in different industries, which in turn would coordinate the management of the international economy with its counterparts abroad. The achievement of price stability would rely less on the monetary policy Galbraith has found to be ineffectual, and much more on the use of fiscal policy (increasing and reducing not just spending but taxes as necessary), and selective price and wage controls (also an instrument for reducing inequality by limiting the divergence of wages within firms, and between economic sectors).6
Radical as some of the prescriptions seemed even at the time of writing (nationalizations, central planning, etc.), Galbraith believed that the country was already headed in this direction. Of course, the reality proved to be quite the opposite. The economic crises which had just begun at the time he was writing the book--American problems with the balance of payments, the energy crisis--proved an opportunity not for those seeking the reforms he described, but for partisans of neoclassical ideas (like Thatcher and Reagan) to reverse the movement toward more activist, egalitarian direction of the economy underway since the Second World War.
Still, it would be simplistic to dismiss Galbraith's ideas on these grounds. Galbraith's examination of the large corporation and the small business together provides an impressively comprehensive picture of the American economy which remains useful in its broad outlines, and most of its details as well. The most significant changes in the American and world economies since his time have lain in the increased pervasiveness and density of transnational operations, and the expansion and intensification of speculative finance, which do not alter the fundamentals of the picture. Indeed, the ways in which his image has dated have in their way validated Galbraith's diagnosis of the American economy's problems (inequality, deindustrialization, balance of payments problems, environmental degradation), which have sharply worsened--in large part because those orthodoxies he so incisively attacked have predominated, and remain dominant in spite of their consistent and at times spectacular failure to deliver the promised goods.
Of course, even those who will grant his insight on that score can still take issue with his suggested solutions, which are virtually impossible to present within mainstream discussion today, and some of which he was less prone to advance in later works.7 Yet, whatever their deficiencies, in regard to either their practical effectiveness or their political plausibility, the scope and ambition of Galbraith's program stands as a reminder of how timid, tepid and narrow what passes for public "debate" has become--to our great cost.
1. The presentation of the theory of the planning system is here somewhat refined over what appeared in the 1967 book. In discussing the driving imperatives of "technostructure"-dominated companies Galbraith does not write of company "social goals," or the role of "identification" and "adaptation" in the executive's attitude toward his firm. Rather he emphasizes the drive for expansion of the company, and the material enjoyments of high-ranking company officials served by such growth. He also considers the transnational development of that system through the advent of the multinational corporation.
2. The continued existence of such businesses is a function of the small scale and lower level of organization their activities permit: the activity they perform is unstandardized and geographically dispersed, the service rendered entails a personal element, or the actors involved are connected with the arts. (In some cases this has also been bolstered by legal and political obstacles, as in construction.) One result has been that organization and technology in these areas lag well behind those of the corporate-dominated high-tech sectors (these smaller, poorer, less secure firms being less able to invest in such things as technical R & D), which has helped keep them small.
3. Galbraith points to such matters as the wall neoclassical economics puts up between microeconomics and macroeconomics, its preference for sticking with the small business version of the firm rather than addressing the reality of the large corporation, its failure to come to grips with the way in which technological innovation actually occurs, its advocacy of monetary policy as the sole respectable tool of economic regulation, and in general, "the way it rationalizes and conceals the disadvantages of the weak."
4. This is in line with Galbraith's theory of the use of organization to counter the market-upsetting advantages of large corporations in his 1952 American Capitalism: The Concept of Countervailing Power.
5. The stock of the companies would not be expropriated, but purchased with interest-bearing securities.
6. Criticism of the claims made for monetary policy is a long-running theme in Galbraith's work. In The Culture of Contentment he compares the attempt to encourage growth with lower interest rates to "pushing an object along on the floor with a string." Nor, in his view, is monetary policy much more effective at taming inflation by restraining demand. In the chapter on the subject in The Affluent Society he contends that higher interest rates have little effect on firms in the planning system because they can use internal savings to fund investments, and when they do borrow, pass costs along to their customers; while consumers are less inhibited by higher interest rates than might be expected because their schedule of payments obscures interest charges.
7. In 1996's The Good Society, for instance, Galbraith left aside the matter of price controls, apparently on grounds of political feasibility rather than practical effect, and made no mention of public ownership as a response to the failure of shareholder power.
Wednesday, November 7, 2012
The Culture of Contentment and the Election of 2012
Those critical of the thrust of American policy have in recent decades frequently found themselves at a loss to explain the voting behavior of the electorate--why, for instance, the middle-class would vote against tax increases on the rich at times of fiscal strain. C. Wright Mills offered a portrait of Americans as alienated, but also atomized, distracted and uncomprehending. Another view, recently updated by Thomas Frank, analyzed the roles of "market populism" and culture war in rallying them to such positions. Morris Berman, likewise updating a longer tradition, offered the view of Americans as deeply, irrevocably committed to a social model centered on individual striving for private material gain (in spite of its personal and social consequences).
John Kenneth Galbraith offered an interesting complement to this body of thought in The Culture of Contentment, in which he pointed to an often overlooked element of self-interest as a factor in this behavior. He wrote of a "contented majority" among the electorate, which included the broad professional and business class, as well as those who might be thought of as working class but earn comparable incomes (as even some factory workers do). They are, in their attitude toward politics, preoccupied with their short-run personal comfort above all else; prone to believe that people "get what they deserve"; and in general inclined to take a dim view of government outside of certain roles they view as personally beneficial. (Defense apart, these include the contributions of Social Security and Medicare to their retirements or the maintenance of elderly relatives, or the protection of their savings by financial bail-outs.)
Consequently they resent the programs aimed at helping the poor (welfare, public education, etc.) that they pay for but do not expect to use; tolerate high levels of income inequality, thinking that "the price of prevention of any aggression against one's own income is tolerance of the greater amount for others," as any "redistribution of income from the rich . . . opens the door for consideration of higher taxes for the comfortable but less endowed"; and particularly in line with their short-run thinking, believe that "short-run public inaction . . . is always prefer[able] to protective long-run action."
This combination of preferences is exactly the sort of thing to lead to failures to address not just rural and urban poverty, but also ecological problems, the deterioration of the country's transport infrastructure, uncontrolled financial speculation, deindustrialization, mounting debt, private and public, foreign and domestic, and the impact of all this on U.S. influence abroad. And it counts disproportionately because the contented are disproportionately represented in the half of the eligible electorate which votes.
Galbraith speculated that this attitude would continue until a sharp economic downturn, a major military setback or a "revolt" of the underclass made the contented discontented, and one might wonder what this suggests about the present. Recent years have seen the worst economic crisis since the Great Depression (the disappearance of trillions in wealth, a U-3 unemployment rate that peaked at 10 percent, continuing austerity at the local and state levels), and the disillusionment following a succession of costly, frustrating and ultimately unpopular wars across the Greater Middle East (particularly Iraq, from which U.S. troops withdrew last year, but increasingly Afghanistan as well). However, such discontent as has appeared has, to the dismay of observers like Frank, taken a mostly rightward, status quo-affirming direction--the spike in interest in Ayn Rand, still read by a wide audience in a way so many '50s-era intellectuals are not; the advent of the Tea Party, which was outraged more by the prospect of aid to troubled homeowners than by the bail-out of spectacularly stupid and irresponsible speculators, and which has had far more political influence on the Republican Party than Occupy Wall Street has had on the Democrats.
This is all reflected in the tenor of the political dialogue. The national debt's bounding far ahead of GDP remains a valid cause for concern, but the unseriousness of conservative criticism of it remains evident in the assertions that the deficits are not so large that more cannot be spent on defense, or further tax cuts bestowed on the wealthy. The dependence of the economy on fossil fuel profligacy has not been questioned to any significant extent, with Mitt Romney's plan for job creation premised on booming oil production--in line with his belittling of climate change and efforts to combat it even as, for the second year in a row, the southern states suffered through a near-record drought and Manhattan's subway system was shut down by a hurricane.1
Even the personal histories and ideological positions of the presidential and vice-presidential candidates put forth by the Republican Party (the "traditional party of contentment") are telling. Mitt Romney is a wealthy, Swiss bank account-holding Wall Street speculator born to privilege whose less-guarded remarks have been deeply offensive toward those not among the contented. His running mate Paul Ryan is an economics major whose quibbling over his adherence to Randian ideals (given their atheism) just makes his devotion to ultraorthodox economic thought that much clearer. Granted, Romney and Ryan lost, which some may take to validate the worry of sympathetic pundits like Tim Stanley that they were the "wrong" candidates this time around - but all the same the election was a close one, the candidates losing by only two percent of the popular vote, while the Republican Party retained control of the House of Representatives (albeit by a slimmer margin).
It is, of course, an easy enough thing to envision the stress on our accustomed politics growing more severe. A softening global economy (Chinese and Indian growth is slowing, while Europe remains troubled), the prospect of sharp spending cuts adding to the pain already being caused by the considerable austerity at the state and local levels, the failure to rein in the kind of speculation that caused the meltdown of 2008, to name just a few of the continuing problems, are all plausible sources of such stresses. But for now the culture of contentment remains surprisingly intact.
1. Prominent in the plan to "create 12 million new jobs" are quicker and wider oil drilling (offshore drilling alone is expected to create 1.2 million jobs), the construction of the Keystone XL pipeline, and the elimination of carbon regulation from the Clean Air Act, as well as bullish assumptions about what shale oil production will achieve.
Sunday, October 28, 2012
In Defense of Futurology
Anyone who looks into futurology quickly encounters the enormous skepticism surrounding the whole discipline, denunciations like Max Dublin's Futurehype: The Tyranny of Prophecy reflecting common sentiments.
It would be pointless to deny that there is much to critiques like Dublin's. It is indisputable that futurologists do not often get things right, least of all when the predictions are of the dramatic, headline-making kind. It is equally indisputable that not all of the mistakes have been made in good faith; unfounded, dishonest and astonishingly arrogant predictions are constantly foisted on the public, predictions which have often originated in or become the tools of vested interests. These have often diverted our attention away from real and immediate problems to phantasmic ones (as with so many proclamations about foreign military threats). They have also been used to deflect calls for meaningful action on pressing matters with illusions of cheap, easy fixes if one only waits (the apparent attitude of many Singularitarians to poverty and ecological problems), or even of things working out for the best all by themselves (like the promise that speculative bubbles may have burst disastrously in the past, but this latest one is different). In short, predictions have often been worse than useless, doing a great deal of damage.
Yet, the fact remains that prediction is unavoidable. It is unreasonable to expect that self-conscious, time-conscious beings will not consider what lies ahead of them--and even were this possible, their refraining from doing so would be undesirable. Intelligent action requires presumptions about outcomes, which necessarily involves prediction, explicit or implicit, especially when planning is involved. As the range, scale and complexity of the requisite planning grows, so does the requisite predictive effort. Given the realities of a thoroughly globalized industrial civilization, there is a genuine need for the kinds of elaborate, specialized efforts that have made an industry of forecasting.
Moreover, just as we have to be mindful of the harm caused by bad predictions, it is well worth remembering that the failure to heed sound predictions eventually validated by events has also done much harm, and threatens to do much more--as with the dismissal of Ivan Bloch's predictions about what a general European war in the early twentieth century would be like, of the warnings about financial speculation run amok again and again down to the meltdown of 2008, and of the warnings about climate change today. Seen within a larger context, the problem seems to be less a particular intellectual tool for understanding the world (unique as its influence may be in respects) than the corruption of our intellectual life by political agendas, the failure of the media to lucidly present the issues of the day, and the inadequate training of even the better-educated parts of the public to think critically and cope with nuance.
Alas, these quite obvious sides of the issue seem to carry little weight with the knee-jerk detractors who seem to see something intrinsically wrong with the activity itself--a view which in most cases says more about them than it does about the failings of futurology. Some regard predictability as excluding the possibility of freedom (apparently not worrying that freedom in a totally random universe would deprive choice of its meaning), or are simply suspicious of the exercise of human reason, especially when applied to the human sphere (a position which has run throughout the conservative tradition, all the way down to today's postmodernists). Libertarians in particular seem to dislike prediction because of its association with planning, which they associate with impiety toward the Market, and with government intervention in it (while they conveniently overlook the ways in which private actors must plan). I suppose, too, that there is a considerable amount of plain old anti-intellectualism in the sentiment (as those who make the more influential or nuanced predictions tend to be experts). And the truth is that in many a case an individual's level of contempt tends to reflect their gut response to what a particular futurologist happens to be saying. (I have found that climate change "skeptics" get quite nasty toward the whole field when they encounter predictions about this particular subject.)
Rather than writing off futurology (something we simply cannot do), the only viable approach is to strive to get better at making predictions, and to get better at making use of them. We need to critically examine the claims we get, not just the "what" and "when" and "where" of them, but the "how" and the "why." Along with a genuine diversity of views--and a readiness to call out those who really are the venal mouthpieces of special interests--this is far and away the best protection against those who would seek to monopolize real debates, or manufacture fake ones by sowing pseudo-scientific doubt (like those who would have us believe there is no connection between tobacco and cancer, or carbon emissions and climate change).
We should not look to the field for a precision or a certainty beyond what it can actually give. Especially when the distance between the present and the point in the future is considerable, we should expect to get just the rough outline of "things to come"--which is, in the end, what the most successful predictions generally seem to strive for, perhaps because that is all we can detect through the noise. We need to think less in terms of inevitabilities than spectrums of probability robust enough to survive life's frictions, surprises and the range of choices that may appreciably affect them; to think less in terms of simplistic projections of the present than the ways in which trends peter out, and accelerate, and interact with one another; to recognize the attached caveats and respond accordingly--to not shy away from committing to this or that expectation, but be able to react with something better than dumbfounded stupidity if events take a different course.
And even where predictions do fall short, it is well worth remembering that asking the large questions has a way of yielding insights that might not otherwise emerge, a not inconsiderable prize in itself.
Saturday, October 27, 2012
A Scarcity of "New Thinking"
Looking back on intellectual history, the middle of the last century - the 1950s especially, but to differing degrees the years immediately preceding and following it - seems to have been an exceptionally fertile time for popular writing on sociology and economics and the history related to them.
Today's more striking public intellectuals are not without ideas of their own, but the writers of the '50s remain an essential point of reference and overall influence, and the work of the newer authors, at its most substantial, tends to carry forward the earlier arguments and critiques. Andrew Bacevich is perfectly clear on the matter of his debt to William Appleman Williams. Morris Berman, discussing his latest, points out that "it is actually part of a lineage, the path initially staked out by Richard Hofstadter, C. Vann Woodward and Louis Hartz" in the late 1940s and 1950s. At the close of The Wrecking Crew it is to Richard Hofstadter that Thomas Frank looks back. Indeed, Chris Hedges, in Death of the Liberal Class, compares our current crop of public intellectuals - unfavorably - with these predecessors of a half century ago.
Why was this the case? Perhaps it was a reflection of the unique opportunities available to that generation of authors. (It is hard to picture a John Kenneth Galbraith or a William H. Whyte working at Fortune Magazine today, or to think of any liberal intellectual - or any intellectual for that matter - who had the kind of long, varied experience of government work Galbraith did during World War II and the Cold War.) It seems likely, too, that this was partly a matter of the novelty of the problems of the post-war years, which have since become so familiar, and remain so deeply entangled with our newer problems, that in commenting on them it is harder to appear fresh or original. (Reading Barbara Ehrenreich's Bait and Switch, for instance, one sees a world deeply changed since the time of Whyte's The Organization Man - and yet in other ways, very much the same, not least in the use and abuse of personality tests.) Another reason may have been the assumption of a more literate general audience. (Much as Galbraith has been justly praised for his urbane prose style, major publishers today might take issue with its suitability to a work of pop economics aimed at today's market.) But certainly part of it was the existence of a political climate which permitted the airing of a range of ideas that was wider in some respects, and a stronger hope that the expression of those ideas could actually mean something out in the real world. (The irony that they lived in the era of the House Un-American Activities Committee should not be lost on us.)
The absence of that last element is only too apparent in the tone of today's writings, imbued as they are with a sense that little will be done about today's problems regardless of what they say (the titles of books like those by Hedges and Berman saying it all). This apathy is hardly going unnoticed. When the hand on the Bulletin of the Atomic Scientists' Doomsday Clock moved one minute forward this month, Kennette Benedict, the magazine's executive director, cited the lack of "new thinking" as a factor.
As far as I can tell, no one has expressed real surprise at that statement.
Sunday, October 21, 2012
On the European Union's Nobel Peace Prize
In Isaac Asimov's classic story "The Evitable Conflict," Stephen Byerley suggests that the major conflicts of human history were never settled by force. Instead each of them "persisted through a series of conflicts, then vanished of itself – 'not with a bang, but with a whimper,' as the economic and social environment changed." So it was with the competition of the Houses of Hapsburg and Valois-Bourbon for European hegemony, or the clash of Catholic and Protestant for dominion over Western Christendom. So it also was with the "nationalist-imperialist wars" of the nineteenth and twentieth centuries,

when the most important question in the world was which portions of Europe would control the economic resources and consuming capacity of which portions of non-Europe . . . Until the forces of nationalism spread sufficiently, so that non-Europe ended what all the wars could not, and decided it could exist quite comfortably all non-European.

This explanation of the end of that cycle of wars, published in this story a mere five years after VE Day, seems to me far more persuasive than its attribution to regional integration via the European Union and its predecessors - something inextricable from the rationale given for the EU's receipt of this year's Nobel Peace Prize, its role in uniting the continent.
The fact of Europe's Cold War-era division, and the much more powerful presence of the U.S. and Soviet Union in the continent's balance of power, to say nothing of their considerable control over the members of their alliance systems, marginalized the importance of the balance of power among the West European states themselves. It is worth remembering, too, that the elites of these countries were ideologically committed to internationalist anti-Communism far and above intra-European nationalistic grievances (in West Germany's case, even above German unity). At any rate, their defense expenditures and military activities were constrained by the damage to their economies during the war and the costs of rebuilding after, and, in the case of Britain, France, the Netherlands and Belgium, by their efforts to prop up the remnants of their colonial empires - with even that much implausible without the United States footing the bill (the U.S. paid eighty percent of the cost of France's failed war to retain Indochina) and impossible in the face of U.S. disapproval (as demonstrated during the 1956 Suez War). For all the blood shed over them, those empires vanished quickly (non-Europe clearly decided to be non-European, to borrow Asimov's phrase), eliminating one of the principal objects of contention among the European powers of the last four centuries. Meanwhile the division of Germany sharply reduced its margin of demographic and economic superiority over France and Britain, and, along with their acquisition of nuclear weaponry, guaranteed its inability to appreciably threaten them - even without the considerable, immediate restraints imposed by the superpowers.
At the same time welfare state capitalism (prompted, in part, by fears of the radicalization of Western electorates in its absence) brought rapid growth and distributed its fruits widely enough to produce generalized and rising affluence, dampening prickly nationalisms and contributing greatly to domestic stability. Along with the war-weariness induced by the conflict (which came on top of the war-weariness that followed in the wake of World War I), reinforced by the tastes of conflict later age cohorts got in colonial campaigns and Cold War missions from Londonderry to Algiers to the East Indies, and by war scares in which the Bomb seemed dangerously close to coming into play, this greatly diminished the appeal of belligerent grandstanding over territorial disagreements or perceived insults to the national honor. What aggressive nationalism remained tended to be directed internally, against the post-war influx of immigrants (exclusion, rather than expansion, its orientation).
These changes, of course, survived the economic downturn of the 1970s, and the end of the Cold War. Certainly Germany's reunification caused anxieties in certain quarters, but the melodrama that surrounded the event - like Margaret Thatcher's absurd request that Mikhail Gorbachev "halt" the process, or the conversations between Francois Mitterrand's aide Jacques Attali and Anatoly Chernyaev regarding the revival of the '30s-era Franco-Soviet alliance - merely testified to lagging perceptions among the generation that lived through the war (and in some cases, one suspects, an even more profound disconnect from reality), demonstrated most clearly by the level of abstraction in the discussions, the scarcity of anything like concrete scenarios. (No one really expected some renewed German assault on Alsace-Lorraine or Operation Barbarossa II, for instance, or articulated anything comparable that they did regard as more plausible.)
Consequently, rather than making peace possible, it would seem that the EU was made possible by the roles of the U.S. and Soviet Union and the bigger Cold War that hung over the old conflicts; by the end of Europe's empires; by the arrival of the nuclear age and the limitations it imposed on old-style war-making; by the comforts of welfare capitalism; by the weariness which followed the hugely destructive wars of the earlier era, the smaller wars of the post-1945 period, and the frightening prospect of a war even more destructive than all those seen in the past - with almost all of these developments significantly originating in or prompted by events outside Europe.
Of course, that raises the question of what the European Union did with the opportunity afforded it by all the factors discussed above. The Union did achieve a unity of sorts - but its achievements have actually been least impressive in the realm of foreign policy and defense, which remains very much the purview of separate nation-states (and what those states actually did figured prominently in criticisms of the award from both the right and left). Integration has been much deeper in the continent's economic life, but the contribution this actually made to peace is wide open to question, by those who question the claims made for international trade as a pacifier, those who are doubtful about free trade as an engine of prosperity, and those who see the Union's combination of expansion and neoliberal prescriptions (the consequences of which are quite apparent on European streets from Athens to Lisbon) as having been quite inimical to the cause the prize is supposed to celebrate and advance.1
1. I am reminded at this point of a different, more recent science fiction tale, Robert J. Sawyer's novel Flashforward (the basis for the short-lived TV series). In the book businessman Mr. Cheung, after developing a technique to make human beings immortal, decides to offer the treatment to Nobel Prize winners, peace prize winners included, as "by and large the selections are deserving" of the award. As the critics of this prize demonstrate, this is not a view with which all would agree.
Wednesday, October 17, 2012
Review: The Affluent Society, by John Kenneth Galbraith
Boston: Houghton Mifflin, 1958, pp. 368.
John Kenneth Galbraith's starting point in The Affluent Society is the observation that historically the great economic problem was that of "producing enough," a concern enshrined in the mainstream of economic thought (by way of Smith, Malthus, Ricardo, Mill and Marshall, and Marx, Veblen and Keynes for that matter), and in the Social Darwinism that found such a warm reception in the United States (where Herbert Spencer became hugely influential). Accordingly, the stress of economic thought was on maximizing the output of the essentials of life--food, clothing, shelter, fuel--through the most efficient possible use of the available resources. As it happened, the items in question lent themselves toward private, individual production and consumption, while proponents of this ideal venerated the free operation of a competitive market as a device for forcing capitalists and employers to make the utmost effort in getting production up and prices down. The result was, not incidentally, a tendency to value the private sector, and denigrate the public sector (at least, where goods besides physical security are concerned), a pattern which extended to the view of increased private consumption as a gain to prosperity, and increased public consumption as a reduction in it. Going alongside this was the attitude that society had little choice but to tolerate poverty, given the gap between what was produced and what was needed.1
However, Galbraith contends that by the twentieth century the advanced industrial countries of the Western world had largely solved the production problem, the central issue no longer securing an adequate supply of life's essentials--society now "affluent" in that respect. In making his case Galbraith points to the many ways in which these societies had deeply changed since the Industrial Revolution. The truly poor (those who do not get enough to eat, etc.) had been transformed from the vast majority into a minority. An increased share of production was devoted to satisfying not indisputable physical needs (to a great extent, satiated), but the artificial wants manufactured by consumer culture's unprecedented use of techniques like planned obsolescence and advertising.
This was all as people worked less, the 70-hour week giving way to the 40-hour week, while child labor became a thing of the past, and the retirement of the elderly became routine (all of these, indicators of the reduced urgency of the demand for labor and its longtime principal products). Additionally, rather than the failure of supply, it was the failure of demand, as seen in the Great Depression, that came to be construed as the great threat to prosperity, while the continued stress on the maximization of production was motivated less by the desirability of more goods than by production's role in maintaining full employment. Galbraith even contended that the expansion of production, and with it, general well-being, had gone a long way toward diminishing the political charge of the issue of inequality.
As a result, in place of the traditional problems of poverty there were now the side effects of an economy of private affluence. The maintenance of full employment as the norm meant inflationary pressure, while consumer culture brought with it a tendency toward oppressive and unsustainable debt accumulation. At the same time the advent of such private prosperity meant an imbalance with the comparative poverty of the public sector and the public goods it supplied--educational, health and policing services, urban planning, the protection of the natural environment--raising new issues, not the least of them the ways in which deficiencies here undermine the private sector's operation. The remediable failures of the education system, for instance, harm scientific R & D efforts. Likewise, the failures of city planning have an adverse effect on the quantity and quality of the housing stock available to home buyers. And so on.
The solution of the old problems, and their replacement by new and unsolved ones (problems addressable by making the public sector as affluent as the private sector had become, to the benefit of both), went largely unrecognized because of the tendency of the "conventional wisdom" (a term Galbraith coined in this book) about the issue to lag behind circumstance.2 Nonetheless, Galbraith was confident that change was possible, even likely, and certainly his argument did find an audience at the time. The idea that there could be economic life beyond flat-out GDP maximization in fact became a commonplace in social science, social comment and futurological speculation during the years that followed, William Appleman Williams, Gunther Stent and Alvin Toffler, among many, many others, all featuring it prominently in their work. Since then, ideas such as "sustainable" growth have gained currency in certain circles, as have broader measures of wealth, like the Genuine Progress Indicator.
However, it seems that we are further than ever from embracing the thinking Galbraith presented, American society regarding itself as far from affluent in the way he suggests (even though per-capita GDP has more than doubled since then). The prioritization of sheer GDP growth as the sole test of economic performance, the identification of the private sector with goods and the public sector with costs, the Victorian severity toward the poor, all remain the predominant assumptions over a half century later.
One has to wonder why, given that there seems a great deal of merit to his argument. Certainly one part of the explanation seems to be Galbraith's underestimation of the extent to which the "good life" would continue to be perceived in terms of more private consumption, and on a related note, the intensely private way in which Americans think about their problems, individual and collective (to which C. Wright Mills, stronger on this score, often pointed). Galbraith also had a tendency to underestimate the power of vested interests to affect public policy, and how the shift in the terms of economic debate in the neoliberal revolution (made possible by a combination of economic crisis in the early 1970s and a politics of cultural backlash) would permit new circumstances to reinforce the conventional wisdom he criticized, leaving the mainstream less open to newer thinking.
At least in part because of neoliberal policies like the lowering of trade barriers, financial deregulation, and the weakening of organized labor and worker protections, the last four decades have been marked by much lower rates of economic growth, as well as intensified international economic competition, recurrent financial crises, higher unemployment, and greater inequality. Americans may have remained privately "affluent"--but the vast majority did not get much more affluent by even the most conventional measures, and they felt less secure in what they did have. If anything, given that their personal budgets were often pressed by inflation sufficient to wipe out the additions to purchasing power represented by rises in their income, the increasingly expensive post-secondary education increasingly seen as a requirement for remunerative employment, the exploding costs of health care, and the mounting private debt loads to which all these added, they had reason to feel less affluent than before.
Meanwhile, the reality of public poverty grew only more pronounced under the same circumstances, as slower growth rates, rising dependency ratios (putting pressure on benefits for the retired or elderly as they had been traditionally organized), and the public portion of the growing tab for educational and health care spending (and the shift away from progressive taxation even as the distribution of wealth became more unequal) squeezed government budgets, making bigger deficits and mounting government debt the long-term trend. And the neoliberal context made all the difference as to how this was perceived.
It is conceivable that under other circumstances the redress of the public sector's problems might have been seen as part of a broader project of reviving overall prosperity. The growing numbers of retirees might have been viewed as relieving the pressure of job creation and inflation, for instance. Expanded public programs in the areas of education and health care might have been seen as a way of relieving those burdens on the individual consumer. And so on and so forth. However, it became the conventional wisdom that the thing to do was not to improve public services, but to turn them into for-profit private services (a drive given greater urgency by the construing of those larger numbers of retirees as a fiscal disaster in the making). And while, despite all the cutbacks, the proportion of government budgets devoted to transfer payments cannot be said to have shrunk, there is no doubt that the response of individuals was increasingly to substitute private goods for public ones; to respond to the inadequacies of the public education system by paying more out of pocket to send one's children to private school, for instance.3
This transfer of formerly public burdens to the private individual did that much more to make what could have appeared like private affluence instead appear as private scarcity, and make the prioritization of expanded private consumption (especially with the idea of expanding public consumption delegitimized) appear the solution--with old-fashioned GDP growth the means.4 And of course, the broader social inequality attending the process by which the lives of wage and salary-earners became so strained meant that much more need for a convenient solvent for the resulting tensions, the obvious candidate for which was, again, GDP growth (rhetorically, at any rate).
In retrospect it seems to have been Alain Touraine who, far more than any counterpart of his in the English-speaking world (Galbraith included), was prescient in contending that post-industrial society would be the most growth-driven and GDP-obsessed in history, with the post-2008 economic shock doing surprisingly little to shake up the mainstream's embrace of neoliberalism. Nonetheless, the case for moving beyond the "conventional wisdom" Galbraith described, for thinking of public as well as private affluence, and for looking beyond the narrow definition of growth so dear to economic orthodoxy, has only grown stronger with the passage of the decades.
1. One irony of this view, not fully recognized in Galbraith's writing, is the role of public support in creating private prosperity even in the eras of Smith and Ricardo and Mill, much of the public preferring to stick with simplistic Horatio Alger and Edisonade images of how fortunes are made, corporate giants born and Third World poverty turned into industrialized affluence. It remains an irony of American political life that those who are most inclined to evoke the Founding Fathers are those most inclined to neglect the lessons of Alexander Hamilton's "Report on Manufactures"; those most inclined to speak of American economic exceptionalism, those most inclined to forget that there was such a thing as an "American School" of economic thought, and the role it played in building the American economy.
2. Among Galbraith's specific, practical suggestions was a delinking of work and income through a more generous, but also more carefully managed, program of unemployment insurance so that less-than-full-employment became an acceptable norm politically, relieving inflationary pressure. He also suggested a sales tax on private consumption for the funding of improved public services. More broadly, he argued for the elimination of toil (unpleasant work performed wholly for pay as it renders no other satisfaction), making all workers part of what he called the "New Class" (in which people have not just jobs, but professions and careers, accessible through education).
3. An even more dramatic example may be the response to the problems of American cities--rather than making them more habitable through sounder urban planning, improved public transport, the redress of poverty and the like, the typical course has been relocation to suburbs and exurbs, which bring associated expenses like greater car ownership, longer commutes, and the private security bills of gated communities.
4. Under the circumstances, such an idea as using a sales tax to divert dollars spent on private consumption into the funding of better public services--as Galbraith proposes--could only have been anathema.
Tuesday, October 16, 2012
Review: The New Industrial State, by John Kenneth Galbraith
Boston: Houghton Mifflin, 1967, pp. 427.
The conventional view of the American economy is that it is an arena where market forces predominate so that no single buyer or seller exercises significant control over prices; individuals and small business are the principal sources of innovation; and enterprises fight to keep government and organized labor at bay as they pursue that crucial form of feedback, profit, won through their success or failure in the competition to cater best to the wants of the consumer.
However, in his 1967 classic The New Industrial State John Galbraith made the case for quite a different reality. The American economy, Galbraith held, was thoroughly planned--if in a somewhat more diffuse way than contemporaneous Soviet-style economies. Rather than some government bureau attempting to direct the whole, the function was spread among a small number of key actors, in particular the largest industrial corporations. This was because only the large corporation could raise the capital and organize the skills needed to design and produce the highly complex products characteristic of mid-twentieth century technology.1 And even for them, such investments were only viable when the market and its uncertainties were tamed--not least, the tastes of consumers.
Such corporations pursued the reduction of uncertainty at a number of levels, from that of the individual firm (through such methods as vertical integration, or the use of the market power that comes with large size to influence the prices at which they buy their inputs and sell their production), to that of the usually oligopolistic industries of which they were a part (through the avoidance of disruptive competition among firms), to the level of the whole economy through their influence on and cooperation with government (which pays for much R & D, and regulates demand with high public outlays, progressive taxation, and wage and price controls) and their dealings with organized labor (negotiations with which stabilize wage levels, and often bring it into line as an ally when pursuing government contracts). These corporations also made the private consumer a more predictable actor, not only through research of the consumer in advance of the development of the products to be marketed to them, but the advertising molding his or her tastes.2
The corporations, in turn, were dominated by their "technostructures," the assemblages of technicians that made such companies functional, who had to be seen in this collective way because the planning process indispensable to their operations was far too vast and complex to be controlled by a single individual (given the volume and variety of information that had to be collected and processed, and decisions that had to be made). One result was the diminished power of the Chief Executive Officer and other senior management, often reduced to ratifying the decisions of comparatively obscure experts within their firms.3 In their turn, managers (typically salaried employees rather than owner-founders or their heirs) had seen their power grow relative to that of shareholders, while shareholders were even further removed from being able to usefully observe and understand the intricate internal workings of these companies, their prospects for exercising detailed control over company operations further diminished by the wide diffusion of stock ownership. The result was that, so long as a company continued to deliver an "acceptable" level of profit, the technostructure enjoyed the degree of autonomy without which its elaborate planning was impossible.
The culture of such "mature" corporations naturally differed from that of such enterprises in their more formative phases, or the businesses of earlier eras altogether. The businessmen of the classic, nineteenth century mold (the Henry Ford type, for example), thoroughly "individualistic," and resolutely anti-intellectual, anti-statist and anti-union in sensibility and policy, were a poor fit with the model of enterprise Galbraith described in this book--and prone to get into trouble when trying to run a mature industrial company (as Ford did in Galbraith's account of his career in his earlier The Liberal Hour).
By contrast the "new" CEOs were more comfortable in an organization, more willing to give expertise some respect, and more pragmatic in their dealings with government and labor, permitting the smoother operation of their businesses and the economy as a whole. They were not unconcerned with profit, but maximizing it was not their sole or even primary object--in part because they were salaried personnel whose own income was less closely connected to their company's short-term fortunes, and in part because other motives had come to the fore, in particular "identification" with the company that gave them their privileged positions, and the satisfaction afforded by the "adaptation" of the company in line with their own, particular ideas about its mission.
Of course, identification and adaptation were not such powerful motives for less senior personnel. However, line workers too were more secure and better compensated than before, living as they did in a time of high employment and rising living standards (affording them much more than life's basics), while they were increasingly trading their blue collars for white ones and the less grinding working conditions that went with them, enabling those factors to influence them in a way they could not have before. The result was to weaken the sense of the employer (at least, in mature companies with a pragmatic attitude toward labor) as an enemy, helping to make the relationship between employee and employer less fraught with insecurity and confrontation, and the softer line of new-style, "enlightened" management toward organized labor that much more viable.
Galbraith's discussions of identification and adaptation assumed company as well as personal goals beyond the merely pecuniary, and he naturally discussed this matter at length, writing of the "social goals" which are the formal missions of major industrial firms--ends pursued in their own right (if profitably), rather than mere incidental by-products of profit-making. Lest the utopian-sounding rhetoric of privately owned, profit-driven corporations pursuing "social goals" cause confusion, what Galbraith referred to by the term are rising production and consumption, technological advance, and the agendas associated with them: namely the growth of the Gross Domestic Product, and the waging of the Cold War.
This model of the economy's "commanding heights" laid out, Galbraith then proceeded to consider the system's weaknesses, offering an extensive critique of the limits of the goals pursued by the country's technostructures. As he noted, the manner in which government supported R & D and sustained demand was highly militarized, a fact which helped lock in Cold War tensions, and all their dangers (the most extreme of which was major nuclear war). Additionally, values and interests out of line with the accepted goals--like leisure, ecology, aesthetics--were typically denigrated and marginalized. This raised the question of who would promote peace rather than war, protect the environment, or defend intellectual and cultural values, and other such essential goods. Galbraith believed the answer lay in the emergence of what he termed the "educational and scientific estate." The dependence of the technostructure on this "estate" for expert knowledge, technological advance, and the training of the work force, made its members both more numerous, and more powerful, and made them a potential champion of those needs and values so poorly served by the rest of the system.
Taken together these ideas comprise a satisfactorily comprehensive analysis of the heart of the American economy at the time of the book's writing, as well as certain of its key problems and potential palliatives, one which remains relevant today in many ways. What Galbraith's book has to say about how large, high-tech enterprises work is just as true today as it was then--if not more so. The indispensability of the technostructure to the major industrial enterprise, the marginalization of the individual entrepreneur and small business in the economy (and especially in high-tech manufacturing), the fact of oligopoly, the reality of close collaboration between big business and government, are irrefutable facts of twenty-first century life. The long-term decline of organized labor Galbraith recognized is an equally irrefutable fact of American economic history in the past half century.
Yet, there is also no disputing that the "ideology" of the older "entrepreneurial" mentality Galbraith identified with older-style, profit-fixated businessmen resurged in the 1970s with the ascendance of neoliberal economic thought. Of course, for all the rhetoric, small enterprises did not reverse a two century-long trend toward Big Business or Big Government; if anything, the movement in this direction remained as strong as ever. Nonetheless, the associated attitudes profoundly changed the milieu in which the technostructures operate--generally in ways unconducive to their functioning. This has most obviously been the case at the macroeconomic level, as deregulation and loose money policies, as well as the abandonment of full employment as a policy goal, intensified some forms of economic competition and made demand less reliable (by holding down wages, encouraging the dependence of consumption on unsustainable borrowing, increasing the incidence of financial crises, etc.). However, it has also been the case in the ways in which individual corporations have been run, the new attitudes contributing to the domination of the industries of which Galbraith wrote by a speculation-minded financial sector intent on short-run profit and share price maximization (reinforced by the linking of executive compensation to these). This has led to a preoccupation with cost-cutting (often targeting areas like R & D) directed at making quarterly earnings statements appear more attractive, at the expense of long-range company development; an increasing investment of company resources in mergers-and-acquisitions games as companies pursue or fend off takeover attempts, at the expense of other imperatives; and the neglect of a company's core business (i.e. its "social goal") in favor of more profitable speculative finance (as General Motors did).
The economic performance of the advanced industrial nations during this time period, unsurprisingly, has been characterized by technological stagnation (evident in just about every area but IT) and falling growth rates (which compare very unfavorably with the post-World War II period), while the failure of corporate giants has long ceased to be an unheard-of event (as the events of 2008 and after have reminded us all).4 Consequently, even though Galbraith did not anticipate this turn, the economic history of these decades lends a great deal of support to his theory about how high-tech, capital-intensive firms, and the economies founded on them, necessarily operate. Indeed, taken along with the rather weak performance of the scientific and educational estate in the role of "loyal opposition," one can conclude that the principal failure of Galbraith (like many another mid-century American liberal) was his overestimation of the degree to which American political culture would adapt to the economic and technological realities that had become so apparent by mid-century.5
1. Galbraith's ideas regarding the decline of competition, and the emergence of oligopoly and its relationship to technological innovation, were previously (and more fully) elaborated in his earlier American Capitalism: The Concept of Countervailing Power (1952).
2. As their income enables them to go further and further beyond the point of meeting their most basic physical needs (food, shelter, warmth), consumers enjoy an increasing range of choice in their use of their purchasing power--another uncertainty business tries to manage. The result is that instead of being catered to by business, business strives to determine their wants for them. This idea was previously discussed in American Capitalism, and received even fuller treatment in 1958's The Affluent Society (reviewed here).
3. Galbraith remarks that "financial markets have long since accepted the reality of the technostructure as distinct from the entrepreneur," and goes on to jokingly paint an image of how the financial world would hang on "anything affecting his tenure in office," fluctuations in his health major news, and his replacement "handicapped like a horse." Yet, is such fuss not a routine matter where celebrity CEOs like Steve Jobs have been concerned--in part because it remains the norm to identify the achievements of a whole company with its chief?
4. There are those who would point to information technology as a counterexample to Galbraith's analysis of the place of the individual entrepreneur and the start-up in the '70s and after. Yet, the actual history of computing as we know it is dominated by state-subsidized R & D (it was the Defense Department which produced the ARPANet, CERN which gave us the World Wide Web) and the research of established firms like IBM (the hard disk drive, DRAM), Hewlett-Packard (the first to market a "personal computer") and Xerox (the graphical user interface). Additionally, the success of newer firms meant that they quickly fell into line with the model of innovation described here, without which Apple could not have delivered the iPad, and Google would not be experimenting with driverless cars--the phase of old-fashioned entrepreneurship on closer inspection smaller and shorter-lived than the hype would have it. It is also worth keeping in mind that, even to the extent that it can be construed as exceptional, the sector has received disproportionate attention because of the false impressions created by the inflated share-prices of IT companies (as with Apple and Google right now), the frequent exaggeration of the economic impact of personal computing (the contribution to productivity has been hard to find), and its convenience as ideological fodder (proponents of orthodox economic thinking and purveyors of techno-hype desperately seizing on the myth in their search for some validation of their ideas).
5. Chris Hedges offers a more recent and quite different take on that estate's influence and vitality in Death of the Liberal Class.
Friday, October 12, 2012
Toward a Post-Fossil Fuel Era: A Note
In 1998 oil prices dropped to their post-energy crisis low of $10 a barrel, and soon enough began their upward climb, especially apparent after 2003. In the half-decade that followed, the price of oil increased by a factor of five from the twenty-year high it had reached by 2003, touching $150 a barrel in July 2008 before dropping again. However, it never got anywhere near its 1998 price, generally staying well above even its 2003 price, despite the downward pressure that the worst economic crisis since the Great Depression of the 1930s placed on demand. (The current figure is $92 a barrel, more than six times the 1998 price in real terms, and more than twice the 2003 price.)
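As a rough check on those real-terms multiples, here is a minimal sketch deflating with approximate US CPI-U annual averages. The index values, and the ~$30 figure for 2003 (inferred from the fivefold rise to roughly $150), are my own assumptions for illustration, not figures from this post.

```python
# Rough real-terms comparison of oil prices. The CPI values below are
# approximate US CPI-U annual averages (1982-84 = 100), used only for
# illustration.
CPI = {1998: 163.0, 2003: 184.0, 2012: 229.6}

# Nominal prices in dollars per barrel; the 2003 figure is inferred from the
# "factor of five" rise to roughly $150 by mid-2008.
NOMINAL = {1998: 10.0, 2003: 30.0, 2012: 92.0}

def in_2012_dollars(price: float, year: int) -> float:
    """Convert a nominal price from the given year into approximate 2012 dollars."""
    return price * CPI[2012] / CPI[year]

if __name__ == "__main__":
    for year in (1998, 2003):
        real = in_2012_dollars(NOMINAL[year], year)
        ratio = NOMINAL[2012] / real
        print(f"{year}: ${NOMINAL[year]:.0f} nominal is about ${real:.0f} "
              f"in 2012 dollars; $92 is roughly {ratio:.1f}x that")
```

Run as written, this yields roughly 6.5 for 1998 and 2.5 for 2003, consistent with the "more than six times" and "more than twice" figures above.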
Along with the mounting evidence of climate change (receding Arctic ice and glaciers, etc.), other forms of ecological catastrophe (the Deepwater Horizon explosion, etc.), and the heating of international conflict zones by the geopolitics of energy (the U.S.'s relationship with the Middle East, Moscow's relations with its provinces and neighbors in the Caucasus, the ongoing political theater surrounding the Diaoyu/Senkaku/Tiaoyutai Islands, etc.), that price rise reinforced the lesson made clear in the 1970s, but generally unheeded in the 1980s and 1990s: the necessity of shifting the world's energy base away from finite, depleted and dirty fossil fuels. Alas, dismayingly little was done about the matter in the 2000s.
Germany's case merits examination, as it has been demonstrably more ambitious in this area than the other major industrial economies.1 As the situation stands it gets about 25 percent of its electricity from renewables, a number all the more impressive because only a fifth of that comes from long-established hydroelectric power - elsewhere, by far the predominant source of non-fossil fuel, non-nuclear energy (as is certainly the case in China and the U.S.). Germany's recent expansion of its photovoltaic-based electricity production (accounting for 5 percent of German electricity now) has been especially striking, the total output of its solar energy installations exceeding that of the whole rest of Europe, despite the country's northerly location. One result is that, unlike most other countries (like the United States and Britain), the drop in the carbon intensity of the German economy has exceeded the drop in its energy intensity.2
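The relationship between the two measures can be made explicit: carbon intensity (CO2 per unit of GDP) is energy intensity (energy per unit of GDP) multiplied by the carbon content of energy (CO2 per unit of energy), so a carbon-intensity drop that outpaces the energy-intensity drop implies a cleaner energy mix. The sketch below works this out from the figures quoted in note 2; the decomposition itself is standard, but the implied percentages are my own back-of-the-envelope arithmetic rather than anything taken directly from the EIA data cited there.

```python
# Carbon intensity (CO2/GDP) = energy intensity (energy/GDP) x carbon content
# of energy (CO2/energy). Given the 1996-2009 percentage drops quoted in note 2,
# the implied change in each country's carbon content of energy is
# (1 - carbon_drop) / (1 - energy_drop).
DROPS = {  # country: (energy-intensity drop, carbon-intensity drop)
    "Germany": (0.19, 0.26),
    "U.S.":    (0.26, 0.28),
    "Japan":   (0.08, 0.07),
    "France":  (0.19, 0.20),
    "Britain": (0.32, 0.32),
    "Italy":   (0.06, 0.13),
}

for country, (energy_drop, carbon_drop) in DROPS.items():
    change = (1 - carbon_drop) / (1 - energy_drop) - 1
    print(f"{country}: implied change in CO2 per unit of energy: {change:+.1%}")
```

On these figures Germany's energy mix became roughly 8-9 percent less carbon-intensive over the period and Italy's about 7 percent, with the other countries listed showing changes of 3 percent or less - which is the sense in which Germany (and Italy) stand apart.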
Nonetheless, the picture is not untroubled. Germany's electricity production is particularly coal-reliant, with that reliance actually increased in the wake of the shutdown of several nuclear reactors - and being locked in for decades by the construction of new coal-fired plants. Additionally, Germany's development of a "smart grid" has not kept pace with the expansion of renewable energy with its more variable output, causing issues with grid reliability which media opponents of "green" initiatives like Der Spiegel are using to bolster specious arguments against renewable energy generally. (Simply put, they play up the headaches, which they treat not as technical teething issues in many cases resolvable today, but as somehow unavoidable problems of these forms of energy production, while totally ignoring the problems raised by the fossil fuels and nuclear power plants they replace - making this a common approach with climate change "skeptics."3)
What emerges is a picture of progress which is slower and less comprehensive than it should be in even the best cases, while most other countries are doing far less than that - and there seems plenty of reason to worry that advances in this area might stagnate, or even be reversed. The European Union is struggling with an economic crisis, and responding to it with austerity, which bodes poorly for major infrastructural programs. Recent hype over Canadian tar sands and the prospects for the extraction of massive amounts of oil and gas from shale is creating the illusion that the market has eliminated the problem of supply and, in the case of the United States, secured energy independence, while the denial of the ecological side of the issue remains a major force in American politics, with the political right increasingly uncompromising on the issue. Japan's abrupt shift away from nuclear power has, at least in the short term, meant more fossil fuel use to keep the lights on, which may give the fossil fuel lobby a wedge to increase its presence in that market on a longer-term basis. Meanwhile, energy in fast-growing China and India remains tied to traditional sources, with the output of their installations of wind and solar dwarfed by their growing overall consumption.
In short, even while the feasibility of, and need for, a much more comprehensive and sustained shift away from fossil fuels and Generation III nuclear power to alternatives (with renewables the most attractive and feasible of these at the moment) are increasingly difficult to dispute, political realities continue to make it highly uncertain that even the wealthiest and most developed nations will travel this road swiftly, safely and successfully.
1. Where the movement past oil specifically is concerned, however, it is Japan that has been the leader, having reduced its oil consumption by an impressive 22 percent between its recent peak in 1996 and 2011. By contrast, Germany achieved a 14-15 percent drop after the peaking of its consumption in 1998. On the other hand, Germany extracts more GDP per unit of oil consumed - consuming one barrel for every $3500 of GDP, compared with $2800 for Japan (and under $2200 for the U.S.) when Purchasing Power Parity is taken into account. (At market exchange rates, however, Japan is competitive with Germany, making this an issue of which indicator one regards as more appropriate.) A short sketch of how this metric is computed appears after these notes.
2. According to EIA statistics, the energy intensity of the German economy dropped 19 percent between 1996 and 2009, while its carbon intensity fell 26 percent. The comparable figures for the U.S. are 26 and 28 percent respectively, 8 and 7 percent for Japan, 19 and 20 for France, and 32 and 32 for Britain. Only Italy, which has followed something closer to the German path, shows a similar pattern of carbon intensity falling faster than energy intensity (its figures are 6 and 13 percent, respectively).
3. Similarly, there is a tendency to point to subsidies for "green" energy and completely ignore the long history of massive subsidy for the fossil fuel and nuclear sectors (described by John Kenneth Galbraith as "fantastic favoritism") while complaining about market distortions.
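As for the "GDP per barrel" comparison in note 1, the metric is simply annual GDP divided by annual oil consumption. A minimal sketch, again in Python, using rough round figures of my own choosing for circa-2011 Germany rather than anything drawn from the sources above:

# Illustrative only: the GDP and consumption inputs below are rough round numbers
# of my own choosing (circa 2011), not values taken from the post or its sources.
def gdp_per_barrel(gdp_usd, barrels_per_day):
    """Dollars of GDP produced per barrel of oil consumed over a year."""
    return gdp_usd / (barrels_per_day * 365)

# e.g. a PPP GDP of roughly $3.1 trillion and oil consumption of roughly
# 2.4 million barrels per day lands in the neighborhood of the ~$3500 figure in note 1.
print(f"~${gdp_per_barrel(3.1e12, 2.4e6):,.0f} of GDP per barrel")

Swapping in market-exchange-rate GDP instead of PPP GDP is what shifts the Germany-Japan comparison, which is the indicator question raised in note 1.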
Thursday, June 21, 2012
Was War Impossible? Remembering Ivan Bloch
In 1898 Ivan Bloch published a study titled The Future of War in its Technical, Economic and Political Relations – widely known in English by the title of an abridged version, Is War Now Impossible?
In that study Bloch considered the evolution of military forces in his time. He paid particular attention to recent advances in weaponry (especially the skyrocketing increases in the killing power of small arms and artillery in that period, and the advent of smokeless powder), advances in the techniques of fortification, and the size of modern armies – as well as the problems all these factors raised for command and control, logistics and the care of the wounded. He concluded that these factors would combine to make war a matter of prolonged, large-scale and extremely costly sieges, shifting the advantage from the offensive to the defensive. A similar explosion in the technological sophistication and firepower of warships was simultaneously underway at sea, though naval power seemed equally unlikely to prove a decisive instrument in the hands of any of the actors he examined, given the geopolitics of the continental states. (Britain, as an island nation uniquely dependent on far-flung colonies and trade, was the only major European state really reliant on its navy, with the others more or less wasting their money – Russia most of all.)
At the same time, he looked at the vulnerability of urbanized, industrialized societies to the disruptions caused by the mobilization of a national economy (like the conscription of much of the labor force), and the cut-off of international trade which the outbreak of hostilities would involve; and to the psychological strain such warfare seemed bound to entail, especially over the course of prolonged fighting, both for soldiers in the field and civilians on the home front. In combination with the rough parity in military power between the two alliances prevailing on the continent (specifically the Triple Alliance of Germany, Austria and Italy, and the Franco-Russian entente), all of this guaranteed that any war would be a long contest of attrition, bound to push economies and populations to the breaking point, with the belligerents driving their societies to the point of collapse (victors as well as losers), and the most likely outcome socialist revolution.
Consequently, war, in the sense of general warfare among the great powers, had become impossible as a rational instrument of national policy. However, Bloch appreciated that the fact would not necessarily stop these nations from fighting exactly this kind of war, and that they might in fact experience the conflict before learning their lesson.
Of course, it was two decades before the war he wrote of actually came, and much did change in that time – with many of the changes rendering the fighting even deadlier. In his book Bloch had been attentive to improvements in rifles and artillery; he did not think of the machine gun or chemical weapons. He only dimly foresaw the advent of submarine warfare, and did not foresee the aircraft or the armored tank (fielded only during the conflict) at all. Despite these developments, the belligerents endured the strain longer than he anticipated – but what he predicted did come to pass in its essentials. On the Western Front, the fighting was indeed characterized by trench warfare of the kind he described, and the war's end saw the collapse of Germany, Austria and Russia, all of which saw socialist revolutions on their soil (with the Bolsheviks actually emerging triumphant in Russia).1
The western allies would seem to have suffered less than he suggested they would – but this can be chalked up to a profound shift in the pattern of alliances, with Britain joining the Franco-Russian entente a few years later, Italy following it in 1915, and the United States doing the same in 1917. Nonetheless, after the war Italy suffered a period of upheaval that ended with a fascist takeover – a revolution of its own, albeit from the right. France saw a round of dramatic strikes, and while little came of these, the country was left politically exhausted and divided, and remained so in the decades that followed. Britain, the European belligerent most sheltered from the war, suspended the gold standard, and accumulated a massive debt, while facing turmoil across its empire by war's end – including the rebellion that made Ireland free a few short years later, and labor unrest in England itself. Even the United States, despite its late entry, massive resources and comparative insulation from the fighting, proved not to be immune to the effects of the war on the world's economic system, the accumulation and mismanagement of international debts contributing to a worldwide Great Depression in the 1930s in which it was particularly hard hit.
All of this, of course, helped lead to the outbreak of World War II – that second taste of modern warfare he speculated the great powers might indulge in before recognizing the enterprise's futility. Granted, it was not a simple repeat of World War I: the conflict is remembered today for armored offensives and strategic bombing rather than the static style of warfare that prevailed on the Western Front a generation earlier, but it was also an even bloodier, more destructive fight than that of 1914-1918. In fact, that destructiveness was such as to spell the end of the European states on which he'd focused as first-rank international actors, excepting the Soviet Union, which along with the United States (again, protected from the war's effects by its late entry, its distance from the fighting, and its sheer size and wealth) dominated the continent at its end. Western Europe rebuilt with American aid provided on a scale that had seemed unthinkable in the aftermath of the First World War, and its colonial empires dissolved in the following decades, while virtually every one of the European participants saw a new political order established in its territory, allies included – the French Fourth Republic, and in a milder form, "Labour" Britain, as well as post-war Germany (divided between East and West) and Italy. (By contrast, the Soviet Union, which suffered massive human and material losses, is thought by some to have never quite recovered from the war.)
It is notable, too, that Bloch's view that war had "become impossible," which had its adherents in the pre-war period (like H.G. Wells) but had little actual impact on practitioners, only continued to gain credence as the century progressed. In the interwar period, after the advent of strategic bombing and chemical weapons, it was not uncommon to view armed forces as instruments capable of putting an end to the modern world. The advent of the ballistic missile and the nuclear bomb made this outlook the conventional wisdom after 1945, so much so that while the American- and Soviet-led blocs competed globally and militarily, and confronted each other in numerous crises, neither resorted to a direct, open clash of arms with its principal opponent (even as thinkers on both sides continued to theorize and fantasize about how nuclear war might be made winnable).
Consequently, even as the armies grew, the alliance systems expanded and the technology evolved far beyond what he anticipated, the twentieth century bore out his essential predictions about how devastating general war had become, and what its consequences would be – enough, in fact, to show up the tiresome smugness of those who dismiss such predictive efforts out of hand. So far, the twenty-first century has done the same. Just last year I wrote that "the relations of the major powers are less defined by concerns about traditional, state-centered threats than at any time since the nineteenth century, if not earlier." That still seems to me an accurate assessment of the situation. The reality of nuclear weaponry and its associated delivery systems continues to make general war too destructive to be a practical option, among not only the great powers, but a circle of states expanding beyond them as well (as seen in areas like the Middle East and South Asia).
However, the possibility of a reversion to more intense military competition remains. Very large question marks still hang over the international system, and especially the three actors far and away most likely to be involved in a clash between great powers – China, Russia, and the United States. (The South China Sea, for instance, presents a more worrisome picture today than it did a few years ago.) Yet, it might reasonably be hoped that we will manage to avoid the stupidity and waste of such a course, which this increasingly crowded, interconnected and precarious planetary civilization can less and less afford.
1. Notably Russia was the one country where he'd dismissed such a possibility – though in fairness, he'd been writing twenty years earlier, when the country was less developed.
Death of the Liberal Class, by Chris Hedges
New York: Nation Books, 2010, pp. 248.
Phrases like "liberal Establishment" have always struck me as oxymoronic. It is hard to see how anything can be both those things at once. Chris Hedges' latest book, Death of the Liberal Class, would seem to testify to the untenable position of those making the attempt.
As defined by Hedges, a liberal class (in contrast to the corporate-government-military "power elite") could be found holding positions in organized religion, the arts, universities, the media, unions and the Democratic Party.1 Of course, these institutions were never outside the reach of corporate and other conservative influences, the interests of which they usually did represent, but liberal views and voices were sufficiently present to constitute a force there.2
In its heyday this liberal class occupied the center of American politics, to the right of socialists and Communists, and to the left of the conservative establishment of the business-centered "power elite." It acted as a check on that elite's power, provided some representation for the disenfranchised (the poor and even the middle class), and in so doing made moderate, but meaningful, reforms possible. However, that class was ultimately coopted by the very power elite whose actions it had sought to mitigate, the party of the New Deal giving way to the party of Bill Clinton and Barack Obama, and the journalism of Progressive-era muckrakers to the crude, sadistic drivel of a Thomas Friedman. The result is that when liberals dare to criticize the status quo at all they limit themselves to only the most tepid kinds of critique, discussing tactics rather than goals or principles, and advocating mild reforms that have little meaning in the context of the "inverted totalitarianism" and "participatory fascism" Hedges identifies.3 Especially from the 1970s on this has meant the dismantling of every obstacle and restraint on corporate power, resulting in the juggernaut of neoliberal globalization, with all its destructive economic, social and ecological consequences – which, through climate change, may even threaten the survival of the species. Long reduced to courtiers of the power elite, the liberals – by this point, given to celebrating corporate power, militarized foreign policies and the like – can hardly do much about it, making them impotent, irrelevant and despised even by the politically weaker groups they were supposed to defend (whom they have failed miserably).
According to Hedges' history, while the liberal elite was always compromised by its embrace of the power elite, its "greatest sin" (p. 15) was its collusion with the right against the left – during and after World War I, then again after the left's revival amid the Great Depression, and once more during the Cold War (the role of these conflicts no coincidence, conditions of "permanent war" being inimical to the liberal class's balancing act). The corrupted remainder even joined in the attack on those within their own ranks who continued to adhere to liberalism's ostensible convictions. (Hedges profiles Richard Goldstone, Norman Finkelstein and Ralph Nader as current examples of jurists, scholars, journalists and activists betrayed in this manner – while also telling the story of how he was himself pushed off the pages of the New York Times, like many a principled liberal before him.)
Crushing the left resulted in the liberals ending up as the new left of center, eliminating their old role and dumping on them a new one they were incapable of filling. In the process the liberals also grew alienated, in a variety of ways, from the very working class that they were supposed to speak for – through the turn from economic issues to identity politics, and even a broader turn away from politics of all kinds (for instance, in the disengagement of the "beat" ethos in the '50s, and the preoccupation with psychoanalysis and mysticism in the '60s counterculture) – all of which worked out in ways quite conducive to corporate power.
Meanwhile, the very institutions the class inhabited were being dismantled. The membership of both the mainline churches and the labor unions declined. Colleges educate more students than ever, but the professors who teach in them have been transformed into insecure part-timers in no position to carry out research or perform broader intellectual functions, while the receding pool of tenured faculty (going the way of unionized steelworkers, to use his analogy) sticks to overspecialized study of the obscure and minute, and to theoretical debates inoffensive to those who hold genuine power. Where some have seen in the Internet a source of hope, for Hedges even it is a problem, the image-based culture of the new electronic media being far less conducive to rational, individual thought and debate than earlier print media, while further undermining artists, journalists and the like by making it impossible for them to earn a living, so that culture is turned over not merely to part-timers, but to "part-time amateurs."
As a result, not only has the left been neutralized, but the former center is moribund, leaving a vacuum in American politics which can only be filled from the political right, in the form of a right-wing populism with all its fascistic tendencies (already evident in movements like the Tea Party and the revival in militia activity), actually funded by the same corporate forces that brought about the crisis in the first place. Indeed, as he has done in earlier works (particularly 2007's American Fascists), Hedges draws a comparison between the United States today and Weimar Germany, characterizing the U.S. as now in greater danger than it was in the 1930s precisely because of the absence of the kind of countervailing left-center forces that existed then.
As far as Hedges is concerned, there can be no salvation from a rightist movement, only political regression, with the soft tactics of "inverted totalitarianism" perhaps being supplanted by the more overt ones of the classic kind. This line of development (or degeneration) concludes with an image of broad economic and ecological collapse (driven in large part by climate change) precipitating Roman Empire-style political collapse, and perhaps even species extinction, if the processes he describes are not halted.
Alas, there is little time for bringing about such a halt, and few options with the conventional avenues for dissent and reform ceasing to function. He sees little point in appealing to the conscience or enlightened self-interest of the power elite. While he regards the greatest potential for change as being among the disenfranchised, he is doubtful about the prospect of a mass movement emerging which can challenge that elite successfully, let alone the prospects for more egalitarian structures of power. Instead he advocates (non-violent) resistance in the form of small, individual acts undertaken more for their moral rightness than the chances of their contributing to a happy ending to the story. Those who would go on serving the role that the corrupted liberal class was supposed to play would all but take vows of voluntary poverty to pursue their vocations of relieving misery, slowing the slide toward destruction, and upholding values like truth, justice and reason, with Hedges offering the life of Dorothy Day of the Catholic Worker movement as an example of the kind of action he envisions.
Hedges' subject is a vast one, and a two-hundred-page book on it is necessarily the short version of the oft-told story of the bankruptcy of liberal institutions like the Democratic Party (the Obama chapter of which is already a well-established subject, given the appearance during the past two years of books like Paul Street's The Empire's New Clothes, Roger D. Hodge's The Mendacity of Hope and Tariq Ali's The Obama Syndrome). Indeed, the book might be more appropriately titled Suicide of the Liberal Class, focusing as it does on the ways in which the class contributed to its own destruction. One might add that even here Hedges' focus is on the liberals' corruption by a combination of opportunism (the desire for patronage by the ambitious, the ruin of intellectuals by money) and fear (of appearing unpatriotic, or "soft on Communism"), rather than their strategic or tactical errors (like their elevation of identity politics over everything else, a story Todd Gitlin tells in The Twilight of Common Dreams).
Hedges also devotes little attention to developments to the liberals' left and right, despite the significance of events at both those ends of the political spectrum for the way in which this story played out. The frailty of the American radicalism that was so important to keeping liberals honest and relevant (a theme explored in works like Gabriel Kolko's Main Currents in Modern American History, or Seymour Martin Lipset and Gary Marks' It Didn't Happen Here) is never really examined.4 Likewise, the ascendancy of the far right within the Republican Party (memorably related by alienated Republican insiders like John Dean, Kevin Phillips and Michael Lind), and the way in which its time in government not only shifted the political terrain, but tied the hands of any would-be liberals succeeding it (a process Thomas Frank explored in The Wrecking Crew), is not discussed. The result is that those looking for a really broad, full view of this history will have to supplement Hedges' book with a good deal of other reading.
It may be relevant that, far more than most of these works, Death of the Liberal Class is a jeremiad, its author's anger nothing short of scathing, and the severity of its assessment of our situation deeply depressing. (Given that I've been studying societal collapse for a decade now, I don't say this lightly.) Hedges' personal religious beliefs strongly inform his view of the situation, from his harsh criticism of hedonism and the "cult of the self," to the kind of resistance he advocates, which is suggestive of the example of the early Christians. The result is that even readers sympathetic to his broader position who happen not to share those particular beliefs may be put off by much of what he says. Certainly I found myself taking issue with his dismissal of any serious possibility of redressing the world's problems. (This can seem like a rejection of politics akin to those he criticized earlier generations of liberals for, if of a less obviously self-indulgent kind, as well as an abandonment of the responsibility to try and develop materially effective tools and strategies to deal with the situation.) Additionally, he seemed to me to slight the sciences (which can fairly be thought of as a bastion of the "liberal class"), even as he draws on science for his strongest argument for the dangers presented by our current trajectory – the likelihood of climate catastrophe. (I may also add here that it has long seemed to me that science and technology are certain to play a crucial role in any scenario in which we cope successfully with our ecological problems. Unfashionable as it may be, and dismaying as the technological stagnation of the last decade has been, far and away our best bet for a tolerable outcome is a "technological fix" that cuts the challenge down to size.)
Yet, it would be a mistake to dismiss Hedges' book as a rant or a screed. His passion may occasionally get the better of his style, but never his argumentation. Moreover, dire as his assessment is, there is no sense of tactical exaggeration, or the perverse eagerness to be validated by disaster that often appears in warnings of ecological doom. Rather there is a great deal of solid, well-grounded analysis here, informed by an impressive survey of the relevant literature, and Hedges recounts a great deal of history well worth knowing. His diagnosis of our political paralysis, the hollowness of our pieties and the role of liberalism's betrayals in this – the very heart of his critique – is especially compelling, and his defense of the value of the arts and humanities (so often slighted by others) is the strongest I have seen in some time. Indeed, after following Hedges' writing from War is a Force That Gives Us Meaning (2002) on, Death of the Liberal Class struck me as a summary work, capping off a long period of reflection and study well worth the attention of those engaged by his earlier writing, and of those looking for an introduction to these issues as well.
NOTES
1. By "power elite" C. Wright Mills referred to the men and women "in command of the major hierarchies and organizations of modern society . . . the strategic command posts of the social structure" (The Power Elite, p. 4), corporate, state and military, which Mills viewed as interlocking, placing them in a common group with common interests - and quite distinct from the liberal elite described above.
2. Where the media is concerned, Hedges refers to Ed Herman and Noam Chomsky's critique of the press in Manufacturing Consent, which offers a "propaganda" model of the media in which the press is controlled by a high cost of operation subordinating it to business (from its need for expensive licensing to advertising revenue), its dependence on "sourcing" (e.g. government and business press releases) because of the high cost of investigative reporting, a sensitivity to the organized media criticism termed "flak," and the role "anti-Communism" has played "as a national religion" (Manufacturing Consent, p. 29).
3. Sheldon Wolin's theory of "inverted totalitarianism" describes a totalitarianism which has no demagogues or charismatic leaders, and no revolutionary structures and symbols (key trappings in the "classic" totalitarianism of Germany and Italy), but rather the preservation (and thoroughgoing corruption) of the old institutions and culture to support virtually complete corporate political control. Charlotte Twight's "participatory fascism" refers to a condition in which voter choice is reduced to irrelevance.
4. American history in this respect is well worth comparing to that of Britain, where the Liberal Party was eclipsed by the left-of-center Labour Party, a story Pulitzer Prize-winner George Dangerfield recounts in his classic The Strange Death of Liberal England.