Sunday, October 28, 2012

In Defense of Futurology

Anyone who looks into futurology quickly encounters the enormous skepticism surrounding the whole discipline, with denunciations like Max Dublin's Futurehype: The Tyranny of Prophecy reflecting common sentiments.

It would be pointless to deny that there is much to critiques like Dublin's. It is indisputable that futurologists do not often get things right, least of all when the predictions are of the dramatic, headline-making kind. It is equally indisputable that not all of the mistakes have been made in good faith; unfounded, dishonest and astonishingly arrogant predictions are constantly foisted on the public, predictions which have often originated in, or become the tools of, vested interests. These have often diverted our attention away from real and immediate problems to phantasmic ones (as with so many proclamations about foreign military threats). They have also been used to deflect calls for meaningful action on pressing matters with illusions of cheap, easy fixes if one only waits (the apparent attitude of many Singularitarians to poverty and ecological problems), or even of things working out for the best all by themselves (like the promise that speculative bubbles may have burst disastrously in the past, but this latest one is different). In short, predictions have often been worse than useless, doing a great deal of damage.

Yet, the fact remains that prediction is unavoidable. It is unreasonable to expect that self-conscious, time-conscious beings will not consider what lies ahead of them--and even were this possible, their refraining from doing so would be undesirable. Intelligent action requires presumptions about outcomes, which necessarily involve prediction, explicit or implicit, especially where planning is concerned. As the range, scale and complexity of that planning grow, so does the predictive effort required. Given the realities of a thoroughly globalized industrial civilization, there is a genuine need for the kinds of elaborate, specialized efforts that have made an industry of forecasting.

Moreover, just as we have to be mindful of the harm caused by bad predictions, it is well worth remembering that the failure to heed those sound predictions eventually validated by events has also done much harm, and threatens to do much more--as with the dismissal of Ivan Bloch's predictions about what a general European war in the early twentieth century would be like, the warnings about financial speculation run amok repeated again and again down to the meltdown of 2008, and the problem of climate change today. Seen within this larger context, the problem seems to be less a particular intellectual tool for understanding the world (unique as its influence may be in some respects) than the corruption of our intellectual life by political agendas, the failure of the media to lucidly present the issues of the day, and the inadequate training of even the better-educated parts of the public to think critically and cope with nuance.

Alas, these quite obvious sides of the issue seem to carry little weight with the knee-jerk detractors who see something intrinsically wrong with the activity itself--a view which in most cases says more about them than it does about the failings of futurology. Some regard predictability as excluding the possibility of freedom (apparently not worrying that freedom in a totally random universe would deprive choice of its meaning), or are simply suspicious of the exercise of human reason, especially when applied to the human sphere (a position which has run throughout the conservative tradition, all the way down to today's postmodernists). Libertarians in particular seem to dislike prediction because of its association with planning, which they associate with impiety toward the Market, and with government intervention in it (while conveniently overlooking the ways in which private actors must plan). I suppose, too, that there is a considerable amount of plain old anti-intellectualism in the sentiment (as those who make the more influential or nuanced predictions tend to be experts). And the truth is that in many a case an individual's level of contempt tends to reflect their gut response to what a particular futurologist happens to be saying. (I have found that climate change "skeptics" get quite nasty toward the whole field when they encounter predictions on this particular subject.)

Rather than writing off futurology (something we simply cannot do), the only viable approach is to strive to get better at making predictions, and to get better at making use of them. We need to critically examine the claims we get, not just the "what" and "when" and "where" of them, but the "how" and the "why." Along with a genuine diversity of views--and a readiness to call out those who really are the venal mouthpieces of special interests--this is far and away the best protection against those who would seek to monopolize real debates, or manufacture fake ones by sowing pseudo-scientific doubt (like those who would have us believe there is no connection between tobacco and cancer, or carbon emissions and climate change).

We should not look to the field for a precision or a certainty beyond what it can actually give. Especially when the distance between the present and the point in the future is considerable, we should expect to get just the rough outline of "things to come"--which is, in the end, what the most successful predictions generally seem to strive for, perhaps because that is all we can detect through the noise. We need to think less in terms of inevitabilities than spectrums of probability robust enough to survive life's frictions, surprises and the range of choices that may appreciably affect them; to think less in terms of simplistic projections of the present than the ways in which trends peter out, and accelerate, and interact with one another; to recognize the attached caveats and respond accordingly--to not shy away from committing to this or that expectation, but be able to react with something better than dumbfounded stupidity if events take a different course.

And even where predictions do fall short, it is well worth remembering that asking the large questions has a way of yielding insights that might not otherwise emerge, a not inconsiderable prize in itself.

Saturday, October 27, 2012

A Scarcity of "New Thinking"

Looking back on intellectual history, the middle of the last century - the 1950s especially, but to differing degrees the years immediately preceding and following it - seems to have been an exceptionally fertile time for popular writing on sociology and economics and the history related to them.

Today's more striking public intellectuals are not without ideas of their own, but the writers of the '50s remain an essential point of reference and overall influence, and the work of the newer authors, at its most substantial, tends to carry forward the earlier arguments and critiques. Andrew Bacevich is perfectly clear on the matter of his debt to William Appleman Williams. Morris Berman, discussing his latest, points out that "it is actually part of a lineage, the path initially staked out by Richard Hofstadter, C. Vann Woodward and Louis Hartz" in the late 1940s and 1950s. At the close of The Wrecking Crew it is to Richard Hofstadter that Thomas Frank looks back. Indeed, Chris Hedges, in Death of the Liberal Class, compares our current crop of public intellectuals - unfavorably - with these predecessors of a half century ago.

Why was this the case? Perhaps it was a reflection of the unique opportunities available to that generation of authors. (It is hard to picture a John Kenneth Galbraith or a William H. Whyte working at Fortune Magazine today, or to think of any liberal intellectual - or any intellectual for that matter - who had the kind of long, varied experience of government work Galbraith did during World War II and the Cold War.) It seems likely, too, that this was partly a matter of the novelty of the problems of the post-war years, which have since become so familiar, and remain so deeply entangled with our newer problems, that in commenting on them it is harder to appear fresh or original. (Reading Barbara Ehrenreich's Bait and Switch, for instance, one sees a world deeply changed since the time of Whyte's The Organization Man - and yet in other ways very much the same, not least in the use and abuse of personality tests.) Another reason may have been the assumption of a more literate general audience. (Much as Galbraith has been justly praised for his urbane prose style, major publishers today might take issue with its suitability to a work of pop economics aimed at today's market.) But certainly part of it was the existence of a political climate which permitted the airing of a range of ideas that was wider in some respects, and a stronger hope that the expression of those ideas could actually mean something out in the real world. (The irony that they lived in the era of the House Un-American Activities Committee should not be lost on us.)

The loss of the last is only too apparent in the tone of today's writings, imbued as they are with a sense that little will be done about today's problems, regardless of what their authors say (the titles of books like those by Hedges and Berman saying it all). This apathy is hardly going unnoticed. When the hand on the Bulletin of the Atomic Scientists' Doomsday Clock moved one minute forward earlier this year, Kennette Benedict, the magazine's executive director, cited the lack of "new thinking" as a factor.

As far as I can tell, no one has expressed real surprise at that statement.

Sunday, October 21, 2012

On the European Union's Nobel Peace Prize

In Isaac Asimov's classic story "The Evitable Conflict," Stephen Byerley suggests that the major conflicts of human history were never settled by force. Instead each of them "persisted through a series of conflicts, then vanished of itself – 'not with a bang, but with a whimper,' as the economic and social environment changed." So it was with the competition of the Houses of Hapsburg and Valois-Bourbon for European hegemony, or the clash of Catholic and Protestant for dominion over Western Christendom. So it also was with the "nationalist-imperialist wars" of the nineteenth and twentieth centuries
when the most important question in the world was which portions of Europe would control the economic resources and consuming capacity of which portions of non-Europe . . . Until the forces of nationalism spread sufficiently, so that non-Europe ended what all the wars could not, and decided it could exist quite comfortably all non-European.
This explanation of the end of that cycle of wars, offered in a story published a mere five years after VE Day, seems to me far more persuasive than its attribution to regional integration via the European Union and its predecessors - an attribution inextricable from the rationale given for the EU's receipt of this year's Nobel Peace Prize, its role in uniting the continent.

The fact of Europe's Cold War-era division, and the far greater weight of the U.S. and Soviet Union in the continent's balance of power, to say nothing of their considerable control over the members of their alliance systems, marginalized the importance of the balance of power among the West European states themselves. It is worth remembering, too, that the elites of these countries were ideologically committed to internationalist anti-Communism far above intra-European nationalistic grievances (in West Germany's case, even above German unity). At any rate, their defense expenditures and military activities were constrained by the damage to their economies during the war and the costs of rebuilding after it, and in the case of Britain, France, the Netherlands and Belgium, by their efforts to prop up the remnants of their colonial empires - with even that much implausible without the United States footing the bill (the U.S. paid eighty percent of the cost of France's failed war to retain Indochina) and impossible in the face of U.S. disapproval (as demonstrated during the 1956 Suez War). For all the blood shed over them, those empires vanished quickly (non-Europe clearly decided to be non-European, to borrow Asimov's phrase), eliminating one of the principal objects of contention among the European powers of the previous four centuries. Meanwhile the division of Germany sharply reduced its margin of demographic and economic superiority over France and Britain, and along with their acquisition of nuclear weaponry, guaranteed its inability to appreciably threaten them - even without the considerable, immediate restraints imposed by the superpowers.

At the same time welfare state capitalism (prompted, in part, by fears of the radicalization of Western electorates in its absence) brought rapid growth and distributed its fruits widely enough to produce generalized and rising affluence, dampening prickly nationalisms and contributing greatly to domestic stability. Along with the war-weariness induced by the conflict (which came on top of the war-weariness that followed World War I), reinforced by the tastes of conflict later age cohorts got in colonial campaigns and Cold War missions from Londonderry to Algiers to the East Indies, and in war scares where the Bomb seemed dangerously close to coming into play, this greatly diminished the appeal of belligerent grandstanding over territorial disagreements or perceived insults to the national honor. What aggressive nationalism remained tended to be directed internally, against the post-war influx of immigrants (exclusion, rather than expansion, its orientation).

These changes, of course, survived the economic downturn of the 1970s, and the end of the Cold War. Certainly Germany's reunification caused anxieties in certain quarters, but the melodrama that surrounded the event - like Margaret Thatcher's absurd request that Mikhail Gorbachev "halt" the process, or the conversations between François Mitterrand's aide Jacques Attali and Anatoly Chernyaev regarding the revival of the '30s-era Franco-Soviet alliance - merely testified to lagging perceptions among the generation that lived through the war (and in some cases, one suspects, an even more profound disconnect from reality), demonstrated most clearly by the level of abstraction in the discussions, the scarcity of anything like concrete scenarios. (No one really expected some renewed German assault on Alsace-Lorraine or an Operation Barbarossa II, for instance, or articulated anything comparable that they did regard as more plausible.)

Consequently, rather than making peace possible, it would seem that the EU was made possible by the roles of the U.S. and Soviet Union and the bigger Cold War that hung over the old conflicts; by the end of Europe's empires; by the arrival of the nuclear age and the limitations it imposed on old-style war-making; by the comforts of welfare capitalism; by the weariness which followed the hugely destructive wars of the earlier era, the smaller wars of the post-1945 period, and the frightening prospect of a war even more destructive than all those seen in the past - with almost all of these developments significantly originating in or prompted by events outside Europe.

Of course, that raises the question of what the European Union did with the opportunity afforded it by all the factors discussed above. The Union did achieve a unity of sorts - but its achievements have actually been least impressive in the realm of foreign policy and defense, which remains very much the purview of separate nation-states (and what those states actually did has figured prominently in criticisms of the award from both right and left). Integration has been much deeper in the continent's economic life, but the contribution this actually made to peace is wide open to question - by those who question the claims made for international trade as a pacifier, those who are doubtful about free trade as an engine of prosperity, and those who see the Union's combination of expansion and neoliberal prescriptions (the consequences of which are quite apparent on European streets from Athens to Lisbon) as having been quite inimical to the cause the prize is supposed to celebrate and advance.1

1. I am reminded at this point of a different, more recent science fiction tale, Robert J. Sawyer's novel Flashforward (the basis for the short-lived TV series). In the book the businessman Mr. Cheung, after developing a technique to make human beings immortal, decides to offer the treatment to Nobel Prize winners, peace prize winners included, as "by and large the selections are deserving" of the award. As the critics of this prize demonstrate, this is not a view with which all would agree.

Wednesday, October 17, 2012

Review: The Affluent Society, by John Kenneth Galbraith

Boston: Houghton Mifflin, 1958, pp. 368.

John Kenneth Galbraith's starting point in The Affluent Society is the observation that historically the great economic problem was that of "producing enough," a concern enshrined in the mainstream of economic thought (by way of Smith, Malthus, Ricardo, Mill and Marshall, and Marx, Veblen and Keynes for that matter), and in the Social Darwinism that found such a warm reception in the United States (where Herbert Spencer became hugely influential). Accordingly, the stress of economic thought was on maximizing the output of the essentials of life--food, clothing, shelter, fuel--through the most efficient possible use of the available resources. As it happened, the items in question lent themselves to private, individual production and consumption, while proponents of this ideal venerated the free operation of a competitive market as a device for forcing capitalists and employers to make the utmost effort at getting production up and prices down. The result was, not incidentally, a tendency to value the private sector and denigrate the public sector (at least where goods besides physical security are concerned), a pattern which extended to the view of increased private consumption as a gain to prosperity, and increased public consumption as a reduction in it. Going alongside this was the attitude that society had little choice but to tolerate poverty, given the gap between what was produced and what was needed.1

However, Galbraith contends that by the twentieth century the advanced industrial countries of the Western world had largely solved the production problem, the central issue no longer securing an adequate supply of life's essentials--society now "affluent" in that respect. In making his case Galbraith points to the many ways in which these societies had deeply changed since the Industrial Revolution. The truly poor (those who do not get enough to eat, etc.) had been transformed from the vast majority into a minority. An increased share of production was devoted to satisfying not indisputable physical needs (to a great extent, satiated), but the artificial wants manufactured by consumer culture's unprecedented use of techniques like planned obsolescence and advertising.

All this happened as people worked less, the 70-hour week giving way to the 40-hour week, while child labor became a thing of the past and the retirement of the elderly became routine (all of these indicators of the reduced urgency of the demand for labor and its longtime principal products). Additionally, it was the failure of demand, as seen in the Great Depression, rather than the failure of supply that came to be construed as the great threat to prosperity, while the continued stress on the maximization of production was motivated less by the desirability of more goods than by production's role in maintaining full employment. Galbraith even contended that the expansion of production, and with it general well-being, had gone a long way toward diminishing the political charge of the issue of inequality.

As a result, in place of the traditional problems of poverty there were now the side effects of an economy of private affluence. The maintenance of full employment as the norm meant inflationary pressure, while consumer culture brought with it a tendency toward oppressive and unsustainable debt accumulation. At the same time the advent of such private prosperity meant an imbalance with the comparative poverty of the public sector and the public goods it supplied--educational, health and policing services, urban planning, the protection of the natural environment--an imbalance raising new issues, not the least of them the ways in which deficiencies here undermine the private sector's operation. The remediable failures of the education system, for instance, harm scientific R & D efforts. Likewise, the failures of city planning have an adverse effect on the quantity and quality of the housing stock available to home buyers. And so on.

The solution of the old problems, and their replacement by new, as yet unsolved ones (which might be redressed by making the public sector as affluent as the private sector had become, to the benefit of both), went largely unrecognized because of the tendency of the "conventional wisdom" (a term Galbraith coined in this book) about such issues to lag behind circumstance.2 Nonetheless, Galbraith was confident that change was possible, even likely, and certainly his argument did find an audience at the time. The idea that there could be economic life beyond flat-out GDP maximization in fact became a commonplace in social science, social comment and futurological speculation during the years that followed, William Appleman Williams, Gunther Stent and Alvin Toffler, among many, many others, all featuring it prominently in their work. Since then, ideas such as "sustainable" growth have gained currency in certain circles, as have broader measures of wealth, like the Genuine Progress Indicator.

However, it seems that we are further than ever from embracing the thinking Galbraith presented, American society regarding itself as far from affluent in the way he suggests (even though per-capita GDP has more than doubled since then). The prioritization of sheer GDP growth as the sole test of economic performance, the identification of the private sector with goods and the public sector with costs, the Victorian severity toward the poor, all remain the predominant assumptions over a half century later.

One has to wonder why, given that there seems to be a great deal of merit to his argument. Certainly one part of the explanation seems to be Galbraith's underestimation of the extent to which the "good life" would continue to be perceived in terms of more private consumption, and on a related note, of the intensely private way in which Americans think about their problems, individual and collective (something to which C. Wright Mills, stronger on this score, often pointed). Galbraith also had a tendency to underestimate the power of vested interests to affect public policy, and how the shift in the terms of economic debate in the neoliberal revolution (made possible by a combination of economic crisis in the early 1970s and a politics of cultural backlash) would permit new circumstances to reinforce the conventional wisdom he criticized, leaving the mainstream less open to newer thinking.

At least in part because of neoliberal policies like the lowering of trade barriers, financial deregulation, and the weakening of organized labor and worker protections, the last four decades have been marked by much lower rates of economic growth, as well as intensified international economic competition, recurrent financial crises, higher unemployment, and greater inequality. Americans may have remained privately "affluent"--but the vast majority did not get much more affluent by even the most conventional measures, and they felt less secure in what they did have. If anything, given that their personal budgets were often pressed by inflation sufficient to wipe out the additions to purchasing power represented by rises in their income, the increasingly expensive post-secondary education increasingly seen as a requirement for remunerative employment, the exploding costs of health care, and the mounting private debt loads to which all these added, they had reason to feel less affluent than before.

Meanwhile, the reality of public poverty grew only more pronounced under the same circumstances, as slower growth rates, rising dependency ratios (putting pressure on benefits for the retired or elderly as they had been traditionally organized), and the public portion of the growing tab for educational and health care spending (and the shift away from progressive taxation even as the distribution of wealth became more unequal) squeezed government budgets, making bigger deficits and mounting government debt the long-term trend. And the neoliberal context made all the difference as to how this was perceived.

It is conceivable that under other circumstances the redress of the public sector's problems might have been seen as part of a broader project of reviving overall prosperity. The growing numbers of retirees might have been viewed as relieving the pressure of job creation and inflation, for instance. Expanded public programs in the areas of education and health care might have been seen as a way of relieving those burdens on the individual consumer. And so on and so forth. However, it became the conventional wisdom that the thing to do was not to improve public services, but to turn them into for-profit private services (a drive given greater urgency by the construing of those larger numbers of retirees as a fiscal disaster in the making). And while, despite all the cutbacks, the proportion of government budgets devoted to transfer payments cannot be said to have shrunk, there is no doubt that the response of individuals was increasingly to substitute private goods for public ones--to respond to the inadequacies of the public education system by paying more out of pocket to send one's children to private school, for instance.3

This transfer of formerly public burdens to the private individual did that much more to make what could have appeared like private affluence instead appear as private scarcity, and make the prioritization of expanded private consumption (especially with the idea of expanding public consumption delegitimized) appear the solution--with old-fashioned GDP growth the means.4 And of course, the broader social inequality attending the process by which the lives of wage and salary-earners became so strained meant that much more need for a convenient solvent for the resulting tensions, the obvious candidate for which was, again, GDP growth (rhetorically, at any rate).

In retrospect it seems to have been Alain Touraine who, far more than any counterpart of his in the English-speaking world (Galbraith included), was prescient in contending that post-industrial society would be the most growth-driven and GDP-obsessed in history--with the post-2008 economic shock doing surprisingly little to shake the mainstream's embrace of neoliberalism. Nonetheless, the case for moving beyond the "conventional wisdom" Galbraith described, for thinking of public as well as private affluence, and for looking beyond the narrow definition of growth so dear to economic orthodoxy, has only grown stronger with the passage of the decades.

1. One irony of this view, not fully recognized in Galbraith's writing, is the role of public support in creating private prosperity even in the eras of Smith and Ricardo and Mill, much of the public preferring to stick with simplistic Horatio Alger and Edisonade images of how fortunes are made, corporate giants born and Third World poverty turned into industrialized affluence. It remains an irony of American political life that those who are most inclined to evoke the Founding Fathers are those most inclined to neglect the lessons of Alexander Hamilton's "Report on Manufactures"; those most inclined to speak of American economic exceptionalism, those most inclined to forget that there was such a thing as an "American School" of economic thought, and the role it played in building the American economy.
2. Among Galbraith's specific, practical suggestions was a delinking of work and income through a more generous, but also more carefully managed, program of unemployment insurance so that less-than-full-employment became an acceptable norm politically, relieving inflationary pressure. He also suggested a sales tax on private consumption for the funding of improved public services. More broadly, he argued for the elimination of toil (unpleasant work performed wholly for pay as it renders no other satisfaction), making all workers part of what he called the "New Class" (in which people have not just jobs, but professions and careers, accessible through education).
3. An even more dramatic example may be the response to the problems of American cities--rather than making them more habitable through sounder urban planning, improved public transport, the redress of poverty and the like, the typical course has been relocation to suburbs and exurbs, which bring associated expenses like greater car ownership, longer commutes, and the private security bills of gated communities.
4. Under the circumstances, such an idea as using a sales tax to divert dollars spent on private consumption into the funding of better public services--as Galbraith proposes--could only have been anathema.

Tuesday, October 16, 2012

Review: The New Industrial State, by John Kenneth Galbraith

Boston: Houghton Mifflin, 1967, pp. 427.

The conventional view of the American economy is that it is an arena where market forces predominate, so that no single buyer or seller exercises significant control over prices; where individuals and small businesses are the principal sources of innovation; and where enterprises fight to keep government and organized labor at bay as they pursue that crucial form of feedback, profit, won through their success or failure in the competition to cater best to the wants of the consumer.

However, in his 1967 classic The New Industrial State John Galbraith made the case for quite a different reality. The American economy, Galbraith held, was thoroughly planned--if in a somewhat more diffuse way than contemporaneous Soviet-style economies. Rather than some government bureau attempting to direct the whole, the function was spread among a small number of key actors, in particular the largest industrial corporations. This was because only the large corporation could raise the capital and organize the skills needed to design and produce the highly complex products characteristic of mid-twentieth century technology.1 And even for them, such investments were only viable when the market and its uncertainties were tamed--not least, the tastes of consumers.

Such corporations pursued the reduction of uncertainty at a number of levels, from that of the individual firm (through such methods as vertical integration, or the use of the market power that comes with large size to influence the prices at which they buy their inputs and sell their production), to that of the usually oligopolistic industries of which they were a part (through the avoidance of disruptive competition among firms), to the level of the whole economy through their influence on and cooperation with government (which pays for much R & D, and regulates demand with high public outlays, progressive taxation, and wage and price controls) and their dealings with organized labor (negotiations with which stabilize wage levels, and often bring it into line as an ally when pursuing government contracts). These corporations also made the private consumer a more predictable actor, not only through research on consumers in advance of the development of the products to be marketed to them, but also through the advertising molding their tastes.2

The corporations, in turn, were dominated by their "technostructures," the assemblages of technicians that made such companies functional, who had to be seen in this collective way because the planning process indispensable to their operations was far too vast and complex to be controlled by a single individual (given the volume and variety of information that had to be collected and processed, and decisions that had to be made). One result was the diminished power of the Chief Executive Officer and other senior management, often reduced to ratifying the decisions of comparatively obscure experts within their firms.3 In their turn, managers (typically salaried employees rather than owner-founders or their heirs) had seen their power grow relative to that of shareholders, while the latter were even further removed from being able to usefully observe and understand the intricate internal workings of these companies, and their prospects for exercising detailed control over company operations were further diminished by the wide diffusion of stock ownership. The result was that, so long as a company continued to deliver an "acceptable" level of profit, the technostructure enjoyed the degree of autonomy without which its elaborate planning was impossible.

The culture of such "mature" corporations naturally differed from that of such enterprises in their more formative phases, or the businesses of earlier eras altogether. The businessmen of the classic, nineteenth century mold (the Henry Ford type, for example), thoroughly "individualistic," and resolutely anti-intellectual, anti-statist and anti-union in sensibility and policy, were a poor fit with the model of enterprise Galbraith described in this book--and prone to get into trouble when trying to run a mature industrial company (as Ford did in Galbraith's account of his career in his earlier The Liberal Hour).

By contrast the "new" CEOs were more comfortable in an organization, more willing to give expertise some respect, and more pragmatic in their dealings with government and labor, permitting the smoother operation of their businesses and the economy as a whole. They were not unconcerned with profit, but maximizing it was not their sole or even primary object--in part because they were salaried personnel whose own income was less closely connected to their company's short-term fortunes, and in part because other motives had come to the fore, in particular "identification" with the company that gave them their privileged positions, and the satisfaction afforded by the "adaptation" of the company in line with their own, particular ideas about its mission.

Of course, identification and adaptation were not such powerful motives for less senior personnel. However, line workers too were more secure and better compensated than before, living as they did in a time of high employment and rising living standards (affording them much more than life's basics), while they were increasingly trading their blue collars for white ones and the less grinding working conditions that went with them, enabling those factors to influence them in a way they could not have before. The result was to weaken the sense of the employer (at least, in mature companies with a pragmatic attitude toward labor) as an enemy, helping to make the relationship between employee and employer less fraught with insecurity and confrontation, and the softer line of new-style, "enlightened" management toward organized labor that much more viable.

Galbraith's discussions of identification and adaptation assumed company as well as personal goals beyond the merely pecuniary, and he naturally discussed this matter at length, writing of the "social goals" which are the formal missions of major industrial firms--objects in the pursuit of which they strove to make a profit, rather than things incidental to profit-making as such. Lest the utopian-sounding rhetoric of privately owned, profit-driven corporations pursuing "social goals" cause confusion, what Galbraith referred to by the term were rising production and consumption, technological advance, and the agendas associated with them: namely the growth of the Gross Domestic Product, and the waging of the Cold War.

This model of the economy's "commanding heights" laid out, Galbraith then proceeded to consider the system's weaknesses, offering an extensive critique of the limits of the goals pursued by the country's technostructures. As he noted, the manner in which government supported R & D and sustained demand was highly militarized, a fact which helped lock in Cold War tensions, and all their dangers (the most extreme of which was major nuclear war). Additionally, values and interests out of line with the accepted goals--like leisure, ecology, aesthetics--were typically denigrated and marginalized. This raised the question of who would promote peace rather than war, protect the environment, or defend intellectual and cultural values, and other such essential goods. Galbraith believed the answer lay in the emergence of what he termed the "educational and scientific estate." The dependence of the technostructure on this "estate" for expert knowledge, technological advance, and the training of the work force, made its members both more numerous, and more powerful, and made them a potential champion of those needs and values so poorly served by the rest of the system.

Taken together these ideas comprise a satisfactorily comprehensive analysis of the heart of the American economy at the time of the book's writing, as well as certain of its key problems and potential palliatives, one which remains relevant today in many ways. What Galbraith's book has to say about how large, high-tech enterprises work is just as true today as it was then--if not more so. The indispensability of the technostructure to the major industrial enterprise, the marginalization of the individual entrepreneur and small business in the economy (and especially in high-tech manufacturing), the fact of oligopoly, the reality of close collaboration between big business and government, are irrefutable facts of twenty-first century life. The long-term decline of organized labor Galbraith recognized is an equally irrefutable fact of American economic history in the past half century.

Yet, there is also no disputing that the "ideology" of the older "entrepreneurial" mentality Galbraith identified with older-style, profit-fixated businessmen resurged in the 1970s with the ascendance of neoliberal economic thought. Of course, for all the rhetoric, small enterprises did not reverse a two-century-long trend toward Big Business or Big Government; if anything, the movement in this direction remained as strong as ever. Nonetheless, the associated attitudes profoundly changed the milieu in which the technostructures operate--generally in ways unconducive to their functioning. This has most obviously been the case at the macroeconomic level, as deregulation and loose money policies, as well as the abandonment of full employment as a policy goal, intensified some forms of economic competition and made demand less reliable (by holding down wages, encouraging the dependence of consumption on unsustainable borrowing, increasing the incidence of financial crises, etc.). However, it has also been the case in the ways in which individual corporations have been run, with the industries of which Galbraith wrote coming to be dominated by a speculation-minded financial sector intent on short-run profit and share price maximization (reinforced by the linking of executive compensation to these). The result has been a preoccupation with cost-cutting (often targeting areas like R & D) directed at making quarterly earnings statements appear more attractive, at the expense of long-range company development; an increasing investment of company resources in mergers-and-acquisitions games as companies pursue or fend off takeover attempts, at the expense of other imperatives; and the neglect of a company's core business (i.e. its "social goal") in favor of more profitable speculative finance (as at General Motors).

The economic performance of the advanced industrial nations during this time period, unsurprisingly, has been characterized by technological stagnation (evident in just about every area but IT) and falling growth rates (which compare very unfavorably with the post-World War II period), while the failure of corporate giants has long ceased to be an unheard-of event (as the events of 2008 and after have reminded us all).4 Consequently, even though Galbraith did not anticipate this turn, the economic history of these decades lends a great deal of support to his theory about how high-tech, capital-intensive firms, and the economies founded on them, necessarily operate. Indeed, taken along with the rather weak performance of the scientific and educational estate in the role of "loyal opposition," one can conclude that the principal failure of Galbraith (like many another mid-century American liberal) was his overestimation of the degree to which American political culture would adapt to the economic and technological realities that had become so apparent by mid-century.5

1. Galbraith's ideas regarding the decline of competition, and the emergence of oligopoly and its relationship to technological innovation, were previously (and more fully) elaborated in his earlier American Capitalism: The Concept of Countervailing Power (1952).
2. As their income enables them to go further and further beyond the point of meeting their most basic physical needs (food, shelter, warmth), consumers enjoy an increasing range of choice in their use of their purchasing power--another uncertainty business tries to manage. The result is that instead of being catered to by business, business strives to determine their wants for them. This idea was previously discussed in American Capitalism, and received even fuller treatment in 1958's The Affluent Society (reviewed here).
3. Galbraith remarks that "financial markets have long since accepted the reality of the technostructure as distinct from the entrepreneur," and goes on to jokingly paint an image of how the financial world would hang on "anything affecting his tenure in office," fluctuations in his health major news, and his replacement "handicapped like a horse." Yet, is such fuss not a routine matter where celebrity CEOs like Steve Jobs have been concerned--in part because it remains the norm to identify the achievements of a whole company with its chief?
4. There are those who would point to information technology as a counterexample to Galbraith's analysis of the place of the individual entrepreneur and the start-up in the '70s and after. Yet, the actual history of computing as we know it is dominated by state-subsidized R & D (it was the Defense Department which produced the ARPANET, CERN which gave us the World Wide Web) and the research of established firms like IBM (the hard disk drive, DRAM), Hewlett-Packard (the first to market a "personal computer") and Xerox (the graphical user interface). Additionally, the success of newer firms meant that they quickly fell into line with the model of innovation described here, without which Apple could not have delivered the iPad, and Google would not be experimenting with driverless cars--the phase of old-fashioned entrepreneurship, on closer inspection, smaller and shorter-lived than the hype would have it. It is also worth keeping in mind that, even to the extent that it can be construed as exceptional, the sector has received disproportionate attention because of the false impressions created by the inflated share prices of IT companies (as with Apple and Google right now), the frequent exaggeration of the economic impact of personal computing (the contribution to productivity has been hard to find), and its convenience as ideological fodder (proponents of orthodox economic thinking and purveyors of techno-hype desperately seizing on the myth in their search for some validation of their ideas).
5. Chris Hedges offers a more recent and quite different take on that estate's influence and vitality in Death of the Liberal Class.

Friday, October 12, 2012

Toward a Post-Fossil Fuel Era: A Note

In 1998 oil prices dropped to their post-energy crisis low of $10 a barrel, and soon enough began their upward climb, especially apparent after 2003. In the half-decade that followed, the price of oil increased by a factor of five from the twenty-year high it had already reached, touching $150 a barrel in July 2008 before dropping again. However, it never again came anywhere near its 1998 price, generally staying well above even its 2003 price, despite the downward pressure that the worst economic crisis since the Great Depression of the 1930s placed on demand. (The current figure is $92 a barrel, more than six times the 1998 price in real terms, and more than twice the 2003 price.)
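For those inclined to check the arithmetic, here is a minimal sketch of the real-terms comparison, assuming round U.S. CPI-U index values of roughly 163 for 1998 and 230 for 2012 (approximations supplied for illustration, not figures from the text above):

```python
# Rough check of the oil price multiples cited above.
# The CPI values are assumed, approximate CPI-U annual averages.
CPI_1998 = 163.0
CPI_2012 = 230.0

price_1998 = 10.0  # the 1998 post-energy crisis low, dollars per barrel
price_2012 = 92.0  # the "current figure" cited above

# Restate the 1998 low in 2012 dollars, then take the ratio.
price_1998_in_2012_dollars = price_1998 * (CPI_2012 / CPI_1998)
real_multiple = price_2012 / price_1998_in_2012_dollars

print(f"1998 low in 2012 dollars: ${price_1998_in_2012_dollars:.2f} a barrel")
print(f"Real-terms multiple: {real_multiple:.1f}x")  # about 6.5, i.e. "more than six times"
```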

Along with the mounting evidence of climate change (receding Arctic ice and glaciers, etc.), other forms of ecological catastrophe (the Deepwater Horizon explosion, etc.), and the heating of international conflict zones by the geopolitics of energy (the U.S.'s relationship with the Middle East, Moscow's relations with its provinces and neighbors in the Caucasus, the ongoing political theater surrounding the Diaoyu/Senkaku/Tiaoyutai Islands, etc.), that price rise reinforced the lesson made clear in the 1970s, but generally unheeded in the 1980s and 1990s: the necessity of shifting the world's energy base away from finite, depleted and dirty fossil fuels. Alas, dismayingly little was done about the matter in the 2000s.

Germany's case merits examination, as it has been demonstrably more ambitious in this area than the other major industrial economies.1 As the situation stands, Germany gets about 25 percent of its electricity from renewables, a number all the more impressive because only a fifth of that comes from long-established hydroelectric power - elsewhere, by far the predominant source of non-fossil fuel, non-nuclear energy (as is certainly the case in China and the U.S.). Germany's recent expansion of its photovoltaic-based electricity production (now accounting for 5 percent of German electricity) has been especially striking, the total output of its solar energy installations exceeding that of all the rest of Europe combined, despite the country's northerly location. One result is that, unlike in most other countries (the United States and Britain among them), the drop in the carbon intensity of the German economy has exceeded the drop in its energy intensity.2

Nonetheless, the picture is not untroubled. Germany's electricity production is particularly coal-reliant, with that reliance actually increased in the wake of the shutdown of several nuclear reactors - and being locked in for decades by the construction of new coal-fired plants. Additionally, Germany's development of a "smart grid" has not kept pace with the expansion of renewable energy and its more variable output, causing issues with grid reliability which media opponents of "green" initiatives like Der Spiegel are using to bolster specious arguments against renewable energy generally. (Simply put, they play up the headaches, which they treat not as technical teething issues in many cases resolvable today, but as somehow unavoidable problems of these forms of energy production, while totally ignoring the problems raised by the fossil fuel and nuclear power plants they replace - an approach they share with climate change "skeptics."3)

What emerges is a picture of progress which is slower and less comprehensive than it should be in even the best cases, while most other countries are doing far less than that - and there seems plenty of reason to worry that advances in this area might stagnate, or even be reversed. The European Union is struggling with an economic crisis, and responding to it with austerity, which bodes poorly for major infrastructural programs. Recent hype over Canadian tar sands and the prospects for the extraction of massive amounts of oil and gas from shale is creating the illusion that the market has eliminated the problems of supply - and, in the case of the United States, of energy independence - while the denial of the ecological side of the issue remains a major force in American politics, with the political right increasingly uncompromising on the issue. Japan's abrupt shift away from nuclear power has, at least in the short term, meant more fossil fuel use to keep the lights on, which may give the fossil fuel lobby a wedge with which to increase its presence in that market on a longer-term basis. Meanwhile, energy in fast-growing China and India remains tied to traditional sources, with the output of their installations of wind and solar dwarfed by their growing overall consumption.

In short, even while the feasibility of, and need for, a much more comprehensive and sustained shift away from fossil fuels and Generation III nuclear power to alternatives (with renewables the most attractive and feasible of these at the moment), is increasingly difficult to dispute, political realities continue to make it highly uncertain that even the wealthiest and most developed nations will travel this road swiftly, safely and successfully.

1. Where the movement past oil specifically is concerned, however, it is Japan that has been the leader, having reduced its oil consumption by an impressive 22 percent between its recent peak in 1996 and 2011. By contrast, Germany achieved a 14-15 percent drop after the peaking of its consumption in 1998. On the other hand, Germany extracts more GDP per unit of oil consumed - consuming one barrel for every $3500 of GDP, compared with $2800 for Japan (and under $2200 for the U.S.) when Purchasing Power Parity is taken into account. (At market exchange rates, however, Japan is competitive with Germany, making this an issue of which indicator one regards as more appropriate.)
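To make clear how such a GDP-per-barrel figure is computed, here is a minimal sketch using assumed round values for Germany circa 2011 - a PPP GDP of roughly $3.2 trillion and oil consumption of roughly 2.4 million barrels a day, both illustrative estimates rather than numbers from this footnote:

```python
# Illustrative recomputation of the "GDP per barrel of oil" metric.
# Both inputs are assumed, approximate values for Germany c. 2011.
gdp_ppp_dollars = 3.2e12       # GDP at purchasing power parity (assumed)
oil_barrels_per_day = 2.4e6    # oil consumption in barrels per day (assumed)

barrels_per_year = oil_barrels_per_day * 365
gdp_per_barrel = gdp_ppp_dollars / barrels_per_year

print(f"GDP per barrel consumed: ${gdp_per_barrel:,.0f}")  # on the order of the $3500 cited
```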
2. According to EIA statistics, the energy-intensity of the German economy dropped 19 percent between 1996 and 2009, while its carbon-intensity fell 26 percent. The comparable figures for the U.S. are 26 and 28 percent respectively, 8 and 7 percent for Japan, 19 and 20 for France, 32 and 32 for Britain. Only Italy, which has followed something closer to the German path, has done similarly well (its figures 6 and 13 percent, respectively).
3. Similarly, there is a tendency to point to subsidies for "green" energy and completely ignore the long history of massive subsidy for the fossil fuel and nuclear sectors (described by John Kenneth Galbraith as "fantastic favoritism") while complaining about market distortions.
