In his landmark 1993 paper "The Coming Technological Singularity: How to Survive in the Post-Human Era," Vernor Vinge identified four possible paths to the "intelligence explosion" that is the Singularity. The first three involved computers: (1) the advent of a computer with what we would call strong "artificial intelligence"; (2) a computer network's more or less spontaneously developing such intelligence; (3) an integration of human and computer intelligence through advanced mind-machine interfaces (allowing, for example, a human brain with its superior pattern recognition and other abilities to access the vastly superior computational capacities and data storage of computer systems). The fourth was a purely biological enhancement of human intelligence.
It is, of course, the case that we hear most about number one, somewhat less about two and three, and least of all about number four, the non-computer, biological option.
Why is that the case?
My impression is that this reflects the observable rate of progress in those areas. Where the computing sector has had "Moore's Law," and the geometrical expansion of the computing power available at any given price (and total computing power on Earth) that goes with it, the medical sector's feats have been . . . less impressive. Indeed, the contrast has been sufficiently pointed, and noted, that the term "Eroom's Law" has been coined to refer to how the development of treatments and cures has, rather than becoming quicker and cheaper in the manner of computers, become slower and more expensive instead.
Amid all that it has for decades been far easier to picture rising computer power beating biotechnology to the finish line in the race to produce a greater-than-human intelligence. In fact, even with some questioning the likelihood of Moore's Law's continuing as microchip fabrication reaches the apparent limits of the possible in silicon, and with the long-awaited substitutes proving to be (like so much else these days) "behind schedule," it can still seem that way.
Saturday, July 17, 2021
Why Do People Deny That Tony Blair Was a Neoliberal?
I have previously remarked that, where everyone who denied that Bill Clinton and Hillary Clinton and Barack Obama were neoliberals was a troll clearly acting in bad faith, I have encountered people who denied that Tony Blair was a neoliberal who seemed to only be expressing their honest opinion (even if it was totally contradicted by the facts).
Why was that the case? I think the likeliest answer is that they set the bar for not being a neoliberal so low, seeing any deviation from the austerity-deregulation-privatization-etc. agenda as disqualifying one from inclusion in such company. And as they point out, Blair did indeed increase funding for health and education, for example. Yet he did it within conditions of broader austerity, paid for the NHS with increased payroll taxes and for college by charging tuition (breaking campaign promises on this score, repeatedly), and continued the longtime, piecemeal, "backdoor" privatization of both those services (with his PFIs and his "concordat" with the private health sector and his internal markets and his "school choice" and much else). Blair did make working people some concessions, like a minimum wage and some latitude to unionize, but he drew a hard line beyond the minimums he granted, enough so for this to be a sticking point where Britain's participation in the EU's further integration was concerned (Blair regarding the neoliberal EU's courts as too dangerously leftish to be allowed to decide cases between British labor and British management). Meanwhile, whether the issue was upper-income tax levels or monetary policy or privatization, or stringency in the face of users of the social safety net, or constraints on strike action, or government passivity in the face of deindustrialization and embrace of a free-wheeling financial sector as the driver of the economy instead--or even the rhetoric in which he explained and justified it all, which could easily appear to out-Thatcher the early Thatcher--Blair consistently carried forward the model Thatcher handed down.
The result is that, a couple of tweaks apart, his neoliberal credentials are indisputable and overwhelming. More pertinent to the present, however, is what such confusion means in this moment when we are hearing so much about "the end of neoliberalism." The simple truth is that pundits have been calling neoliberalism finished over and over again for four decades now--in part because they assumed that the intellectual discrediting of the model in so many more eyes must necessarily translate to its end. Alas, things are more complicated than that--while certainly it would take more than a few Blair-like tweaks to really bring about a shift to another economic approach. It would require a whole new way of thinking about how economies achieve growth, and distribute its benefits, the way neoliberalism represented when it became a real political force back in the '70s, and no such change is even suggested by anyone remotely mainstream now.
The End of Neoliberalism? Don't Bet On It
For as long as neoliberalism has been around we have been hearing about its imminent demise--because it has been so consistently disappointing, even by its own lights. (It is worth remembering, for example, that Margaret Thatcher and Ronald Reagan promised their policies would lead to the renewal of their countries' industrial bases. They delivered accelerated deindustrialization instead.) And if the comment and analysis, especially that accorded mainstream space, overwhelmingly consisted of cheerleading for that wave, there were people, even there, who called it as any sane person had to see it. In the wake of the early, yet severe, setbacks, for example, Lester Thurow's Dangerous Currents (1983) was so named because of how policymakers were navigating economic difficulties without a viable theory--because clearly the neoliberal theories were not that. Yet neoliberalism endured, such that a decade on in his bestseller Head to Head (1992) Thurow was declaring its death yet again on the grounds that the post-World War II General Agreement on Tariffs and Trade was giving way to a three-way neomercantilist competition between the U.S., Europe and Japan--and again, his call proved premature.
So did it go again in 1997, after the Asian financial crisis, and in 2000-2001 after the New Economy bubble burst and a raft of colossal financial scandals that in the U.S. had a Republican (!) President signing Sarbanes-Oxley and appointing a vigorous new SEC chief to crack down on Wall Street malfeasance. That mood didn't last, though, but soon enough when that same Republican was presiding over a still bigger financial train wreck there was more talk of "This can't go on" and "This time it's for realz!"--which in many minds was affirmed by the election of a new President--the first Northern, blue state Democrat since Kennedy, one might add--whose slogan, supposedly addressed to the change-minded, was "Yes we can." Instead what he really seemed to be saying was "Yes we can go on being neoliberal," and indeed he did for his eight years in office . . .
Of course, it was less clear than before that he was right about that--certainly where ignoring the backlash was concerned, above all the backlash from the right, which saw Obama hand the office over not to his former Secretary of State Hillary Clinton, but to Donald Trump. Since then we have seen trade war, and pandemic, and a still worse economic crisis making the phrase "worst since the '30s" appear ever-more tired without its being at all untrue, and neomercantilist maneuvers in areas like microchip-making--but that hardly signals the end, as becomes clearer if we think about what neoliberalism really means. Neoliberalism, after all, is not just the rollback of the industrial development-welfare-macroeconomic management state (hence the tax cuts and privatizations and deregulation and union-busting and the rest), but a still broader way of running an economy, easier to understand when one looks beyond the official theorizing and the intellectual history to how things actually work.
One can explain the matter this way. Where at mid-century the prevailing growth model had been a "Keynesian Fordism" that leveraged the "propensity to consume," particularly by way of manufacturing, to expand the Gross Domestic Product, neoliberalism (or "Neoliberal Financialization") leverages the "propensity to invest," and centers on finance, making for the centrality of that speculation-minded, creditism-pumped, digitally-enabled, globalized financial sector that is the hallmark of the era. And far from stepping away from that economic model, governments the world over remain strongly committed to it (hence, along with the tax cuts, etc., also the ever-looser monetary policy, the quantitative easing, and the rest, to keep the gamblers on asset values at the casino table), to the point that alternatives are not even imagined by anyone pretending to mainstream respectability. So long as that remains the case, the most that might be imagined would be Keynesian patches to a Neoliberal system--and that is indeed the most we have seen to date, with the direction of a little corporate welfare to manufacturers, a little more money for social services, rather than even the slightest hint of a New Deal era-like change in model.
Thursday, July 15, 2021
Of British Middle Classness
I have been struck again and again by the long record of attempts by British politicians to persuade their public that the old class differences for which Britain has been famous, and even notorious, have been dissolved within a state of generalized classness, where everyone was somehow vaguely "middle class." Thus there was John Major's rhetoric of a "classless society," and, as if Major had somehow succeeded in bringing that about, Tony Blair's display of open contempt for class differences and class conflict as having "no relevance whatsoever to the modern world" in his 1997 General Election Manifesto while shortly after John Prescott declared that "We are all middle class now." There was also negative affirmation of such pretensions in the denial that there was such a thing as a "respectable" working class anymore, the worthy supposed to have long since been uplifted into the middle class, leaving behind only a residue of "chavs" who have only themselves to blame for their lot--while there was scarcely more acknowledgment of anything above the middle class (Blair characterizing the class conflict he was dismissing as "middle class versus working class"), with commentary about the undeniably ultra-privileged speaking of the children of privilege as merely "upper middle class."
However, the British public has not been entirely persuaded by this talk--indeed, it seems to have been considerably less persuaded than, for example, its American counterpart. After all, the reality of class differences is especially hard to ignore in a country where there is a monarchy and a House of Lords and knighthoods, where there are "public schools" like Eton and the cult of the "gentleman" and the intricate, ostentatious hierarchy of accents George Bernard Shaw so famously satirized to so little real-world consequence--and which, one must remember, is the Old World, with its lower physical mobility, stronger family and community ties, and associated longer memories.
Indeed, Selina Todd holds that so far as Britons are concerned, what makes the working class working class is still the fact that it must live by work, and especially work for others with their means of production, rather than live off of the work of others by way of its possession of means of production--bluntly speaking, by who has power and who does not--even if some of those workers happen to have college degrees, white collar jobs, houses and cars. This would seem reflected in nearly sixty percent of Britons identifying themselves as working class in a recent poll (with, it seems, far higher numbers doing so in regions where the prosperity of the super-rich is less in evidence and less influential, like the northeast, where nearly eighty percent regard themselves as such).
In short, in spite of the persistent efforts of conservatives and neoliberals, it would seem that in Britain the traditional conception of class endures.
The Limits of Reform in Labour Britain
In the post-World War II period Britain had a government committed to full employment and a massively expanded welfare state (with the NHS, free education up to the college level, a two-tier pension system), which also countenanced the unionization of the work force and was prepared to go so far as nationalization to achieve its economic ends. In the resulting circumstances jobs were plentiful, poverty fell, the country enjoyed a "golden age of social mobility," and there was a broad rise in living standards.
Naturally to many progressives looking back at it from the neoliberal era, and the "New Labour" party of Tony Blair, Gordon Brown and Keir Starmer it seems an enviable lot--such that, inheriting the New Labour vision in its moment of total bankruptcy, Keir Starmer evokes the memory of the era, the "spirit of '45," with a frequency matched only by his utter inability to convince.
Still, it is worth remembering that in its post-war heyday the Labour Party's record--even setting aside the extent to which the Conservative governments in power half the time halted or even rolled back many of its initiatives (particularly in areas like public housing), or, for that matter, the way in which an outsized military profile and the associated balance of payments problems tied the Labour Party's own hands with regard to the realization of its own projects when it was in power--was regarded as a great letdown by many of its most ardent supporters.*
In considering this it helps to remember what Old Labour promised. The party's constitution explicitly committed it to the emancipation of the working class on the basis of collective ownership of the means of production and distribution, and many took this seriously, expecting that a Labour government, especially one with a broad mandate like the one elected in '45, would deliver a society that would really be economically, socially, politically equal, where workers would exercise greater power in society and control over their lives--or at least, go a very long way to this. However, the vision of the politically effective portion of the Labour Party was, at least where the near term was concerned, rather less far to the left, rather more technocratic and meritocratic. Their nationalizations were selective and saw nationalized businesses continuing to run on the same business-like lines as before, the scope they allowed unions was for bargaining over wages and conditions and not workers' control of industry (while much of the work force was still not represented by unions at all), their welfare state was funded by contribution rather than distribution, and they preached equality of opportunity rather than equality of outcome (a social ladder still existent, if with working people given somewhat more opportunity to climb it than they had before).
Thus if there was high employment, and poverty was in decline, and workers' living standards rising, even by the differing, more modest metric the center was using to judge the country's progress, it was still not all they might have hoped for. Granted the benefits of postwar consumerism as a consolation prize of sorts as the socialist dream fell by the wayside, workers found that much enjoyment of those benefits required overtime, second incomes and a good deal of credit, which left them more harried, more insecure, and feeling less in control of their lives than they would have hoped to be in a situation where jobs were plentiful and wages going up. And for all the meritocratic talk the record of social mobility looks less impressive on close examination. Where educational and occupational opportunity was concerned, it was far easier to give students more years in school than to really equalize the educational opportunities of the children of manual workers and the children of privilege, let alone open up access to the more prestigious, remunerative careers that young people were likely to dream of someday having, the volume of which did not budge so much. Especially with the economy demanding more assembly-line workers than technologists, more clerks than Chief Executive Officers, more nurses than surgeons, more schoolteachers than Oxford dons or Fleet Street journalists or barristers (never mind novelists), there was not much more room at the top, and traditional public school-and-Oxbridge privilege substantially monopolized it.
The result was that in the end not very many moved up, not many of those who did move up moved very far up, and even those who made headway tended to do so only after great exertions and even sacrifices, at the end of which they may have been unsure the game was worth the candle--while more often than not alert to the difference there would have been in the prospects of talent being recognized and duly rewarded, or simple "return on effort," had they been born in a higher social station.
These disappointments and frustrations were by no means slight, or slight grounds for the disenchantment of many with the situation then (memorably dramatized in the writings of "angry young men" like John Osborne). Still, just as with the post-war American anxieties over a business culture turning the country's white collar-wearing workers into "organization men," those worries can seem a comparative luxury in the post-Thatcher, post-Blair world in which working people are told to not even dream of social safety nets and living wages--a reminder of just how far the expectations of what may scarcely be able to call itself a "left" anymore have fallen.
* In the thirty-four years of the 1945-1979 period the Conservatives were the party of government for seventeen years (1951-1964, 1970-1974), fully half the time, with the earlier thirteen year stretch (with its three General Election victories) arguably critical in reining in the party's grander visions.
Tuesday, June 22, 2021
Of "Middle Classness" and Popular Culture
I have recently been giving some thought to the idea of what it means to be "middle class."
The criteria for middle class status vary from time to time and place to place, but these days, at least, to be a middle class adult seems to mean one's being a college-educated, salary-earning white collar worker with a career rather than a mere job, affording that worker a certain standard of consumption providing a minimum of comfort, mobility, security and opportunity for their children (home and car ownership, health insurance, college for the kids, retirement, a few extras here and there like a night out or a vacation).
Crunching the numbers, it seemed to me that the percentage of Americans who have the full package in even basic form is probably in the single digits; and especially if one insists that, as the mid-century expectation had it, the household gets it on one income, in the low single digits, and likely declining with incomes stagnant and much of the package becoming ever more exorbitant. (Given what college now runs, even those one might judge to be very well-off at a glance can look needy.)
Considering that, it seems there is a great difference indeed between "middle class" and "middle income"--and the mixing up of these two very different categories does not always seem a mere slip-up on the part of those who do it.
It seems, too, that popular culture is way off the mark in presenting what is in fact wealth as if mere middleness were a universal norm, with anything less (the actual norm) a shabby aberration. This makes the obvious question just why it does so.
Obviously popular culture, as Thorstein Veblen already had occasion to note a century ago, caters above all to the upper "affluent middle class," which is, after all, the group actually living something like this, for whom all of it is not so remote as it is to others. Significant, too, is the extent to which the people who make pop culture in Hollywood and elsewhere have generally inhabited a bubble of privilege for generations, with all that implies for perceptions. (When Chris Pine speaks of his background as "blue collar" I do not get the sense that he is being ironic.) And there is the dramatic convenience that people who have little money find the range of possible activity they can undertake very limited--too limited for a TV writer's convenience, certainly. (That working class, really blue collar family? We won't be seeing an episode about their wacky adventures on vacation anytime soon.)
Yet it seems to me undeniable that we would not see so much of this were audiences not receptive to it.
Simply put, there are stories and characters people follow because they relate to them, because they see themselves in them and connect with them--and others they are attracted to because they represent a fantasy of what they wish they were, how they would like for their life to be, and I think the latter is what is relevant here. In reality the viewer may not have a very attractive home--but when looking at the screen they would rather see the spacious, lavishly appointed house rather than cramped and less appealing surroundings like those in which they actually live. Indeed, it may well be that the worse off they are materially, the more they want the escape.
Saturday, June 19, 2021
Transportation as a Service and the End of the Romance of The Car?
As those who follow such matters are well aware, the conventional wisdom regarding self-driving cars has changed profoundly in a few years' time--from matter-of-fact expectation that they will shortly be a large and swiftly growing part of everyday life, to sneering dismissal of the prospect of their appearing anytime soon, and perhaps ever.
To be fair, I do not know that those who dismiss the technology are wrong. Indeed, I acknowledge that the present lowered hopes reflect our having painfully acquired a better understanding of just how tough a task it actually is to develop a car that can drive itself as safely as a competent human driver given the present state of the art in the relevant areas (perhaps most obviously, the power of the computers that must do the "deep learning" on which we are relying). However, it does seem to me that the dismissal is as exaggerated as the earlier hype, for a lot of reasons--one of which is the distaste many seem to have for one of the more transformative possibilities the self-driving car brings with it, namely a turn away from individual vehicle ownership to "Transportation as a Service" (Taas).
Those who have sneered at the prospect (often giving the impression that what they really fear is a shrunken auto market's implications for their particular business) have given many reasons besides technical feasibility. These have prominently included the satisfactions cars render besides transport--what one can call "the romance of the automobile." Exemplary of the tendency, ex-BP CEO Lord Browne emphasized in his Washington Post op-ed that cars are not "simply [a way] to get around," but also "signal our values and extend our private space--things a shared service cannot offer."
Fair enough--except that "signaling our values" to the world at large and "extending our personal space" are comparative luxuries which mean more to some than others, and which some can more easily indulge in than others (facts to which the privileged consistently display an extreme obliviousness). People need transport, pure and simple, and buying such cars as are within their limited means is something they do to meet that need, with any question of immaterial pleasure of far less consequence. Indeed, given how the costs of car ownership weigh on household budgets (any household making under six figures, certainly, is very hard-pressed to afford two vehicles), to say nothing of the other hassles involved (from maintenance to legal liability in the event of accident), it is easy to see many people regarding a transport service merely adequate to meet their needs at a fraction of the price they pay to own and operate a car of their own as a relief.
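To make the budgetary point concrete, here is a minimal sketch of the comparison. All figures are assumptions for illustration--the ownership figure is in the vicinity of commonly cited annual cost-of-ownership estimates, and the per-mile Taas price is purely hypothetical.

```python
# Ownership vs. Taas, on assumed figures (not sourced data).
OWNERSHIP_COST_PER_YEAR = 9_500   # payments, insurance, fuel, maintenance
MILES_PER_YEAR = 12_000
TAAS_PRICE_PER_MILE = 0.50        # hypothetical fleet-service price

taas_cost_per_year = MILES_PER_YEAR * TAAS_PRICE_PER_MILE
annual_savings = OWNERSHIP_COST_PER_YEAR - taas_cost_per_year
print(f"Taas: ${taas_cost_per_year:,.0f}/yr vs. ownership: "
      f"${OWNERSHIP_COST_PER_YEAR:,}/yr (saving ${annual_savings:,.0f})")
```

On those numbers the household pockets $3,500 a year for every car given up--though at an assumed $1.00 a mile the advantage would flip, which is why the per-mile price of such a service would be the crux of the whole question.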
It is easier still to picture people happily abandoning the hassle of car ownership in favor of Taas when we consider the behavior of the younger age cohorts (from early thirtysomethings on down), in whose lives, and even prospects, driving has simply not been so big an element--to the point that far fewer of them bother to get licenses. Some see this as a matter of the preference of many young people for urban over suburban living, and the extent to which, wherever they happen to be, they live their lives online (shopping, socializing, recreating through a screen). However, it is also because they see less point in getting a license when their hard-pressed parents are less able or willing to get them a car, and when buying a car with their own money is a remote prospect (used vehicles are averaging $25,000 these days--all as the minimum wage their college degrees do not save them from still runs $7.25 an hour)--while the same poverty is, after all, a major reason why they spend so much more time doing things online than going out.
Driving later and less even when they have owned cars--something far more of them will never have done, and may not even expect to do--they could be expected to let go of the idea of personal auto ownership that much more easily. Indeed, were Taas to come along in economic circumstances anything like the present, I suspect that many would embrace it and never look back, while the privately owned (and, of course, self-driving) automobile would become something like the private plane--a luxury purchased only by the wealthy few, mostly because they can.
Why Did the Press Get Solar So Wrong?
As I have remarked before, techno-hype seems to periodically go boom and bust--and we are living in a moment of bust as recent expectations surrounding carbon nanotube-based chips, self-driving vehicles, virtual reality, and much else come to naught. Yet looking at the expectations that proved exaggerated I also find myself noting the less publicized technologies that progressed rather more rapidly than the purveyors of hype expected, with renewable energy, and especially photovoltaic solar, the outstanding example--to the point that the fossil fuel and nuclear sectors may now have trillions of dollars in "stranded" investment on their hands.
Why did the press get solar so wrong? It seems to me there are three reasons.
1. Solar energy represents a solution to a major problem. The press traffics in fear, not hope. This actually gives it a reason to belittle anything that would be a solution--and, of course, to believe and repeat any belittling thing said about such solutions, of which there has been no shortage, and which has by no means all been a function of thoughtful analysis.
2. As a disruptive technology up against sustaining technologies (a lot of interests feared and hated the thought of an energy transition) solar faced a profound PR battle, compounded by the ecological, political, and even "culture war" implications of the associated choices. (Bluntly put, there was a lot of investment, far beyond Big Oil, in a fossil fuel-powered economy; a lot of hostility to any notion of government shifting its weight from subsidizing fossil fuels to trying to accelerate an energy transition, a prospect the more plausible if renewable energy looked promising; and in general a lot of enmity toward the idea that the prerogatives of business might have to be compromised for the environment's sake.) Naturally there were plenty of people who did everything they could for a very long time to persuade the public that solar power was just a flaky hippie fantasy, and tough-minded, practical people had better keep their minds on good old king coal instead--and never mind that global warming stuff they were hearing about. And they were the kind of people to whom the press was inclined to listen. The press treats Goldman Sachs with far more respect than Greenpeace, after all--but it was Goldman Sachs that turned out to be wrong.
3. Last, and perhaps least, is the fact that solar, hugely consequential as it is in technical, commercial and ecological terms, is simply not that exciting from the standpoint of the gadget-happy consumer. When it comes to personal, immediate experience, utility-scale solar (and it is this which has smashed the records--not the domestic kind that can mean never paying an electric bill again) still delivers electricity the same way as fossil fuels or nuclear or anything else. People flick the switch and the lights come on, with the actual cause of their coming on far away and unseen. There is thus not much for the purveyors of gadget hype to get all excited about the way they did over the Segway scooter (I still remember when they were telling us this would "change the world!") or virtual assistants. And so this could not and did not offset factors 1 and 2.
All the same, solar has arrived. And if you've been looking for good news about the climate crisis, well, here it is--the best hope yet that we can actually do something about the problem. Hopefully it won't be the last piece of such news.
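The price collapse that blindsided the forecasters is often described as a learning curve: module costs fall a roughly constant fraction with each doubling of cumulative production ("Swanson's law," with the rate commonly put near 20 percent). A minimal sketch of that dynamic, with the starting cost and doubling count chosen purely for illustration:

```python
# Learning-curve ("Swanson's law") sketch; all inputs are illustrative.
def cost_after_doublings(initial_cost: float, doublings: int,
                         learning_rate: float = 0.20) -> float:
    """Cost after n doublings of cumulative production, falling a fixed
    fraction (the learning rate) with each doubling."""
    return initial_cost * (1 - learning_rate) ** doublings

# A module at an assumed $4.00/W, after eight doublings at a 20% rate:
final_cost = cost_after_doublings(4.00, 8)
print(f"${final_cost:.2f}/W")  # roughly $0.67/W
```

The compounding is the point: forecasters extrapolating linearly kept being surprised by a process that was geometric all along.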
Wednesday, June 16, 2021
Why All the Hate for Self-Driving Cars? Seven Reasons Why the Self-Driving Car Bashing Has (Probably) Gone Overboard
Just a few years ago it was the conventional wisdom that self-driving cars were here, in a big way (almost), with people investing a lot of time and worry in how we would deal with the fact.
Today it seems the conventional wisdom that they are very far away at best, and perhaps never coming at all.
Putting it bluntly, the complacent, credulous optimism of 2015 has given way to smug, know-nothing sneering.
It is a swing from one extreme to the other--with a good deal of irrationality involved in the current, pessimistic appraisals, maybe as much as was to be seen in the past, more optimistic appraisals.
I suggest that there are at least seven reasons why the media is, for the time being, so relentlessly sneering in its attitude toward self-driving cars.
1. The Media Exaggerates Everything.
What we collectively call "the media" is, of course, an overwhelmingly commercial enterprise which makes its profits by fighting for and winning your attention in an exceedingly crowded and brutally competitive "attention economy." From this vantage point simple statements are preferable to long ones, which go right along with crude exaggeration being preferable to nuance--while surprising statements are preferable to what people expect.
The short, exaggerated, surprising statement is, of course, particularly commonplace in the area of technological reporting. Of course, my experience is that this tends to an exaggerated impression of how far some technology has come along, or will come along very soon--and this was indeed the case with the self-driving car a short time ago. Yet we also get the opposite, as with renewable energy, which the media was fairly relentless in dismissing . . . until photovoltaic solar became the cheapest energy source in the history of the world, with the price still dropping.
There were reasons why the media got that one so wrong, and just as in that case, numerous factors can seem to impel the media to treat self-driving cars in the same manner that it treated renewable energy--reasons currently more powerful than the earlier gee-whizzery.
2. Self-Driving Cars Scare a Lot of People in the Business.
Those who have followed technological R & D in the past may be familiar with the terms "sustaining innovation" and "disruptive innovation." Sustaining innovations are cases of improvement in the performance of existing products, according to the metrics by which it is already routine to judge them. Disruptive innovations are cases of qualitatively new products that might fundamentally change the market. New technologies aiding fossil fuel extraction (for example, new artificial lift technologies increasing the flow of oil and gas from wells) would be examples of sustaining innovation. By contrast renewable energy technologies like solar or wind are disruptive innovations, because they change the principal game from the ongoing one of "Who can deliver fossil fuels most efficiently?" to "Which energy source can give us electricity most efficiently?"--possibly driving fossil fuels out of this particular market (with coal already looking like a casualty).
Right now the car industry is looking at the prospect of such disruption, in multiple ways, with self-driving one of them, and the more worrisome because of the prospect it raises of "Transportation as a Service" (Taas). Taas would mean that rather than everyone who can afford it getting their own car, cars would be just something they would call up when they actually want one. This would mean a lot fewer cars out there (and those made by someone other than their company, should it fail to keep up in a competition they are by no means guaranteed to win). There would also be a lot less demand for everything that currently goes into cars, from the steel, rubber and glass supplied for the making of those cars (if less publicized than IT, do not underestimate the economic significance of the automotive-industrial complex), to service stations and car insurance (goodbye Flo?), to the oil companies which fill those cars' tanks (if they even run on gas anymore). Naturally a world where Taas replaces the current model of car ownership is something they would be inclined to dismiss or belittle. And the media being what it is, it tends to eat up, and mindlessly repeat, anything such people say. Still, it should be admitted that lately they are even more than usually open to the doubters.
3. The Romance of the Car.
On top of the reality that a great many powerful interests are massively invested in the current model of individual ownership of traditionally "manned" vehicles, there is the reality that many more have a less practical, but not necessarily slight, emotional investment in them.
Consider the place of driving in American culture. Getting one's driver's license is a "rite of passage," and getting an actual car to drive--in the eyes of most Americans a great step toward personal independence and recognition as an adult, all the more so as so much that goes with being an adult (finding and holding down a job, dating) is very difficult to do without a car in the country's pedestrian-hating, transit-deprived metro areas (and still more outside those areas). Proving oneself as a driver and car owner means that one has truly taken that step successfully. The make of one's car is a significant indicator of socioeconomic gradation, two cars in the garage of one's own house bespeaks solid middle classness, and being able to buy one's child a car of their own when they turn sixteen is an indicator that one has provided a genuinely comfortable upbringing, while in the fantasies of wealth that lend credence to the image of a nation of "temporarily embarrassed millionaires," possession of still more luxurious vehicles tends to be prominent. Indeed, the indulgence of a passion for cars--maintaining one's car oneself, collecting cars, restoring some favorite classic car if one has the time and money for such a hobby--has, like the viewing of spectator sports, long been one of the few leisure activities considered seemly for an adult male.
It is all such that the "new car smell"--basically the odor of health-endangering industrial chemicals, which a great many people actually find repugnant--is for Americans a well-known object of fondness (one unshared by their counterparts elsewhere).
Of course, not everyone, even in America, is equally invested in this romance. For many driving is a stressful experience, car ownership burdensome financially and in other ways, and the rewards of less tangible kinds few. (Perhaps their car make is a testament to poverty rather than wealth, and it stings; perhaps they find no pleasure in tinkering with the innards of their car.) But enough people, even people who should know better, are sufficiently invested in it for this to also be a factor here. And for those who think in such terms a world where people do not sit behind the wheel of their own car, and maybe do not even own a car, is unimaginable--or at the very least, depressing--and they are accordingly dismissive of the prospect.
4. We Are in a Moment of Downturn in the Technological Hype Cycle.
The idea of a "cycle" of technological hype, where early inflated expectations are often disappointed, leading to disillusionment, followed by recovery, "green shoots"--and perhaps inflated expectations again--has been popularized by the Gartner firm's much-publicized "cycle." That cycle, of course, tracks attitudes toward individual products, but one may speak of such a cycle being evident in regard to technological change more generally.
The '90s was a period of high expectations regarding technologies like artificial intelligence--and the '00s a period of bust that soon had even the ever-ebullient Ray Kurzweil backing off from predictions that clearly did not come to pass. Of course, there was another resurgence by the mid-'10s regarding many of the same technologies, but now we find ourselves in another period of bust.
Of course, a particular technology may make headway even in a period of bust. (Were this not the case we would see no recovery.) Yet common expectations--and this is what we are talking about here--are colored by the general mood, especially to the extent that the unknowns involved in prediction leave observers relying on "judgment" rather than analysis grounded in hard fact. Their gut feeling would seem more likely to err on the side of "Won't happen" than "Will," with all this implies for the broader conversation at the moment.
5. Cynicism is a Good Cover for Ignorance.
Added to these reasons for pessimism (the bad-mouthing of self-driving cars by prominent figures, the old romance of cars and car ownership, the lower expectations of technological change), there is the fact that striking cynical poses helps a journalist who is actually far out of their depth look as if they are not out of their depth. They do not actually understand the technology sufficiently well to render a judgment about it one way or the other (even to the extent that a layperson could) but sneering at least makes them look like they are resistant to a sales pitch--and thus possess a sophistication they do not really have, and never earned.
Put simply, they are the "pseudomature" kids that dimwitted conformists think are "cool." ("I don't get excited by anything. I've seen it all. You can't impress me!")
Of course, part of the "cool kid" package is being a mean-spirited little bully making those who do not say and wear and do the "right" things feel bad about themselves, as publicly as possible, to affirm that they are indeed the cool kid. Right now self-driving cars seem like an easy target for their kind. ("Oh, you were hoping for a self-driving car? Ain't happening. Ha ha!" It does not seem unimportant that many of those who might have most hoped to see self-driving, and especially Taas, make their lives a little better are the old, the young, the disabled, the poor--people who happen to be marginalized in one way or another, and thus favorite prey for bullies.)
6. Elon Musk is Making Self-Driving Cars (and the People Optimistic About Them) Look Bad.
As if all this were not bad enough there are those who make the cynicism easier still, especially a certain "tech billionaire."
Of course, it was never the case that Elon Musk was the only figure from the car industry talking up self-driving vehicles. But he always beat out the rest of the competition when it came to the sheer aggressiveness of his predictions--starting with his claim that his company Tesla would deliver a truly autonomous vehicle by 2017.
Because the predictions were so startling, and because of his high personal profile (higher than that of any CEO of the established car companies), it was those and not the more cautious predictions that others made that monopolized public attention.
Of course, those predictions, which were as near term as they were dramatic, were not forgotten when the day came, and Musk proved very, very wrong--none of which stopped him from making similarly aggressive predictions again and again. (2017 saw Musk simply say 2019, and then when 2019 came along all he really had to offer were more promises that likewise failed to come to pass.) Soon he was even making claims that the cars had already arrived on the market in the form of his Teslas' "Full Self-Driving" when what had been delivered was actually very far from that. To put it mildly, those dismissive of self-driving cars have been having a field day with this track record.
7. Commentators Are Overcompensating for their Earlier Gullibility.
Even beyond the exaggeration built into the business, the cynical poses, and the rest, there is the reality that the media was--as noted previously--telling a very different story a short while ago. Our remembering that makes them look very foolish. And now they are anxious to shore up what credibility they think they have in the public's eyes on this matter. "I knew it!" they want to say about the way self-driving cars failed to materialize by 2019, 2020, early 2021--but this is practically an invitation to check up on what they said before, a thing doable with a few clicks, which would give away the lie. And so instead they thunder on about the impossibility of the machines so loudly, so passionately and at such length that any recollection that there had ever been anything different soon slips from the feeble memories of most.
So far as I can tell, this has already happened for most.
Today it seems to be the conventional wisdom that self-driving cars are very far away at best, and perhaps never coming at all.
Putting it bluntly, the complacent, credulous optimism of 2015 has given way to smug, know-nothing sneering.
It is a swing from one extreme to the other--with a good deal of irrationality involved in the current, pessimistic appraisals, maybe as much as was to be seen in the earlier, more optimistic ones.
I suggest that there are at least seven reasons why the media is, for the time being, so relentlessly sneering in its attitude toward self-driving cars.
1. The Media Exaggerates Everything.
What we collectively call "the media" is, of course, an overwhelmingly commercial enterprise which makes its profits by fighting for and winning your attention in an exceedingly crowded and brutally competitive "attention economy." From that vantage point short, simple statements are preferable to long ones--which goes right along with crude exaggeration being preferable to nuance--while surprising statements are preferable to confirmations of what people already expect.
The short, exaggerated, surprising statement is, of course, particularly commonplace in the area of technology reporting. My experience is that this tends to produce an exaggerated impression of how far some technology has come along, or will come along very soon--and this was indeed the case with the self-driving car a short time ago. Yet we also get the opposite, as with renewable energy, which the media was fairly relentless in dismissing . . . until photovoltaic solar became the cheapest energy source in the history of the world, with the price still dropping.
There were reasons why the media got that one so wrong, and just as in that case, numerous factors impel the media to treat self-driving cars in the same manner that it treated renewable energy--factors currently more powerful than the earlier gee-whizzery.
2. Self-Driving Cars Scare a Lot of People in the Business.
Those who have followed technological R & D in the past may be familiar with the terms "sustaining innovation" and "disruptive innovation." Sustaining innovations are cases of improvement in the performance of existing products, according to the metrics by which it is already routine to judge them. Disruptive innovations are cases of qualitatively new products that might fundamentally change the market. New technologies aiding fossil fuel extraction (for example, new artificial lift technologies increasing the flow of oil and gas from wells) would be examples of sustaining innovation. By contrast renewable energy technologies like solar or wind are disruptive innovations, because they change the principal game from the ongoing one of "Who can deliver fossil fuels most efficiently?" to "Which energy source can give us electricity most efficiently?"--possibly driving fossil fuels out of this particular market (with coal already looking like a casualty).
Right now the car industry is looking at the prospect of such disruption in multiple ways, with self-driving one of them--and the more worrisome because of the prospect it raises of "Transportation as a Service" (TaaS). TaaS would mean that rather than everyone who can afford it getting their own car, cars would be just something people call up when they actually want one. This would mean far fewer cars out there (and those made by someone other than their company, should it fail to keep up in a competition it is by no means guaranteed to win). There would also be far less demand for everything that currently goes into cars, from the steel, rubber and glass supplied for their making (if less publicized than IT, do not underestimate the economic significance of the automotive-industrial complex), to service stations and car insurance (goodbye Flo?), to the oil companies which fill those cars' tanks (if they even run on gas anymore). Naturally a world where TaaS replaces the current model of car ownership is something such interests would be inclined to dismiss or belittle. And the media being what it is, it tends to eat up, and mindlessly repeat, anything such people say. Still, it should be admitted that lately the media is even more than usually open to the doubters.
3. The Romance of the Car.
On top of the reality that a great many powerful interests are massively invested in the current model of individual ownership of traditionally "manned" vehicles, there is the reality that many more have a less practical, but not necessarily slight, emotional investment in them.
Consider the place of driving in American culture. Getting one's driver's license is a "rite of passage," and getting an actual car to drive is, in the eyes of most Americans, a great step toward personal independence and recognition as an adult--the more so as so much that goes with being an adult (finding and holding down a job, dating) is very difficult to do without a car in the country's pedestrian-hating, transit-deprived metro areas (and still more so outside those areas). Proving oneself as a driver and car-owner means that one has truly taken that step successfully. The make of one's car is a significant indicator of socioeconomic gradation; two cars in the garage of one's own house bespeak solid middle-classness; being able to buy one's child a car of their own when they turn sixteen indicates that one has provided a genuinely comfortable upbringing; and in the fantasies of wealth that lend credence to the image of a nation of "temporarily embarrassed millionaires," possession of still more luxurious vehicles tends to be prominent. Indeed, the indulgence of a passion for cars--maintaining one's car oneself, collecting cars, restoring some favorite classic if one has the time and money for such a hobby--has, like the viewing of spectator sports, long been one of the few leisure activities considered seemly for an adult male.
It is all such that the "new car smell"--basically the smell of a bunch of health-endangering industrial chemicals, which a great many people actually find repugnant--is for Americans a well-known object of fondness, one unshared by their counterparts elsewhere.
Of course, not everyone, even in America, is equally invested in this romance. For many driving is a stressful experience, car ownership burdensome financially and in other ways, and the less tangible rewards few. (Perhaps their car's make is a testament to poverty rather than wealth, and it stings; perhaps they find no pleasure in tinkering with the innards of their car.) But enough people, even people who should know better, are sufficiently invested in it for this to also be a factor here. And for those who think in such terms a world where people do not sit behind the wheel of their own car, and maybe do not even own a car, is unimaginable--or at the very least, depressing--and they are accordingly dismissive of the prospect.
4. We Are in a Moment of Downturn in the Technological Hype Cycle.
The idea of a "cycle" of technological hype, where early inflated expectations are often disappointed, leading to disillusionment, followed by recovery, "green shoots"--and perhaps inflated expectations again--has been popularized by the Gartner firm's much-publicized "cycle." That cycle, of course, tracks attitudes toward individual products, but one may speak of such a cycle being evident in regard to technological change more generally.
The '90s was a period of high expectations regarding technologies like artificial intelligence--and the '00s a period of bust that soon had even the ever-ebullient Ray Kurzweil backing off from predictions that clearly did not come to pass. Of course, there was another resurgence by the mid-'10s regarding many of the same technologies, but now we find ourselves in another period of bust.
Of course, a particular technology may make headway even in a period of bust. (Were this not the case we would see no recovery.) Yet common expectations--and this is what we are talking about here--are colored by the general mood, especially to the extent that the unknowns involved in prediction leave observers relying on "judgment" rather than analysis grounded in hard fact. Their gut feeling would seem more likely to err on the side of "Won't happen" than "Will," with all this implies for the broader conversation at the moment.
5. Cynicism is a Good Cover for Ignorance.
Added to these reasons for pessimism (the bad-mouthing of self-driving cars by prominent figures, the old romance of cars and car ownership, the lowered expectations of technological change), there is the fact that striking cynical poses helps a journalist who is actually far out of their depth look otherwise. They do not actually understand the technology well enough to render a judgment about it one way or the other (even to the extent that a layperson could), but sneering at least makes them look resistant to a sales pitch--and thus possessed of a sophistication they do not really have, and never earned.
Put simply, they are the "pseudomature" kids that dimwitted conformists think are "cool." ("I don't get excited by anything. I've seen it all. You can't impress me!")
Of course, part of the "cool kid" package is being a mean-spirited little bully making those who do not say and wear and do the "right" things feel bad about themselves, as publicly as possible, to affirm that they are indeed the cool kid. Right now self-driving cars seem like an easy target for their kind. ("Oh, you were hoping for a self-driving car? Ain't happening. Ha ha!" It does not seem unimportant that many of those who might have most hoped to see self-driving, and especially TaaS, make their lives a little better are the old, the young, the disabled, the poor--people who happen to be marginalized in one way or another, and thus favorite prey for bullies.)
6. Elon Musk is Making Self-Driving Cars (and the People Optimistic About Them) Look Bad.
As if all this were not bad enough there are those who make the cynicism easier still, especially a certain "tech billionaire."
Of course, it was never the case that Elon Musk was the only figure from the car industry talking up self-driving vehicles. But he always beat out the rest of the competition when it came to the sheer aggressiveness of his predictions--starting with his claim that his company Tesla would deliver a truly autonomous vehicle by 2017.
Because the predictions were so startling, and because of his high personal profile (higher than that of any CEO of the established car companies), it was those and not the more cautious predictions that others made that monopolized public attention.
Of course, those predictions, which were as near-term as they were dramatic, were not forgotten when the day came and Musk proved very, very wrong--none of which stopped him from making similarly aggressive predictions again and again. (In 2017 Musk simply said 2019, and when 2019 came along all he really had to offer were more promises that likewise failed to come to pass.) Soon he was even claiming that the cars had already arrived on the market in the form of his Teslas' "Full Self-Driving" feature, when what had actually been delivered was very far from that. To put it mildly, those dismissive of self-driving cars have been having a field day with this track record.
7. Commentators Are Overcompensating for their Earlier Gullibility.
Even beyond the exaggeration built into the business, the cynical poses, and the rest, there is the reality that the media was--as noted previously--telling a very different story a short while ago. Our remembering that makes them look very foolish. And now they are anxious to shore up what credibility they think they have in the public's eyes on this matter. "I knew it!" they want to say about the way self-driving cars failed to materialize by 2019, 2020, early 2021--but this is practically an invitation to check up on what they said before, a thing doable with a few clicks, which would give away the lie. And so instead they thunder on about the impossibility of the machines so loudly, so passionately and at such length that any recollection that there had ever been anything different soon slips from the feeble memories of most.
So far as I can tell, this has already happened for most.
Friday, June 11, 2021
NASA's "Technology Readiness Levels": A System Worth Learning
Back in the 1970s NASA developed a nine-level system of "Technology Readiness Levels" (since widely adopted by other American and foreign agencies) as a way of measuring just how far a technology has progressed from concept to reality.
Level 1, the lowest, indicates that the "basic principles" of the technology have been "observed and reported"--that, at the risk of putting it crudely, someone is telling us that a technology is feasible "in theory." This is, of course, very, very far from such a technology actually becoming "a thing," as reflected in NASA's definitions document making the exit criterion for Level 1 "[p]eer reviewed publication of research underlying the proposed concept/application."
By contrast a technology at Level 9 is one which has already proceeded through the "validation" of key elements in a "laboratory environment" (Level 4), demonstrations of a "prototype in an operational environment" (Level 6), and even the completion and "flight qualifi[cation]" of an actual system "through test and demonstration" (Level 8), to the system's being "proven through successful mission operations" (Level 9), proper documentation of which means that the system has fully "graduated" from development.
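The nine-level progression described above can be sketched as a simple lookup table. This is only an illustration: the level descriptions below are paraphrased (some from the wording quoted in this post, the rest from memory of NASA's definitions, so they may not match the official document word for word), and the helper function is my own invention.

```python
# Sketch of NASA's Technology Readiness Levels as a lookup table.
# Descriptions are paraphrased, not quoted from the official definitions.
TRL = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental proof of concept",
    4: "Key elements validated in a laboratory environment",
    5: "Key elements validated in a relevant environment",
    6: "Prototype demonstrated in an operational environment",
    7: "System prototype demonstrated in an operational environment",
    8: "Actual system completed and flight qualified through test and demonstration",
    9: "Actual system proven through successful mission operations",
}

def describe_trl(level: int) -> str:
    """Return a one-line description for a readiness level from 1 through 9."""
    if level not in TRL:
        raise ValueError(f"TRL must be between 1 and 9, got {level}")
    return f"TRL {level}: {TRL[level]}"

print(describe_trl(2))
print(describe_trl(9))
```

The point of laying it out this way is how little room the scheme leaves for fudging: a technology sits on exactly one rung, and each rung has a concrete exit criterion.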
When you read a great deal about technological research and development you quickly find that a very great deal of journalism talks about technologies that are on Level 8, or 6, or 4, or 2 (or even 1) as if they were on Level 9. Consider, for instance, the media reports about a paper in the Journal of Plasma Physics last year discussing a concept for a plasmoid-based thruster that might deliver an exhaust velocity of over a million miles an hour. (When Mars is at its closest this would get us to the red planet in under two days--less time than it took the Apollo missions to make the far shorter trip to our moon, while it would get us to the moon in less time than it probably takes you to get to work in the morning.)
Exciting stuff for those of us who follow space technology? Absolutely. But the wording of the mainstream media reports, which used the word "invented" in reference to the thruster (a term associated with the actual physical existence of a thing), gave the false impression that a technology that would seem to be at Level 2 (published paper) was at Level 4 (key elements being validated), or even higher (some pieces giving the impression that the thing is on its way to the launch pad for its first flight). Perhaps it will be, someday, but the point is that it is not there now, or even a sure thing in the near term. (The design uses a tokamak fusion reactor to generate those plasmoids, after all. Researchers have been working on tokamaks since 1958 and have yet to reach "fusion breakeven"--a fact unacknowledged in the pieces I have read.)
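The back-of-the-envelope arithmetic behind those travel times is easy to check. The sketch below assumes a constant cruise of one million miles per hour and uses approximate distances (roughly 34.8 million miles for Earth and Mars at their closest, roughly 239,000 miles for the average Earth-Moon distance), and it deliberately ignores the time a real spacecraft would spend accelerating and decelerating.

```python
# Rough travel-time check for a constant 1,000,000 mph cruise.
# Distances are approximate; acceleration and deceleration are ignored.
SPEED_MPH = 1_000_000

MARS_CLOSEST_MILES = 34_800_000  # approximate closest Earth-Mars distance
MOON_MILES = 239_000             # approximate average Earth-Moon distance

mars_hours = MARS_CLOSEST_MILES / SPEED_MPH   # about 35 hours, i.e. under two days
moon_minutes = MOON_MILES / SPEED_MPH * 60    # about 14 minutes

print(f"Mars at closest: {mars_hours:.1f} hours ({mars_hours / 24:.2f} days)")
print(f"Moon: {moon_minutes:.1f} minutes")
```

Even on these generous assumptions the numbers bear out the claims in the paragraph above: under two days to Mars at its closest, and a lunar trip shorter than many commutes.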
The strong contrary impression the writing on the matter offers is just one (if particularly blatant) example of the illiterate and irresponsible tech journalism in which we are awash. There is no excusing it, especially in this age where R & D so often entails such massive investment, and so many of the major problems we face require, as at least part of the solution, our investing in the right technologies.
It seems to me that the popularization of the Technology Readiness Levels system, or at least something like it, could be helpful in clearing these things up because of its admirably clear benchmarks. (Either the paper has been published, or it hasn't; either the prototype has been completed, and tested, or it hasn't.) Perhaps it could become something like critics' use of the famed star system when talking about movies--the Level a technology is on, maybe even the Level it might reach by such and such a date, given as a matter of course. In the absence of such a courtesy on the part of the author, we can consider what they have to say in terms of that system ourselves--judging whether they are presenting something of substance, or subjecting us to the simple-minded gee-whizzery of which there is always too much going around, eternally distorting a very important conversation.