Thursday, August 01, 2013

"Every Day, In Every Way, Things Are Growing Better & Better" — Professor Robert Gordon Says: "Not So Fast, My Friends..."

Northwestern economics professor Robert Gordon fell flat on his face at a recent TED conference. The dismal scientist was too dismal for the TED attendees. If this is (fair & balanced) economic truth, so be it.

[x NY 'Zine]
The Blip
By Benjamin Wallace-Wells

Picture this, arranged along a time line.

For all of measurable human history up until the year 1750, nothing happened that mattered. This isn’t to say history was stagnant, or that life was only grim and blank, but the well-being of average people did not perceptibly improve. All of the wars, literature, love affairs, and religious schisms, the schemes for empire-making and ocean-crossing and simple profit and freedom, the entire human theater of ambition and deceit and redemption took place on a scale too small to register, too minor to much improve the lot of ordinary human beings. In England before the middle of the eighteenth century, where industrialization first began, the pace of progress was so slow that it took 350 years for a family to double its standard of living. In Sweden, during a similar 200-year period, there was essentially no improvement at all. By the middle of the eighteenth century, the state of technology and the luxury and quality of life afforded the average individual were little better than they had been two millennia earlier, in ancient Rome.

Then two things happened that did matter, and they were so grand that they dwarfed everything that had come before and encompassed most everything that has come since: the first industrial revolution, beginning in 1750 or so in the north of England, and the second industrial revolution, beginning around 1870 and created mostly in this country. That the second industrial revolution happened just as the first had begun to dissipate was an incredible stroke of good luck. It meant that during the whole modern era from 1750 onward—which contains, not coincidentally, the full life span of the United States—human well-being accelerated at a rate that could barely have been contemplated before. Instead of permanent stagnation, growth became so rapid and so seemingly automatic that by the fifties and sixties the average American would roughly double his or her parents’ standard of living. In the space of a single generation, for most everybody, life was getting twice as good.

At some point in the late sixties or early seventies, this great acceleration began to taper off. The shift was modest at first, and it was concealed in the hectic up-and-down of yearly data. But if you examine the growth data since the early seventies, and if you are mathematically astute enough to fit a curve to it, you can see a clear trend: The rate at which life is improving here, on the frontier of human well-being, has slowed.

If you are like most economists—until a couple of years ago, it was virtually all economists—you are not greatly troubled by this story, which is, with some variation, the consensus long-arc view of economic history. The machinery of innovation, after all, is now more organized and sophisticated than it has ever been, human intelligence is more efficiently marshaled by spreading education and expanding global connectedness, and the examples of the Internet, and perhaps artificial intelligence, suggest that progress continues to be rapid.

But if you are prone to a more radical sense of what is possible, you might begin to follow a different line of thought. If nothing like the first and second industrial revolutions had ever happened before, what is to say that anything similar will happen again? Then, perhaps, the global economic slump that we have endured since 2008 might not merely be the consequence of the burst housing bubble, or financial entanglement and overreach, or the coming generational trauma of the retiring baby boomers, but instead a glimpse at a far broader change, the slow expiration of a historically singular event. Perhaps our fitful post-crisis recovery is no aberration. This line of thinking would make you an acolyte of a 72-year-old economist at Northwestern named Robert Gordon, and you would probably share his view that it would be crazy to expect something on the scale of the second industrial revolution to ever take place again.

“Some things,” Gordon says, and he says it often enough that it has become both a battle cry and a mantra, “can happen only once.”

Gordon assumed his present public identity—as a declinist and an accidental social theorist, as a roving publicist of depressing PowerPoints—last August, when he presented his theory in a working paper titled “Is U.S. Economic Growth Over?” [PDF] He has held a named chair at Northwestern for decades and is one of the eminent macroeconomists of his generation, but the scope of his bleakness has given him, over the past year, a newfound public profile. It has been a good time to be bleak, and Gordon, bleaker than everyone else, commands attention. “Very impressive,” the former Treasury secretary Larry Summers wrote Gordon from his iPad the day after the paper appeared. Ben Bernanke, the Federal Reserve chairman, delivered a commencement address this spring considering the paper’s implications, and the financial press has weighed in vociferously for and against.

Gordon has two predictions to offer, the first of which is about the near future. For at least the next fifteen years or so, Gordon argues, our economy will grow at less than half the rate it has averaged since the late-nineteenth century because of a set of structural headwinds that Gordon believes will be even more severe than most other economists do: the aging of the American population; the stagnation in educational achievement; the fiscal tightening to fix our public and private debt; the costs of health care and energy; the pressures of globalization and growing inequality. Over the past year, some other economists who once agreed with Gordon—most prominently Tyler Cowen of George Mason University—have taken note of the recent discoveries of abundant natural-gas reserves in the United States, and of the tentative deflation of health-care costs, and softened their pessimism. But to Gordon these are small corrections that leave the basic story unchanged. He believes we can no longer expect to double our standard of living in one generation; it will now take at least two. The common expectations, in other words, that your children will attend college even if you haven’t, or that they will have twice as rich a life, no longer look realistic in this view. Some of these hopes are already outdated: The generation of Americans now in their twenties is the first to not be significantly better educated than their parents. If Gordon is right, then for all but the wealthiest one percent of Americans, the rate of improvement in the standard of living—year over year, and generation after generation—will be no faster than it was during the dark ages.
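
A rough check of the doubling arithmetic, using the standard rule-of-70 formula (the 30-year generation length here is an illustrative assumption, not a figure from Gordon): at a steady per-capita growth rate g, living standards double after T years, where

    (1 + g)^T = 2, so T = ln(2) / ln(1 + g) ≈ 0.70 / g

Doubling within a single 30-year generation thus requires growth of roughly 0.70/30 ≈ 2.3 percent a year; stretching the doubling across two generations implies about half that, roughly 1.2 percent. By the same arithmetic, pre-industrial England’s 350-year doubling works out to around 0.2 percent a year.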

Gordon’s second prediction is almost literary in its scope. The forces of the second industrial revolution, he believes, were so powerful and so unique that they will not be repeated. The consequences of that breakthrough took a century to be fully realized, and as the internal combustion engine gave rise to the car and eventually the airplane, and electricity to radio and the telephone and then mass media, they came to rearrange social forces and transform everyday lives. Mechanized farm equipment permitted people to stay in school longer and to leave rural areas and move to cities. Electrical appliances allowed women of all social classes to leave behind housework for more fulfilling and productive jobs. Air-conditioning moved work indoors. The introduction of public sewers and sanitation reduced illness and infant mortality, improving health and extending lives. The car, mass media, and commercial aircraft led to a liberation from the narrow confines of geography and an introduction to a far broader and richer world. Education beyond high school was made accessible, in the aftermath of World War II, to the middle and working classes. These are all consequences of the second industrial revolution, and it is hard to imagine how those improvements might be extended: Women cannot be liberated from housework to join the labor force again, travel is not getting faster, cities are unlikely to get much more dense, and educational attainment has plateaued. The classic example of the scale of these transformations is Paul Krugman’s description of his kitchen: The modern kitchen, absent a few surface improvements, is the same one that existed half a century ago. But go back half a century before that, and you are talking about no refrigeration, just huge blocks of ice in a box, and no gas-fired stove, just piles of wood. If you take this perspective, it is no wonder that the productivity gains have diminished since the early seventies. The social transformations brought by computers and the Internet cannot match any of this.

But even if they could, that would not be enough. “The growth rate is a heavy taskmaster,” Gordon says. The math is punishing. The American population is far larger than it was in 1870, and far wealthier to begin with, which means that the innovations will need to be more transformative to have the same economic effect. “I like to think of it this way,” he says. “We need innovations that are eight times as important as those we had before.”

There are many ways in which you can interpret this economic model, but the most lasting—the reason, perhaps, for the public notoriety it has brought its author—has little to do with economics at all. It is the suggestion that we have not understood how lucky we have been. The whole of American cultural memory, the period since World War II, has taken place within the greatest expansion of opportunity in the history of human civilization. Perhaps it isn’t that our success is a product of the way we structured our society. The shape of our society may be far more conditional, a consequence of our success. Embedded in Gordon’s data is an inquiry into entitlement: How much do we owe, culturally and politically, to this singular experience of economic growth, and what will happen if it goes away?

There are some people, scattered around this planet, for whom the question of economic growth many years hence is urgently important, for whom it seems to blot out all other matters. Economists, and think-tankers, and environmentalists concerned with climate change, and the dreamier kind of CNBC host, yes. But also ordinary people—liberals alarmed about their children’s student debt or conservatives outraged about the national deficit—who are not convinced that we will grow rich enough to pay these bills in the future, who hold ambient anxieties that things are getting not better but worse.

Among growth-worriers, there is a science-fiction streak. To be possessed by nightmares about the future requires that one be dreaming about the future in the first place. I don’t think I have had a single conversation about long-term economic growth that did not involve a detour into the matter of robots. Not robotization, but robots: how their minds worked, their strategies when engaged in a game of chess. Very strong and well-defended opinions about the driverless car are held. People in this camp are open to the possibility that the future could be very different from the present, and so robots, evocative of a wholly transformed world—perhaps for good, perhaps not—are of special interest. One leading theorist in the Gordon camp urged me to read a Carter-era text called The Zero Sum Society (1980), which suggests a grim dystopia that emerges once economic growth hits zero point zero, at which moment to gain anything requires that you take it from somebody else. “Once you start to think about growth,” the Nobel laureate Robert Lucas has said, “it is hard to think about anything else.”

Earlier this year, Gordon flew out to Long Beach to give a TED talk detailing his theory and its implications. TED’s audience is so primed for optimism about the future that Gordon, a rebuker of futurists, knew before he began that he’d lost the room—not in a Seth MacFarlane–at–the–Academy Awards way, but in a Bill O’Reilly–at–Al Sharpton’s–political–group kind of way, as a matter of tribal identity. TED had invited MIT’s Erik Brynjolfsson, an expert in the economics of technology and a known optimist about future breakthroughs, to give the counterpoint address. Gordon (short, round, and earnest) projects a donnish air; Brynjolfsson (tall, redheaded, bearded), the kind of cocky casualness that in Silicon Valley scans as cool. Gordon gave his account; introduced his graph; emphasized the abject poverty of life at the turn of the twentieth century; demonstrated how American deficiencies in education, inequality, and demographics limited how much our economy might grow—and then, sensing that the crowd was not all that much moved, sat back to watch Brynjolfsson make the case against.

Brynjolfsson let a long beat elapse. “Growth is not dead,” he said casually, and then he grinned a little bit, and the audience laughed, and the tension that had lingered after Gordon’s pessimism dissipated. Brynjolfsson had the aspirational TED inflection down cold: “Technology is not destiny,” he said. “We shape our destiny.”

The second industrial revolution itself, he said, proved the point. After factories were electrified, Brynjolfsson explained, “the amazing thing is productivity didn’t increase in those factories for 30 years—30 years!” It sometimes takes a while for humans to figure out how to use innovations, he said, and perhaps we are just now beginning to comprehend the full possibilities of computerization. In Brynjolfsson’s view, we are now in the beginnings of the new machine age, an extended moment of revolution in artificial intelligence. “A child’s PlayStation,” he said, is more powerful than a military supercomputer from 1996; a chess program contained on a cell phone can defeat every grandmaster. Brynjolfsson pointed out that Watson, the IBM AI project, having successfully amassed enough everyday knowledge to defeat the grand champions on "Jeopardy!," was “now applying for jobs at call centers, and getting them. In finance, and in law, and getting them.”

Economists often note that even experts are very bad at predicting the world to come and constantly underestimate it. Optimists like Brynjolfsson say that though productivity gains from computer technologies have declined since 2004, that’s no reason to expect the decline to continue. They see prospects. A recent McKinsey report detailing economic sectors that might grow found great possibilities in intelligent machines: trillions of dollars in the so-called Internet of Things, for instance, and in 3-D printing.

I called Brynjolfsson at his office at MIT to try to get a better sense of what a roboticized society might look like. It turns out the optimist’s case is darker than I expected. “The problem is jobs,” he said. Sixty-five percent of American workers, Brynjolfsson explained, occupy jobs whose basic tasks can be classified as information processing. If you are trying to find a competitive advantage for people over machines, this does not bode well: “The human mind did not evolve to multiply triple-digit numbers,” he told me. The robot mind has. In other words, the long history of Marx-inflected pleas, from "Bartleby" (1853) through to "Fight Club," that office work was dehumanizing may have been onto something. Those jobs were never really designed for the human mind. They were designed for robots. The existing robots just weren’t good enough to take them. At first.

At opposite ends of the pay scale, there are jobs that seem safe from the robot menace, Brynjolfsson said—high-paying creative and managerial work, and non-routine physical work, like gardening. (The smartest machines still struggle to recognize an ordinary kitchen fork if it is rotated by 30 degrees.) As for the 65 percent of us who are employed in “information processing” jobs, Brynjolfsson said, the challenge is to integrate human skills with machine capacities—his phrase is “racing with machines.” He mentioned a biotech company that relied on human workers to refine the physical shapes of synthetic proteins, jobs at which the most sophisticated algorithms remain hopeless. I expressed some doubts about how many jobs there might be in endeavors like this. “The grand challenge is: Can we scale them up?” Brynjolfsson said. “We haven’t seen that yet. Otherwise, employment would be going up rather than down.”

Even among the most committed stagnation theorists, there is little doubt that innovation will continue—that our economy will continue to be buttressed by new ideas and products. But the great question at the center of the growth argument is how transformative those breakthroughs will be, and whether they will have the might to improve human experience as profoundly as the innovations of a century ago. One way to think about economic growth is as a product of human capital and technology: At moments like this, when human capital is not growing much (when the labor force is unlikely to grow, when it is not becoming more educated), all of the pressure rests on technology. For this reason, some economists who think Gordon greatly understates the potential of computers still agree that it will be hard for technology to sustain the growth rates we’ve become accustomed to. “We’re not going to get to 2.25 percent GDP growth—that’s way out on the tail,” Dale Jorgenson of Harvard told me. “There’s going to be a slowdown. It’s not a secular stagnation. It’s a change in demography. And this is a watershed event.”
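
That framing can be made concrete with the textbook growth-accounting identity (the standard Solow-style decomposition, to be clear, not a formula taken from Gordon or Jorgenson): output growth splits into technology growth plus input growth,

    g_Y ≈ g_A + α·g_K + (1 − α)·(g_L + g_h)

where g_Y is GDP growth, g_A is total-factor-productivity (technology) growth, g_K is capital growth, g_L is labor-force growth, g_h is growth in skills per worker, and α is capital’s share of income, conventionally about one third. When demography flattens g_L and educational attainment flattens g_h, holding g_Y steady forces the technology term g_A to carry nearly all the load, which is the arithmetic behind Jorgenson’s “watershed.”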

Provoked by Gordon’s paper, Daniel Sichel of Wellesley and a team of collaborators have worked out a model by which future U.S. growth might match the rates it has historically achieved. It was not a science-fiction scenario, Sichel explained to me; it required a faster rate of improvements in microprocessor technology, and new computer technologies to be adopted quickly by sectors (education, health care) that have tended to move more slowly. But this is Sichel’s optimistic model; his median projection—his sense of what is most likely to happen—isn’t much more hopeful than Gordon’s. That we might continue to experience the kind of growth we’ve enjoyed for the past several decades remains a defensible possibility. But so does Gordon’s idea, that something great is gone.

In 2007, Mexicans stopped emigrating to the United States. The change was not very big at first, and so for a few years it seemed like it might be a blip. But it wasn’t. In 2000, 770,000 Mexicans had come across the Rio Grande, but by 2007 fewer than 300,000 did, and by 2010, even though violence in Mexico seemed ceaseless, there were fewer than 150,000 migrants. Some think that more Mexicans are now leaving the United States than are coming to it. “We’re never going to get back to the numbers we had in the late nineties,” says Wayne Cornelius, a political scientist at UC–San Diego who has spent the past 40 years studying this cross-border movement. A small part of this story is the increase in border protection, but the dominant engine has been the economic shifts on both sides of the border—it has become easier for poor Mexicans to improve their quality of life in Mexico and harder to do so in the United States. Because migrants from a particular Mexican village often settle in the same American place, they provide a fast conduit of economic information back home: There are no jobs in construction or housing. Don’t come. The Pew Hispanic Center has traced the migration patterns to economic performance in real time: a spike of migration during 1999 and 2000, at the height of the boom; a brief downturn in border crossing after the 2001 stock-market crash followed by a plateau; then the dramatic emptying out after the housing industry gave way in 2006. We think of the desire to be American as a form of idealism, and sometimes it is. But it also has something to do with economic growth. We are a nation of immigrants to the extent that we can make immigrants rich.

These hingelike mechanisms, in which social changes depend upon the promise of rapidly escalating well-being, are studded throughout the aftermath of the second industrial revolution. The United States did not really become a melting pot until the 1880s, when the economy was beginning to draw on the breakthroughs of electricity and the engine and attract migrants from Southern and Eastern Europe. The labors that housework required in the nineteenth century were so consuming that housewives in North Carolina walked 148 miles a year carrying 35 tons of water for nonautomated chores. It took until the fifties for household appliances to decline so much in price that they were ubiquitous; the next decade was that of women’s liberation. The prospects for African-American employment increased most dramatically during World War II and in the period just after: 16.4 percent of black men held middle-class jobs in 1950; by 1960 it was 24 percent; by 1970, 35 percent. Progressives will often describe the history of social liberation by quoting Martin Luther King Jr.’s line that the arc of the moral universe bends toward justice; the implication is that metaphysics are somehow involved. But this history has also taken place during unique economic times, and perhaps that is not coincidence.

If you buy Gordon’s story, then the effect of the second industrial revolution was to replace the specific entitlement of the Gilded Age (of family, of place of birth) with a powerful general entitlement, earned simply through citizenship. “Just the fact of being an American male and graduating from high school meant you could have a good-paying job and expect that you could have children who would double your own standard of living,” Gordon says. This certainty, that the future would be so much better than the past that it could be detected in the space of a generation, is what we call the American Dream. The phrase itself was coined only in 1931, once the gains of the second industrial revolution had dispersed and inequality had begun to dissipate. There is a whole set of manners, which we have come to think of as part of our national identity, that depends upon this expectation that things will always get better: Our laissez-faire-ism; our can-do-ism; the optimistic cast of our religiosity, which persisted even when other Western nations turned toward atheism; our cult of the individual. We think of the darkening social turn that happened around 1972 as having something to do with the energies of the sixties collapsing in on themselves, but in Gordon’s description something more mechanistic was happening. “The second industrial revolution had run its course,” he says, and so, in many ways, had its social implications.

It is at about this point in his litany that Gordon’s face will achieve its fully elfin dimensions, and he will grin and say: “How do you like your smartphone now?”

Gordon has been getting e-mails from regular people who have learned something about his theory, and who have been trying to make sense of the consequences. He has a separate e-mail box where they have accumulated; he tries to reply to each one. The messages are more muted than you might think, more introspective. From a Cincinnati investment manager: “There is no way productivity growth in the future will achieve the rate of the sixties, right?” From an attorney: “I have reached comparably pessimistic conclusions from a less rigorous analysis.” From an activist in Rhode Island: “I strongly believe if we understand the end of growth, we can make provisions for the economy we actually have.” This is not a bad way of thinking of the cultural corrections that in retrospect we will probably categorize as Obama-ism: The renewed skepticism about capitalism, the urgency of the problem of inequality, the artisanal turn away from modernity, the rapid decline of American exceptionalism. We may be making provisions for the economy that we actually have.

Gordon’s recent work has been suffused with a sense of loss, of the end of things. In certain ways these have also become the themes of his life. He lives in Evanston in a grand house, built in 1889, the second one in from Lake Michigan. Gordon and his wife, a film scholar, bought the place fifteen years ago and restored it, including the stables, though they have no horses, and the extra rooms, though they have no children. Gordon comes from a famous family of economists; his parents, Harvard graduate students in the dismal science, met at a departmental event during the thirties, and ever since, the Gordons (the parents, Gordon himself, and his more radical younger brother, David, who died in 1996) have been tabulating the effects of this spectacular American century. Gordon’s own father had grown up not well-off in Baltimore, but once he started teaching at Berkeley, the family experienced its growing prominence and prosperity as a subset of the country’s own. Returning to the West Coast during college, Gordon would mark the progress of the last spokes of the great interstate-highway system, a new road laid down each vacation. “When I went to lunch together with my friends in grad school,” Gordon said, “I would draw the whole interstate-highway system. It was that incredible. I could number every road.”

One recent afternoon, I met Gordon at his house, and we drove to lunch through Northwestern’s main campus. Around Gordon and me—bicycling across the quad, wandering half-drunk into the streets—were the members of the first American generation who would be no more educated than their parents. “You look at the numbers, at how much more it costs now to get ahead—all the tutors, the college-prep courses, in some cases the private admissions consultants—and it is just astonishing,” Gordon said. What he was describing was a society where the general privilege of simply being American was once again losing out to the specific, inherited privilege of being born rich.

All of which moved Gordon to talk about the emotions that accompanied the beginning of the great boom. “Try to imagine what a contemporary person might feel,” Gordon said, referring to the twenties and thirties. Movies were getting unbelievably better—within fifteen years of the first talking motion picture, Al Jolson’s "The Jazz Singer," the studios had produced four of the ten best movies ever made, per the American Film Institute.

He kept talking about movies: The “We’re not in Kansas anymore” moment when "The Wizard of Oz" switches from black and white to “the paradise of full color.” The great three-year public frenzy about who would play Scarlett in "Gone With the Wind," maybe the first full incarnation of the modern celebrity machine, which ended when three studio executives arrived at a movie theater in the San Fernando Valley and replaced the ordinarily scheduled feature with the new print. “There was a pause, and the movie didn’t start. And then the public-address system came on and said, ‘The program—’ ” Gordon stopped. He was crying. “You see how choked up I get about this,” he said. He rubbed his eyes a bit and continued. “ ‘The program originally scheduled for tonight has been replaced with Gone With the Wind.’ And suddenly they’re going to be able to tell their children and their grandchildren. This stuff is just so powerful.”

In the book that Gordon is writing now, in which he details his theory, he breaks his narrative between the Old World and the New at 1940. That year is a convenient midpoint, because it more or less splits the difference between the beginning of the second industrial revolution and the present day. It also happens to be the year of Gordon’s birth. There is a certain degree of solipsism in Gordon, in the insistence that human existence has reached its peak during his lifetime, in his conviction that he can detect the trajectory of the future. But perhaps this is a corrective to the solipsism of our own optimism, to the convenient way that we forget our distant history and assume that something like this version of American progress, ever-escalating, is both inevitable and sustainable, to our certainty that the future must contain something better to come. Ω

[Benjamin Wallace-Wells is a contributing writer at New York Magazine. Previously, he was a contributing writer at The New York Times Magazine and at Rolling Stone. He has worked as a reporter on the Philadelphia Inquirer's Metro Desk. His writing has been published in the Boston Globe, The New Republic, and Policy Review. He was an editor of the Washington Monthly from 2003 to 2006. Wallace-Wells received a BA from Dartmouth College.]

Copyright © 2013 New York Media



