Saturday, May 08, 2010

A Prophet Without Honor...?

[x YouTube/TheBacmaster Channel]
"You've Got Your Troubles" (1965)
By The Fortunes

Christopher Lasch (1932-1994) was, in Alan Wolfe's words, "the great diagnostician of our rootlessness." Lasch and this blogger shared a pair of life experiences: we both were only children with no siblings and both of us failed our first attempt at our qualifying examinations for the doctorate. Of course, Lasch was enrolled in the doctoral program of the history department at Columbia University and this blogger was enrolled in a doctoral program at Not-Columbia (aka Texas Technique) in Lubbock, TX. Beyond that point, there is no comparison. Christopher Lasch became a major figure in the historical profession; this blogger spent his time at the furthermost margins of the profession. To paraphrase The Fortunes: Kit Lasch had his troubles and this blogger had... whatever. If this is a (fair & balanced) account of a real contributor to our historical literature, so be it.

[x TNR]
Jeremiah, American-Style
By Alan Wolfe


Review of Hope in a Scattering Time: A Life of Christopher Lasch
By Eric Miller (Eerdmans, 394 pp., $32)

In a moving tribute to Christopher Lasch written shortly after his death in 1994, Dale Vree, a Catholic convert and the editor of the New Oxford Review, wrote that “Calvinism was his true theological inspiration.” Lasch was certainly not one of the faithful. “Even before I took so rashly to writing about religion,” he once scribbled to himself, “it was an embarrassment to admit that I had none.” Yet despite his skepticism, the crucial idea associated with Calvinism since the sixteenth century—an insistence on the complete and utter depravity of the human race—fit Lasch’s increasingly dark vision of human purpose almost to perfection. “Calvinism (via Perry Miller) was my downfall,” he wrote to an inquiring Barbara Ehrenreich in an undated letter. “Or was it Luther’s commentary on Paul’s epistle to the Romans, taught to me by Sidney Ahlstrom? Some ancestral throwback to some distant German past? Or just orneriness and perversity? I kept it under wraps for years, but it was bound to come out in the end.”

If Lasch’s peculiar form of secular Calvinism was a throwback to his family’s past, it must have been a distant one. Lasch was born to Robert and Zora Schaupp Lasch in Omaha, Nebraska, in 1932. Although descended from Midwestern Lutherans on her father’s side, Zora “had not a spark of religious faith,” as she described herself in her unpublished autobiography. A feminist and a rebel, she was very much a product of the Roaring Twenties, Nebraska-style. After receiving her doctorate in philosophy from Bryn Mawr, she returned to the state university in Lincoln but was denied permission to teach ethics because of her commitment to naturalism. She included among her acquaintances John Dewey and Bertrand Russell. The only thing she had in common with the fundamentalist William Jennings Bryan was her home state.

Robert came from an even more agnostic background than his wife. His parents did not attend church, and although they enrolled their son briefly in Sunday school, before long, as Robert put it, “the pretense of piety was gratefully dropped.” He spent his life in the field of journalism and his politics in the service of the left. A Midwestern progressive committed to social justice at home and skeptical of the projection of American power abroad, he won the Pulitzer Prize in 1966 [to go with an earlier Rhodes Scholarship] for his antiwar editorials published in the St. Louis Post-Dispatch. Robert outlived Kit, as his son would always be known. Doing so enabled him to communicate his displeasure toward Kit’s preoccupation with religious themes. “You must be back from your vacation by now,” he wrote him in 1990, “the biggest question of it being: ‘What in God’s name were you doing at a conference on theology?’” Robert found it difficult enough to accept Kit’s increasing cultural conservatism. The son’s interest in religious questions left the father baffled.

Kit Lasch died at the age of sixty-two, and so we will never know whether he would have eventually gone all the way and joined a church; chances are he was too much of an iconoclast for that. It hardly matters, though, for when it came to the form and content of the jeremiad, the prophetic tradition of reminding us ferociously of our fallen state, there was no greater master during his lifetime. It seems appropriate that the first full-length biography of this major critic of American society and American culture should be written by a historian who teaches at a Christian college—Geneva, near Pittsburgh—named after Calvin’s adopted city. And Eric Miller’s fine book has been published by Eerdmans, a press founded by a Christian Reformed Church immigrant from Holland and located in Grand Rapids, Michigan, the epicenter of Calvinist America. Compared with an era that produced Reinhold Niebuhr and Paul Tillich, America in the past few decades, for all its God talk, has lacked major theologians capable of attracting a widespread readership. The Lord works in mysterious ways, and if religious truths can be discussed in the general culture and even make it into presidential speeches, one can hear these devout thinkers saying to themselves: does it really not matter that the person pronouncing them is without confessional commitments of his own?

Lasch was controversial when he lived, and he remains subject to strong and divided opinion long after his death. The fact that he combined a respect for Freud with a love of the Puritans is the least of it. He became a radical just as a new radicalism in America was about to be born, but he would always have an ambivalent relationship with the younger leftists who came after him. His populistic sensibilities became stronger over time, but so did his ties to the elite worlds of academia and opinion journalism. His appreciation for the common man was married to a disdain for mass culture. Lasch was a pioneer historian of the family, but feminists hated—that is not too strong a term—his nostalgia for the nuclear one. It is common to say, as Miller does in his biography, that Lasch simply did not fit into the traditional categories of left and right. But that is because Lasch really did not fit anywhere. He was a genuine American original.

So Lasch is owed his due and Miller is the right person to provide it. Although at times a bit heavy-handed in its insistence on the religious themes in Lasch’s work, Miller’s biography is as thorough as it is thoughtful. This is anything but a quickly written effort to explore the relationship between a thinker and his times. Miller has not only dug deeply, he has also pondered carefully. Lasch’s writings always fascinated me: I looked forward to each of his books, expecting to encounter a writer willing to discuss the most serious of themes in ways that resonated among large numbers of his fellow citizens—and a writer who would provoke me into flabbergasting disagreement in the process. I never met the man, but thanks to this book I now feel that I have. I could not be more grateful to Miller for facilitating the introduction.

Some people are born to be writers, and Lasch was one of them. His parents made their living with words, and had many connections to famous writers: before she married Robert, Zora had lived with Willa Cather’s sister and found the famous novelist “rather patronizing, certainly not conducive to any warm regard on my part.” Kit Lasch, too, lived in proximity with literature: for three years he roomed at Harvard with John Updike, another chronicler of the American obsessions with sex, social class, ambition, and faith. Lasch would never be as good a literary critic as he was a social critic. “I think his stuff lacks perception and doesn’t go very deep,” he reported to his parents about Updike. “He is primarily a humorist. As he himself admits, he is probably a hack. At least he has more of a hack in him than a profound artist.” It was by no means obvious that a child of the populist Midwest would choose the most prestigious private university in the country for his education, but Lasch was too precocious for any Big Ten school. His mentors at Harvard were quick to discern his talent: Donald Meyer, his tutor there, compared discovering Lasch among his students to Leo Durocher learning that Willie Mays would be his center fielder.

Lasch went directly to Columbia University, home to the best American history department at that time, to pursue his doctorate. Meyer’s role at Harvard was assumed at Columbia by William E. Leuchtenburg. But the real influence on Lasch would prove to be the man to whom he would so often be compared: Richard Hofstadter. For Lasch, Columbia was a tougher nut to crack: he failed his first attempt at his qualifying examinations for the doctorate. But his skills were recognized and before long he was on his way, teaching at Williams and completing with astonishing speed the dissertation that resulted in his first book, American Liberals and the Russian Revolution. Rejected by the University of Chicago Press, it was eventually published by Columbia in 1962. The press printed 1,500 copies, of which 807 had been sold by 1964. The only royalty check that Lasch received for the book was for $23.73. But Lasch followed three years later with The New Radicalism in America, the breakthrough book that established him, while still in his thirties, as one of the most promising historians in the country.

Both books inaugurated his lifelong argument with liberalism. The dissertation relied heavily on George F. Kennan and Reinhold Niebuhr to attack liberals for their naïve belief that men such as Lenin, Trotsky, and Stalin were somehow on the side of virtue; already in his early work one can see in Lasch the skepticism toward progress that would mark his later writings. The New Radicalism in America took a somewhat different tack, but sailed in the same direction. Here Lasch wrote brief biographical portraits of the leading intellectuals of the 1920s and 1930s, such as Mabel Dodge Luhan, Lincoln Steffens, and Randolph Bourne. He chose them, he said, because they put culture and not politics at the center of their concerns. While Lasch found much to admire in them, he also worried that in cutting themselves off from their country they downplayed the intellect in favor of the latest crusade. It was a work of history, but Lasch brought it up to his own era with scathing comments about such figures as Norman Mailer and the liberals around the Kennedy White House. “You will be interested to know,” Arthur Schlesinger Jr. wrote Richard Rovere, “that a smart aleck kid named Christopher Lasch (son of Bob Lasch of the St. Louis Post-Dispatch), exposes us both in a new book called The New Radicalism in America.” Unlike our age of Limbaugh, Lasch’s book appeared at a time when no one hated liberals more than leftists. Lasch, at that time, led the way.

As if to return to his Midwestern roots, Lasch’s first major teaching positions were at the University of Iowa and Northwestern. Eventually he settled at the University of Rochester, joining the faculty in 1970 and remaining there until his death. Miller, who is stronger on the intellectual trajectory of Lasch’s work than on the details of his personal life, explores in fascinating fashion his complex relationship with Eugene Genovese, who recruited Lasch to Rochester. Both men were the same age, shared (at the time) the same Marxist politics, and would eventually move to the right. But in his temperament Genovese strongly resembled the Russian Marxists about whom Lasch had warned in his first book. Steely and suspicious whatever his political views, Genovese persuaded Lasch that the two of them could build an outstanding history department at Rochester together. It never worked out. The idea of running a department democratically was, in Genovese’s view, laughable. Supported by an administration that wanted no trouble, Genovese not only drove away some of Rochester’s stars, such as Herbert Gutman, he also froze Lasch’s salary during the terrible inflationary years of the mid-1970s—and this in spite of Lasch’s increasing fame. Meanwhile Lasch recruited top students, many of whom would go on to write well-received books of their own, and simply waited out Genovese, who eventually moved to Georgia.

As Lasch tried, in the 1970s, to give structure to his reservations about liberalism, he assigned a prominent place to Freud as well as Marx. The latter provided an emphasis on the tensions between social classes that would always characterize Lasch’s writings, even when they took on their most conservative coloration. The former gave Lasch a reason for being sensitive to ulterior motives: if well-meaning liberals could become apologists for totalitarian tyranny, then the views of anyone could produce results quite different from those intended. It was all the rage to combine Marx and Freud during the 1960s and 1970s, and Lasch found himself reading philosophers from the Frankfurt School such as Adorno and Horkheimer, who were trying to do the same thing.

The insights that Lasch gleaned from both Marx and Freud helped to shape the arguments in the trilogy that would make him something of a household name: Haven in a Heartless World (1977), The Culture of Narcissism (1979), and The Minimal Self (1984). “I’ve written something that is guaranteed to bore and possibly offend almost every class of reader,” Lasch wrote to his friend Gerald Graff about Haven in a Heartless World. His editor at Knopf, Ashbel Green, agreed with the boredom part: Knopf rejected the manuscript even though a book on the family from Lasch had been under contract. Finally accepted by Basic Books, it proved to be something of a publishing sensation. Praised by George Gilder in National Review as “shrewd and brilliant,” the book infuriated New Leftists and especially feminists.

Lasch’s purpose in this book was not to blame women for the collapse of the traditional family; he was still enough of a Marxist to point his finger at capitalism and its apologists for America’s high divorce rates and unhappy children. But he also left no doubt that in his view feminism was, if not the cause, then at least the symptom of family decline. Social critics thrive because so much is wrong with society that one can always find targets aplenty. The case against Lasch was not that his targets were innocent, but that his indictments were selective. Those to the left of Lasch asked: why blame people whose primary commitments were to greater equality and autonomy? But for Lasch, calling attention to the role of the advocates of personal liberation was precisely the point. As gloomy as Lasch’s analysis was, he seemed to take special pleasure in going after those whose views were shaped by the 1960s, and the latter had no trouble detecting his animus.

With The Culture of Narcissism, also rejected by Knopf, Lasch broke out of debates within academic circles to make his mark on national politics. Time and Newsweek wrote about it. People featured Lasch along with Olivia Newton-John. Within four months of publication, 45,000 copies were sold. Not nearly as much a work of intellectual history as his previous books, this book explored such concerns as aging, schooling, advertising, managerialism, and even sports, the last of which received a chapter of its own. (Lasch was not a funny man, but this chapter included what I regard as his best witticism: his definition of sport as “the utmost concentration of purpose, on behalf of activities utterly useless.”) This was the perfect book for its time. It was morbidly clever, brilliantly on target, idiosyncratically compelling. It deserved its status as a best seller and lifted (for a time) the prospects for all serious nonfiction writers to reach larger audiences. Allan Bloom, among others, would follow where Lasch had led.

The most historically significant readers of The Culture of Narcissism proved to be Jody Powell and Patrick Caddell, who were working in the Carter White House. Much has been written about Jimmy Carter’s “malaise” speech and Lasch’s role in it (including a recent book by Kevin Mattson, “What the Heck Are You Up To, Mr. President?”), and Miller’s treatment of the episode is somewhat disappointing. He faithfully records Lasch’s unhappiness with the speech. Although he could hardly help noticing how much Carter had borrowed from him, Lasch felt that “the ideas were torn out of context and came across as moral denunciations of selfishness,” and he fired off a seven-page memo to Caddell outlining his concerns. (In 1980, Lasch would vote for the hopeless campaign of Barry Commoner.) Miller takes Lasch’s side in all this, but it is not difficult to imagine hard-boiled political pros coming to the conclusion that Lasch, having made his contribution, was too unreliable and needed to be dumped. That’s how politics works. A chronicler of fads should know when his time has passed. Anyway, intellectuals, even “public intellectuals,” do not do their finest work for politicians.

It was certainly a boon for the republic of letters that any ambitions Lasch may have had for direct political influence were quashed. Miller correctly points out that Lasch was at his best when he wrote as a member in good standing of the intellectual class directing his fire against people very much like himself. I am not sure how much Americans in general benefited from Lasch’s secular sermons, but those who toil in the vineyard of ideas unquestionably did. The years in which Lasch’s trilogy appeared were accompanied by significant intellectual posturing in America: an irresponsible and jargon-filled form of postmodernism began to dominate the humanities, leaving a romantic and reckless neoconservatism to take over the White House. Agree or disagree with Lasch, he sustained a fierce integrity and love of argument at a time when all too much thinking was becoming calcified and doctrinaire. (He appeared often in these pages in those years.) The further removed from politics Lasch was, the richer were his thoughts. “Neither the Right nor the Left addresses the overwhelming question of our time, the question of limits,” he wrote in the London Daily Telegraph in 1986, adding that “the problems confronting us are cultural and spiritual rather than political.” In those few words Lasch outlined the course his intellectual trajectory would take in the remaining years of his life.

The True and Only Heaven, Lasch’s longest and most serious book, appeared in 1991, and addressed head-on the question of limits. The title was lifted from Hawthorne, and signified the extent to which Lasch had chosen to write in the spirit of Puritan New England. The book defies summation: Lasch took on so many thinkers from so many eras—the Scotsmen Adam Smith and Thomas Carlyle, the founders of this magazine, Dr. Spock, the anti-busing activist Louise Day Hicks—as to leave readers gasping at the sheer chutzpah of his ambition. But underlying it all was a deep transformation in Lasch’s own thinking. Marx and Freud were out, populism and Calvinism were in. And if the former combination led to the Frankfurt School and Wilhelm Reich, the latter enabled Lasch to finally come to terms with Richard Hofstadter. “I’ve come to see Hofstadter as a latter-day version of H.L. Mencken endlessly belaboring the ‘booboisie,’” Lasch wrote to his former student (and now professor of history at Columbia) Casey Blake. It was a shrewd if cruel comment. Hofstadter was the quintessential New York intellectual, appreciative of modernity, skeptical of religion, in love with cosmopolis. Lasch, by now the anti-Hofstadter, turned all that on its head. For all their differences, populists and Calvinists were both suspicious of progress, which is why Lasch loved them both.

“It is almost impossible to read True and Only Heaven, especially at a fifteen-year remove from its publication,” Miller writes, “and not see in it a work of profound scholarship.” He is right. Those fifteen years have brought us, among other treasures, the blog; and while I appreciate the energy and the combativeness of today’s form of intellectual debate, turning back to Lasch’s magnum opus reminds one that progress in the way ideas are conveyed is just as much a mixed blessing as progress in the ideas that are conveyed. Who cannot love Lasch’s combination of learning, passion, stylistic flair, engagement with the largest of themes, and fearless hectoring? And at such length? I can think of no writer today who combines erudition with passion the way Lasch did.

In February 1992, thirteen months after The True and Only Heaven appeared, Lasch learned that he had cancer. Although he would continue to write (some books were published posthumously), his life’s work was done. Over a thirty-year span, he had published eight major books, any one of which would have constituted a significant achievement. It is fitting that he died much as he thought. “I despise the cowardly clinging to life, purely for the sake of life, that seems so deeply ingrained in the American temperament,” he told one of his doctors. Like Hume, who on philosophical grounds refused to be converted on his deathbed, Lasch was willing to apply his living insights to his dying days. The last word on his life should be left to his father. “You have a staggering record of accomplishment in a worthwhile calling” is how Robert said goodbye to his son. “You do not need a divinely fashioned grand design to justify your life.”

Eric Miller deals mostly with Lasch’s life. What about his legacy? On that question, we cannot let the religious thinkers who adopted him have the last word. Of Lasch’s verve and reach, there can be no doubt. Of his relevance to the years that followed, the matter remains open.

The issue that preoccupied his critics even when Lasch was in his prime was whether his anger led him to exaggerations that made his conclusions untrustworthy. The passage of time, I believe, validates those critics more than Lasch. Miller faithfully gives the criticisms. Mark Edmundson, for example, characterized The True and Only Heaven as “little more than a bitter screed against liberal yuppies, a highbrow version of what the popular press has been working at sporadically over the past decade.... The only passages that hum with any kind of life—even if it’s a rather repugnant life—are passages of denunciation in which Lasch chains himself up on the rock and derides the objects of his current rancor.” Such were the pyrotechnics that Lasch could provoke. Edmundson gives Lasch too little credit for his insights into intellectual history, but he is right about the tone. No matter what the subject, Lasch saw nothing but decline.

There are only a limited number of ways one can improve on Increase Mather, and by the end of his life Lasch had run out of them. Miller’s biography shows that Lasch never seemed to understand the effects of his own venom. It was not just that he felt misunderstood by Jimmy Carter. Lasch loved to attack, but he always seemed surprised that the objects of his attack fought back. In his own mind, he was the courageous teller of truths that no one wanted to hear; and so his critics must have been engaged in a prolonged effort at denial, at misreading him as a way to escape their responsibility for the damage that their own ideas had caused. “If you talk about the growing tensions that so often seem to characterize relations between men and women, it’s assumed that you want women to return to the kitchen,” Lasch complained to friends. But whether or not he wanted women confined to cooking, he clearly wanted to make it more difficult for them to divorce their husbands or to pay as much attention to their careers as to their children. Perhaps Lasch’s feminist critics sensed in Haven in a Heartless World the attitudes that Lasch would later express in a Harper’s forum. When asked how the Bill of Rights should be rewritten to account for the contemporary condition, he responded with a proposed article that included this: “No state shall pass laws authorizing divorce for any but the weightiest reasons. In the case of couples with children under the age of twenty-one, divorce is hereby forbidden.” Feminists, Lasch wrote to another friend, “willfully and maliciously distorted” his thinking. To me it seems that they understood his views pretty well.

From the vantage point of today, it is possible to detect an even more serious flaw in Lasch’s choice of targets. Lasch was engaged in an endless war against the leftism of the 1960s when the true enemy of American traditionalism would turn out to be the radical right—one of the persistent objects of Hofstadter’s attention—that came to power after his death. To be sure, Lasch was absolutely right in insisting upon the narcissistic strain in American culture. But what public figure today can possibly compete with Sarah Palin for the title of the most narcissistic personality of our times? (Lasch would no doubt respond by describing Palin’s prominence as an ironic consequence of feminism’s influence.) Compared to the catastrophic irresponsibility of Wall Street practices, or Republican nostalgia for the Confederacy, or the blatant unconstitutionality of much of the Bush and Cheney years, the harm done to the United States by left-wing elitists in the 1970s begins to pale. Lasch confused the counterculture with permanent revolution. The extremes that provoked him were quickly recognized, even by some of their protagonists, as overheated and dangerous, and the radicalism of the 1970s eventually receded or corrected itself. But the reactionary conservatism that we witness today, by contrast, will never learn to say it is sorry.

All of which leads to Lasch’s single most important miscalculation. From start to finish, his enemy was liberalism. What was implicit in his first books became explicit to the point of caricature in his later writings. The liberal order, he wrote in 1983, “should have collapsed a long time ago,” because it could not propose a “theory of the good society” and was therefore doomed to become “a politics of mass media.” Blaming liberals for Ronald Reagan was little different, or more plausible, than blaming feminists for family decline. Focused on the left in which he had grown up and come to political maturity, Lasch turned to populism in his final books just as conservatives discovered how to turn popular anger into a maelstrom of negativity. So it must be affirmed that against the ignorant conspiratorialism of today’s right, liberalism’s theory of the good society, however thin, looks downright robust. “His public invective in the mid-eighties against liberalism—which he certainly hoped was not resurgent—was as harsh as ever,” Miller points out. “He rarely gave either the political tradition or movement any sort of blessing.” Had Lasch been less misanthropic, he might have been more perceptive.

Miller identifies so strongly with his subject that he accepts Lasch’s judgment that he was crucified on a cross of misunderstanding. He finds Lasch’s critics smug and condescending. They treated him, Miller concludes, “as someone to keep at arm’s length—a quirky and dangerous fellow who at some point along the way had taken a frightening wrong turn.” The problem is that Lasch did take a wrong turn—and it was precisely the turn toward secular Calvinism that Miller praises. It is one thing to admire Jonathan Edwards and to lament the fact that we no longer take theology seriously—I myself love to assign Edwards to my students; but a historian, of all people, should have recognized that however valuable an appreciation of our depraved nature may have been in one era, it may cause significant injury in another. Yes, progressive history—the idea that each era is better than the one that precedes it—clearly fails to do justice to the sheer contingency of human life. But so does Lasch’s insistence that everything is for the worst in this worst of all possible worlds.

In the end even Miller, who is so admiring of Lasch, feels that something did not quite hold together in his worldview. “Lasch,” he concludes, needed “a more full, complete, rich tradition than ‘populism,’ one that extended well beyond the public political sphere to touch and form mind and heart.” But Lasch confounded his faithful admirers just as he did his leftist critics; he knew about the realm of God to which Miller alludes and did not want any part of it—at least for himself. The great diagnostician of our rootlessness never found his own roots. Lasch did not like modernity, but he was modernity’s child. We must be thankful that we live in a society sufficiently mobile, alienated, elitist, and skeptical to have produced such a man. Ω

[Alan Wolfe is a contributing editor at TNR and Professor of Political Science and Director of the Boisi Center for Religion and American Public Life at Boston College. Wolfe received a B.S. from Temple University and a Ph.D. in Political Science from the University of Pennsylvania. His most recent book is The Future of Liberalism (2009) and, at present, Wolfe is writing a book about political evil.]

Copyright © 2010 The New Republic

Copyright © 2010 Sapper's (Fair & Balanced) Rants & Raves

Welcome To The Anthropocene Age: We're Toast!

Global Warming is but one giant environmental problem; how about Ocean Acidification, the Nitrogen/Phosphorus Cycle, Land Use/Deforestation, Water Depletion, Species Extinction, Chemical Pollution, Aerosol Pollution, and good ol' Ozone Depletion? All of these destructive forces are working in combination at this very moment. Barry Commoner said it best in The Closing Circle (1971):

The Four Laws of Ecology

1) Everything is connected to everything else.

2) Everything must go somewhere.

3) Nature knows best.

4) There is no such thing as a free lunch.

Those words echoed in this blogger's mind upon reading Bradford Plumer's report on the state of our planet. As Pogo Possum said, "We have met the enemy and he is us." Denial is the principal weapon of Nature's enemies. If this is (fair & balanced) doom-and-gloom, so be it.

[x TNR]
Planet Doom: Nine Scenarios For Imminent Apocalypse—Only One Is Global Warming
By Bradford Plumer


For most of the 2.5 million years that humans and their predecessors have been around, the Earth has been a volatile place. Subtle shifts in the planet’s orbit have triggered large temperature swings; glaciers have marched across North America and Europe and then retreated. But, about 10,000 years ago, something unusual happened: The Earth’s climate settled into a relatively stable state, global temperatures started hovering within a narrow band, and sea levels stopped rising and falling so drastically. Historians argue that this fortuitous geological period, known as the Holocene, allowed civilization to develop. Modern humans no longer had to pluck whatever they could from an unreliable environment and move on. The relatively predictable climate patterns allowed them to colonize coastal regions, clear forests for agriculture, and raise livestock. “Once you get stability, you can start planning for the future,” explains Sander van der Leeuw, who directs the School of Human Evolution and Social Change at Arizona State University. “You’re no longer acting on the moment but acting with the future in mind. And that’s what our whole society is based on—being able to invest in the future.”

Because of quirks in the Earth’s orbit, the planet would likely remain in this stable state for at least the next 10,000 years if left to its own devices. But the planet isn’t being left to its own devices. Humans are now drastically altering the natural world in so many ways—mowing down forests, depleting freshwater supplies, fiddling with the planet’s thermostat—that some experts have recently begun arguing that we’ve left the Holocene and moved into an entirely new geological epoch of our own making, dubbed the Anthropocene.

Global warming is the big change that tends to get all the attention. By now, it’s well-established that humans are adding heat-trapping gases to the atmosphere at an unprecedented rate. Climatologists have assembled ample evidence that we run the risk of catastrophe if we add too much carbon dioxide to the air. Yet it may be too narrow to focus on climate change in isolation. In recent years, some scientists have begun casting a wary eye at all the other ways we’re upending the Earth’s natural state, from disrupting the planet’s nitrogen cycle to using up resources to dousing our rivers and oceans with new synthetic chemicals. Those researchers are starting to ask whether these trends, too, could have their own tipping points—whether, as with climate, there are boundaries over which we would be incredibly foolish to cross. And what they have found suggests that, by pushing the Earth outside of the state that has persisted for the last 10,000 years, we risk squandering the stability that made civilization possible in the first place.

The idea that there are “tipping points” in nature has been discussed by scientists since the 1970s. An ecosystem can change slowly and gradually over many years until, suddenly, it reaches a threshold, at which point rapid and potentially irreversible shifts ensue.

One much-studied example is the Sahara Desert. About 6,000 years ago, the vast expanse of the Sahara was filled with lakes, wetlands, and a lush variety of species—ancient cave paintings in the region depict crocodiles. Then, some 5,500 years ago, the region became a massive desert over the relatively short course of about 200 years. “That sort of rapid change just doesn’t make sense,” notes Jonathan Foley, director of the University of Minnesota’s Institute on the Environment. “Unless, that is, you allow for non-linear systems in nature.” One hypothesis is that a slight change in Earth’s orbit led to a gradual decline in solar energy, which meant fewer monsoons hitting the region. The tipping point then came when a prolonged drought killed off the local vegetation that helped supply the area with moisture. At that point, a feedback loop kicked in, and the Sahara started drying of its own accord—more plants died off, which, in turn, made the region even more parched—until the whole area desertified in remarkably short order.

Over the last decade, a number of climate scientists have looked for tipping points that could be triggered by global warming. It’s worrisome enough that a warming world would produce steadily rising sea levels, droughts, and heat waves. But what keeps many researchers up at night is the idea of rapid, non-linear changes that utterly transform the planet. To take one example, as Arctic sea ice begins melting, the dark water underneath starts absorbing a greater share of sunlight, which, in turn, causes the ice to melt even faster. Models suggest that this feedback could reach a point where the sea ice begins melting at an incredibly fast pace and essentially collapses—indeed, some Arctic researchers think we may have already crossed this threshold. Other possible scenarios include the disintegration of the ice sheet on Greenland, rapid die-off of rain forests in the Amazon, or, even more extreme, a shutdown of the ocean currents in the Atlantic that maintain Europe’s livable climate. (Fortunately, that last one, while feasible, doesn’t appear likely anytime soon.)
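The feedback dynamics described above can be sketched with a toy model. The code below is a minimal illustration of a "tipping point," not a real climate model: a bistable system whose state shifts abruptly once a slowly increasing forcing crosses a threshold. All names and numbers here are illustrative assumptions.

```python
# A minimal sketch of a tipping point: a toy bistable system
#   dx/dt = -x**3 + x + f
# where f is a slowly increasing external forcing (e.g., warming).
# For small f the system sits in a stable "icy" state; past a critical
# forcing that state vanishes and the system jumps to the other branch.

def equilibrate(x, f, steps=10000, dt=0.01):
    """Let the system relax toward a stable equilibrium for fixed forcing f."""
    for _ in range(steps):
        x += dt * (-x**3 + x + f)
    return x

state = -1.0                                 # start in the "icy" stable state
forcings = [i * 0.01 for i in range(61)]     # sweep f slowly from 0.0 to 0.6
trajectory = []
for f in forcings:
    state = equilibrate(state, f)
    trajectory.append((f, state))

# Find the largest single jump between consecutive equilibria.
jumps = [(f, abs(b - a)) for (_, a), (f, b) in zip(trajectory, trajectory[1:])]
f_tip, size = max(jumps, key=lambda t: t[1])
print(f"abrupt shift near f = {f_tip:.2f}, jump size = {size:.2f}")
```

Note that the forcing rises smoothly throughout the sweep; the state nonetheless changes gradually until the fold in the system's equilibria is reached, then jumps discontinuously, which is the non-linear behavior the researchers worry about.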

Scientists continue to argue—sometimes fiercely—over the details and precise timing of these dramatic changes. But the prospect that global warming could produce non-linear disruptions has, understandably, helped inform many climate-policy goals. After all, it would be one thing for human civilization to adapt to sea levels that rose smoothly by, say, one centimeter a year. It would be another to deal with fast and unpredictable rises. That’s why the United Nations has recommended limiting warming to no more than 2°C above pre-industrial levels—above that point, we risk large-scale changes beyond our control. But it’s also true that scientists can’t be certain of where that threshold lies. James Hansen of NASA has famously argued that even the U.N. targets are too rash, and that we should limit atmospheric carbon dioxide to below 350 parts per million (a point we’ve already passed), based on his study of how quickly ice sheets reacted in prehistoric periods when the Earth’s temperature was just 1°C to 2°C warmer than today.

Those discussions, meanwhile, have led some scientists to wonder if there might be other tipping-point thresholds lurking in nature. In 2008, Johan Rockström, the director of the Stockholm Resilience Centre in Sweden, brought together two dozen experts across a variety of disciplines to see if they could identify what they called “the rules of the game.” What are the natural boundaries that humanity should stay within if we want to keep the Earth in the stable state of the last 10,000 years?

In a paper published last fall in the prestigious journal Nature, Rockström and his co-authors created, for the first time, a checklist of all the possible ways the planet is hurtling toward a potentially perilous new course. They identified nine boundaries concerning matters such as climate change, ocean acidification, extinction rates, land and water use, ozone depletion, and the disruption of the Earth’s nitrogen and phosphorus cycles. But what was most astounding was just how little we still know about the natural experiment our civilization is unwittingly conducting.

Take, for instance, ocean acidification. Ocean chemists have long known that, as carbon-dioxide concentrations in the atmosphere rise, more and more of the gas gets absorbed by the ocean and lowers its pH level—in essence, making the waters more acidic. But, up until ten years ago, no one had seriously modeled this effect, and it was assumed to be negligible. “It hadn’t been on anyone’s radar screens,” explains Ellen Thomas, a Yale paleoceanographer. “The thinking had been, ‘Well, the oceans are so huge, what can a little extra CO2 in the air really do?’” Then, in 1999, a climate modeler at Stanford named Ken Caldeira was asked by the Energy Department to study the effects of capturing carbon dioxide from smokestacks and injecting it deep into the sea as a way of cutting greenhouse-gas emissions. Caldeira studied the change in ocean pH and then, just for reference, compared it to what would happen if man-made CO2 levels kept rising. The latter proved shocking. Caldeira discovered that the oceans were already on pace to become more acidic than at any time in the past 300 million years. He published his results in 2003, coining the term “ocean acidification.”

In the years since, scientists have realized that ocean acidification is, in fact, a huge deal—a problem as worrisome as many of the worst effects of global warming. The lower pH in the oceans will prevent marine organisms from building calcium-carbonate shells. That, in turn, has the potential to weaken or even wipe out many coral reef systems—and the millions of species that depend on them. And that could prove devastating. In Asia alone, some one billion people rely on reef fisheries for food or income. Local economies that rely on oysters, clams, or scallops could go bust. Areas like Florida that depend on reefs as a bulwark against storms would find themselves more vulnerable to hurricane damage. And, while marine scientists still can’t pin down the exact point at which a global calamity would ensue, lessons from the past are unnerving. During the last major wave of ocean acidification, 55 million years ago, a vast array of deep-sea species were wiped out—which, in turn, upended marine ecosystems around the world. Given how heavily we rely on the ocean’s current food web, that’s a prospect we can’t dismiss lightly.

Rockström’s group also identified a number of potential tipping points related to modern-day industrial agriculture. With nearly seven billion people on the planet and a growing demand for meat, farmers are razing forests and diverting freshwater for irrigation at a stunning pace. It’s not hard to see that we could soon reach the point where irreversible degradation becomes likely in many areas. The Aral Sea in Central Asia, once the world’s fourth-largest lake, has already become so drained that it is now largely desert, causing the area’s once-robust fishing economy to implode and leaving trawlers marooned on the sands. And Foley points out that, on some islands in Indonesia, destruction of rain forests has reached the point where local weather patterns have been altered, drying out the area and, in turn, making the remaining trees more susceptible to forest fires. In Borneo, the black smoke has grown so thick that it has interfered with air and sea traffic and is causing widespread respiratory illnesses.

Another unsustainable trend the group pinpointed involved our reliance on synthetic fertilizer. Ever since the early twentieth century, when German chemist Fritz Haber devised a method for capturing nitrogen from the air to make ammonia gas, humans have depended on artificial nitrogen fertilizer to boost crop productivity. It was a world-changing invention, enabling the planet to keep feeding itself even as the population ballooned. But the practice has also inflicted heavy damage on the water and soil we depend on. The vast bulk of the nitrogen and phosphorus used in artificial fertilizers makes its way into the rivers and oceans. In areas like the Gulf of Mexico and the Baltic Sea, the excess nitrogen has overstimulated the growth of algae, which chokes off the water’s oxygen and kills most other organisms in the vicinity. These vast floating “dead zones” are popping up around the world—areas the size of New Jersey where fish can’t survive. Beyond that, excess nitrogen has been found to deplete organic carbon in the soil and decimate plant species in certain areas. In essence, we have overwhelmed the Earth’s nutrient cycle—Rockström’s group suggested that global nitrogen use was more than four times the “safe” threshold.

Granted, some experts find the idea of a single boundary too simplistic. Stuart Pimm, a professor of conservation ecology at Duke, points to the paper’s discussion of biodiversity as an example. Throughout history, the normal “background” rate for extinction has been about one in a million species per year. In the modern era, thanks to activities like deforestation and overfishing, an average of about 100 species per million die off each year—a staggering rate that has the potential to collapse ecosystems. In response, Rockström’s group proposed a threshold limit of no more than ten species per million per year. “But that number was completely arbitrary,” Pimm says. What’s more, the idea of a single worldwide limit on extinction can obscure some important nuances. Since the 1960s, scientists have studied “keystone” species that prop up an entire ecosystem—if, for instance, starfish are removed from a bay, the local mussel population can explode and drive out other species. “Some ecosystems can lose a number of species and be OK,” Pimm argues. “Others can lose just a few species and they’re manifestly not OK.”
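The extinction figures in the paragraph above reduce to simple arithmetic, which makes the scale of the problem vivid:

```python
# The article's figures, all in extinctions per million species per year:
# background rate ~1, modern rate ~100, and the Rockström group's
# proposed boundary of 10.

background, modern, boundary = 1, 100, 10

print(f"modern rate is {modern / background:.0f}x the background rate")
print(f"and {modern / boundary:.0f}x the proposed planetary boundary")
```

Even if Pimm is right that the boundary of ten is arbitrary, the current rate overshoots it by an order of magnitude, and the background rate by two.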

Still, the specific numerical thresholds proposed by Rockström’s group were, in some sense, a side issue. Experts will continue to haggle over whether, say, 350 ppm or 450 ppm is a “safe” limit for carbon dioxide in the air. But what was more notable—indeed, eerie—about the Nature paper was its emphasis on how little we can predict about the many dramatic changes underway. Consider the mounting evidence that aerosol pollution—such as sulfates—can alter local rainfall patterns. But, according to Kevin Noone, a colleague of Rockström’s at the Stockholm Resilience Centre, there simply aren’t enough data yet to be able to settle on a “safe” level of aerosol pollution that would avoid large-scale disruption of, say, the Asian monsoon cycle. Or take chemical pollution. Currently, there are more than 80,000 synthetic chemicals on the global market, and we know that many of them can harm human health or disturb the reproductive cycles of certain species (in much the same way DDT caused eggshell thinning among birds of prey). But no one knows whether there’s a point at which some combination of chemical pollutants could throw large-scale ecosystems out of whack. “There’s evidence out there that these artificial chemicals can have adverse effects, sure, but no one’s really looked at them yet from the perspective of global thresholds,” says Will Steffen, a climatologist at the Australian National University who contributed to the Nature study. “That was one where we couldn’t even guess a boundary.”

Scarier still, many of these different processes can interact in unforeseen ways. The heavy use of synthetic nitrogen fertilizers can cause soil to release nitrous oxide, a greenhouse gas. Overfishing can make coral reefs less resilient in the face of stresses like ocean acidification and warmer temperatures. Global warming can speed up the rate of extinction. “Once you have all of these dynamic processes in so many dimensions interacting with each other, we no longer have the capability to anticipate what will happen,” says van der Leeuw. “That then raises the possibility that these processes will start affecting each other in different ways that can quickly have cascade effects we can’t see right now.”

And that creates a dilemma: Right now, too many environmental problems are studied in isolation. “Ever since Newton, the tendency of science has been to slice the pie up smaller and smaller,” says Noone. As an illustrative example, he notes that the scientific literature now boasts a Journal of Chemical Physics and a Journal of Physical Chemistry, as well as the journal Physical Chemistry Chemical Physics.

Yet the research into planetary boundaries suggests that it may be impossible to deal with just one or two environmental issues at a time. At the moment, global warming gets the bulk of attention. But treating it as the only problem could lead to misguided solutions. For instance, in recent years, some researchers have broached the idea of geoengineering as a cure for climate change—one idea is that we would reflect some of the Earth’s sunlight in order to cool the planet down. If rising temperatures were our sole concern, that might not be a bad idea. But, of course, that solution would do nothing about the carbon-dioxide emissions that are acidifying the world’s oceans, or the frenetic pace of deforestation that could alter the Earth’s landscape irreparably.

The Nature study, with its many visions of destruction, is enough to cause one to crawl into a cave and die. But the notion of planetary boundaries is also very different from past doomsaying. Inherent in the concept, in fact, is cause for optimism. Back in the 1970s, the Club of Rome think tank commissioned an influential book titled The Limits to Growth, which predicted that the combination of a rapidly swelling world population and finite natural resources would put severe limits on economic growth. As it turned out, many of the book’s ominous predictions were mistaken—the authors had underestimated the ability of both markets and technology to overcome resource constraints.

Similarly, an important lesson from recent history proves that it’s possible for humanity to stay safely within the Earth’s natural boundaries and still thrive. One threshold that Rockström’s group identified involved the stratospheric ozone layer, which shields the planet from harmful ultraviolet (UV) rays. As it turns out, this is the one known planetary boundary that we’re no longer in danger of crossing—but the story of how that came to pass is illustrative.

In the 1970s, scientists were puzzling over the fact that readings of ozone concentration in the Antarctic region were much lower than anyone had expected. (The readings were so odd that, for nearly a decade, atmospheric scientists dismissed the data as due to instrument error.) At around the same time, scientists were discovering that concentrations of chlorofluorocarbons (CFCs)—chemicals used in refrigerators and air conditioners—were lingering in the atmosphere at high rates. In 1974, chemists Mario Molina and Sherwood Rowland published a paper arguing that the two phenomena may be linked, although it was a difficult hypothesis to prove. DuPont, the world’s biggest manufacturer of CFCs, disputed the connection, and it wasn’t until the mid-’80s that hard evidence confirming the link came in—including the discovery of a gaping ozone hole over Antarctica.

Looking back, it’s remarkable to ponder the serendipity of these discoveries—and how little margin for error we had. As luck would have it, DuPont had been using chlorine instead of bromine to produce CFCs. As far as anyone could tell, the two elements were interchangeable. But, as another prescient ozone researcher, Paul Crutzen, later noted, bromine is 45 times as effective at destroying ozone as chlorine. Had DuPont chosen to use bromine, the ozone hole could well have spanned the globe by the 1970s instead of being largely confined to Antarctica—long before anyone had a glimmering of the problem. It’s not hard to see what massive worldwide ozone depletion would’ve meant. Punta Arenas, the southernmost town of Chile, sits under the Antarctic ozone hole, and skin cancer rates there have soared by 66 percent since 1994. If humans had destroyed stratospheric ozone across the globe, we would likely be unable to set foot outdoors without layers of sunscreen and dark shades to prevent eye damage. Worse, the excess UV rays could have killed off many of the single-celled organisms that form the basis for the ocean’s food chain and disrupted global agriculture (studies show that bean and pea crop yields decline about 1 percent for every percent increase in UV exposure).
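The crop-yield figure cited above can be turned into rough arithmetic. In this sketch, the 30 percent UV increase is a hypothetical input, and compounding the losses is an assumption of the sketch, not a claim from the studies the article cites.

```python
# Rough arithmetic on the article's figure: bean and pea yields fall
# about 1 percent for each percent increase in UV exposure.

def yield_after_uv_increase(uv_increase_pct, loss_per_pct=0.01):
    """Remaining yield fraction, compounding a 1% loss per 1% extra UV."""
    return (1 - loss_per_pct) ** uv_increase_pct

remaining = yield_after_uv_increase(30)  # hypothetical 30% UV increase
print(f"yield falls to about {remaining:.0%} of normal")
```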

Happily, though, scientists did discover the ozone hole. And, despite industry warnings that abolishing CFCs would impose unbearable costs, world leaders agreed to phase out the chemicals in 1987, and economic ruin never arrived. DuPont developed a substitute for CFCs, and ozone levels in the atmosphere have stabilized, with the hole over Antarctica expected to heal by 2050. A topic that once graced the cover of Time and generated heated congressional debates now barely gets mentioned. We learned to stay within one planetary boundary without impeding human prosperity. That should give us every reason to think we can respect the others we are now barreling past. Ω

[Bradford Plumer is an assistant editor of The New Republic, where he reports on energy and environmental issues for the magazine.]


Copyright © 2010 Sapper's (Fair & Balanced) Rants & Raves