Monday, March 08, 2004

How Will It All End?

The question of the end time is a tough one. Big Chill or Big Crunch or Big Boom or Big Something Else? Jim Holt likes Big Questions. If this is (fair & balanced) existentialism, so be it.



[x Slate]
How Will the Universe End?
by Jim Holt



Alvy Singer's Terrible Fear


One of my favorite moments in Woody Allen's film Annie Hall is when Alvy Singer (Allen's alter ego) is shown having an existential crisis as a little boy. His mother summons a psychiatrist, one Dr. Flicker, to find out what's wrong.

"Why are you depressed, Alvy?" Dr. Flicker asks.

"The universe is expanding," Alvy says. "The universe is everything, and if it's expanding, some day it will break apart and that will be the end of everything."

"Why is that your business?" interrupts his mother. Turning to the psychiatrist, she announces, "He's stopped doing his homework!"

"What's the point?" Alvy says.

"What has the universe got to do with it!" his mother shouts. "You're here in Brooklyn! Brooklyn is not expanding!"

Dr. Flicker jumps in: "It won't be expanding for billions of years, Alvy, and we've got to enjoy ourselves while we're here, eh? Ha ha ha." (Cut to a view of the Singer house, which happens to be under the Coney Island roller coaster.)

I used to take Dr. Flicker's side in this matter. How silly to despond about the end of everything! After all, the cosmos was born only around 13 billion years ago, when the Big Bang happened, and parts of it will remain hospitable to our descendants for a good hundred billion years, even as the whole thing continues to spread out.

A half-dozen years ago, however, astronomers peering through their telescopes began to notice something rather alarming. The expansion of the universe, their observations indicated, was not proceeding at the stately, ever-slowing pace that Einstein's equations had predicted. Instead, it was speeding up. Some "dark energy" was evidently pushing against gravity, sending galaxies hurtling away from one another at a runaway rate. New measurements earlier this year confirmed this strange finding. Last July 22, the New York Times ran an ominous headline: "ASTRONOMERS REPORT EVIDENCE OF 'DARK ENERGY' SPLITTING THE UNIVERSE." David Letterman found this so disturbing that he mentioned it several consecutive nights in his Late Show monologue, wondering why the Times buried the story on Page A-13.

Until recently, the ultimate destiny of the universe looked a little more hopeful—or remote. Back around the middle of the last century, cosmologists figured out that there were two possible fates for the universe. Either it would continue to expand forever, getting very cold and very dark as the stars winked out one by one, the black holes evaporated, and all material structures disintegrated into an increasingly dilute sea of elementary particles: the Big Chill. Or it would eventually stop expanding and collapse back upon itself in a fiery, all-annihilating implosion: the Big Crunch.

Which of these two scenarios would come to pass depended on one crucial thing: how much stuff there was in the universe. So, at least, said Einstein's theory of general relativity. Stuff—matter and energy—creates gravity. And, as every undergraduate physics major will tell you, gravity sucks. It tends to draw things together. With enough stuff, and hence enough gravity, the expansion of the universe would eventually be arrested and reversed. With too little stuff, the gravity would merely slow the expansion, which would go on forever. So, to determine how the universe would ultimately expire, cosmologists thought that all they had to do was to weigh it. And preliminary estimates—taking account of the visible galaxies, the so-called "dark matter," and even the possible mass of the little neutrinos that swarm through it all—suggested that the universe had only enough weight to slow the expansion, not to turn it around.

Now, as cosmic fates go, the Big Chill might not seem a whole lot better than the Big Crunch. In the first, the temperature goes to absolute zero; in the second, it goes to infinity. Extinction by fire or by ice—what's to choose? Yet a few imaginative scientists, haunted, like Woody Allen, by visions of the end of the universe, came up with formulations of how our distant descendants might manage to go on enjoying life forever, despite these unpleasant conditions. In the Big Chill scenario, they could have an infinity of slower and slower experiences, with lots of sleep in between. In the Big Crunch scenario, they could have an infinity of faster and faster experiences in the run-up to the final implosion. Either way, the progress of civilization would be unlimited. No cause for existential gloom.

So, Letterman had reason to be upset by the dark energy news. It spells inescapable doom for intelligent life in the far, far future. No matter where you are located, the rest of the universe will eventually be receding from you at the speed of light, slipping forever beyond the horizon of knowability. Meanwhile, the shrinking region of space still accessible to you will fill up with a kind of insidious radiation that will eventually choke off information processing—and with it, the very possibility of thought. We seem to be headed not for a Big Crunch or a Big Chill but something far nastier: a Big Crackup. "All our knowledge, civilization and culture are destined to be forgotten," one prominent cosmologist has declared to the press. It looks as if little Alvy Singer was right after all. The universe is going to "break apart," and that will indeed mean the end of everything—even Brooklyn.

Hearing this news made me think of the inscription that someone once said should be on all churches: important if true. Applied to cosmology—the study of the universe as a whole—that is a big "if." Cosmic speculations that make it into the newspapers should often be taken with a pinch of salt. A few years ago, some astronomers from Johns Hopkins made headlines by announcing that the cosmos was turquoise; two months later they made headlines again by announcing that, no, it was actually beige. This may be a frivolous example, but even in graver matters—like the fate of the universe—cosmologists tend to reverse themselves every decade or so. As one of them once told me, cosmology is not really a science at all since you can't do experiments with the universe. It's more like a detective story. Even the term that is sometimes applied to theorizing about the end of the universe, "eschatology" (from the Greek word for "furthest") is borrowed from theology.

Before I was going to start worrying about the extinction of absolutely everything in some inconceivably distant epoch, I thought it would be a good idea to talk to a few leading cosmologists. Just how certain were they that the cosmos was undergoing a disastrous runaway expansion? Was intelligent life really doomed to perish as a result? How could they, as scientists, talk about the ultimate future of "civilization" and "consciousness" with a straight face?

It seemed natural to start with Freeman Dyson, an English-born physicist who has been at the Institute for Advanced Study in Princeton since the 1940s. Dyson is one of the founding fathers of cosmic eschatology, which he concedes is a "faintly disreputable" subject. He is also a fierce optimist about the far future, one who envisions "a universe growing without limit in richness and complexity, a universe of life surviving forever and making itself known to its neighbors across unimaginable gulfs of space and time." In 1979, he wrote a paper called "Time Without End," in which he used the laws of physics to show how humanity could flourish eternally in a slowly expanding universe, even as the stars died and the chill became absolute. The trick is to match your metabolism to the falling temperature, thinking your thoughts ever more slowly and hibernating for longer and longer periods while extraneous information is dumped into the void as waste heat. In this way, Dyson calculated, a complex society could go on perpetually with a finite energy reserve, one equivalent to a mere eight hours of sunlight.
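Dyson's claim rests on a convergent series: if each successive period of waking thought consumes only a fixed fraction of the energy the previous one did, then infinitely many periods can draw on a finite reserve. Here is a minimal sketch of that bookkeeping—a toy model with illustrative numbers, not Dyson's actual 1979 calculation:

```python
# Toy model of Dyson's "Time Without End" energy budget: each thought
# cycle uses a fraction r < 1 of the energy of the previous cycle, so
# the total spent is a geometric series that converges to E0 / (1 - r).

def total_energy(first_cycle_energy, ratio, cycles):
    """Energy spent after `cycles` thought cycles with geometric slowdown."""
    return sum(first_cycle_energy * ratio**n for n in range(cycles))

E0, r = 1.0, 0.5  # illustrative values only
for n in (10, 100, 1000):
    # Total approaches E0 / (1 - r) = 2.0 but never exceeds it,
    # no matter how many cycles elapse.
    print(n, total_energy(E0, r, n))
```

However many cycles you run, the cumulative energy stays below the finite limit E0 / (1 - r)—which is the whole trick: an unbounded number of (ever slower) thoughts on a bounded energy reserve.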

The day I went to see Dyson, it was raining in Princeton. It took me a half-hour to walk from the train station to the Institute for Advanced Study, which sits by a pond in 500 acres of woods. The institute is a serene, otherworldly place. There are no students to distract the eminent scientists and scholars in residence from pursuing their intellectual fancies. Dyson's office is in the same building where Einstein spent the last decades of his career fruitlessly searching for a unified theory of physics. An elfin, courtly man with deep-set eyes and a hawklike nose, Dyson frequently lapsed into silence or emitted snuffles of amusement. I started by asking him whether the evidence that the universe was caught up in an accelerated expansion had blighted his hopes for the future of civilization.

"Not necessarily," he said. "It's a completely open question whether this acceleration will continue forever or whether it will peter out after a while. There are several theories of what kind of cosmic field might be causing it and no observations to determine which of them is right. If it's caused by the so-called 'dark energy' of empty space, then the expansion will keep speeding up forever, which is bad news as far as life is concerned. But if it's caused by some other kind of force field—which, out of ignorance, we label 'quintessence'—then the expansion might well slow down as we go into the future. Some quintessence theories even say that the universe will eventually stop expanding altogether and collapse. Of course, that, too, would be unfortunate for civilization since nothing would survive the Big Crunch."

Well, then, I said, let's stick with the optimistic scenario. Suppose the acceleration does turn out to be temporary and the future universe settles into a nice cruise-control expansion. What could our descendants possibly look like a trillion trillion trillion years from now, when the stars have disappeared and the universe is dark and freezing and so diffuse that it's practically empty? What will they be made of?

"The most plausible answer," Dyson said, "is that conscious life will take the form of interstellar dust clouds." He was alluding to the kind of inorganic life forms imagined by the late astronomer Sir Fred Hoyle in his 1957 science fiction novel, The Black Cloud. "An ever-expanding network of charged dust particles, communicating by electromagnetic forces, has all the complexity necessary for thinking an infinite number of novel thoughts."

How, I objected, can we really imagine such a wispy thing, spread out over billions of light-years of space, being conscious?

"Well," he said, "how do you imagine a couple of kilograms of protoplasm in someone's skull being conscious? We have no idea how that works either."

Practically next door to Dyson at the institute is the office of Ed Witten, a gangly, 50-ish fellow who is widely regarded as the smartest physicist of his generation, if not the living incarnation of Einstein. Witten is one of the prime movers behind superstring theory, which, if its hairy math is ever sorted out, may well furnish the Theory of Everything that physicists have long been after. He has an unnerving ability to shuffle complicated equations in his head without ever writing anything down, and he speaks in a hushed, soft voice. Earlier this year, Witten was quoted in the press calling the discovery of the runaway expansion of the universe "an extremely uncomfortable result." Why, I wondered, did he see it that way? Was it simply inconvenient for theoretical reasons? Or did he worry about its implications for the destiny of the cosmos? When I asked him, he agonized for a moment before responding, "Both."

Yet Witten, too, thought there was a good chance that the runaway expansion would be only temporary, as some of the quintessence theories predicted, rather than permanent, as the dark-energy hypothesis implied. "The quintessence theories are nicer, and I hope they're right," he told me. If the acceleration does indeed relax to zero, and the Big Crackup is averted, could civilization go on forever? Witten was unsure. One cause for concern was the possibility that protons will eventually decay, resulting in the dissolution of all matter within another, oh, 10^33 years or so. Freeman Dyson had scoffed at this when I talked with him, pointing out that no one had ever observed a proton decaying, but he insisted that intelligent beings could persist even if atoms fell to pieces, by re-embodying themselves in "plasma clouds"—swarms of electrons and positrons. I mentioned this to Witten. "Did Dyson really say that?" he exclaimed. "Good. Because I think protons probably do decay."



Star Trek and the Lifeboat Scenario


Back at the Princeton railroad station after visiting Ed Witten and Freeman Dyson, waiting for the train to New York and munching on a vile "veggie" sandwich that I had picked up at the convenience store across the parking lot, I pondered proton decay and Dyson's scenario for eternal life. How would his sentient Black Clouds, be they made up of cosmic dust or of electron-positron plasma, while away the eons in an utterly freezing and dark universe? What passions would engross their infinite number of ever-slowing thoughts? After all (as Alvy Singer's alter ego once observed), eternity is a long time, especially toward the end. Maybe they would play games of cosmic chess, in which each move took trillions of years. But even at that rate they would run through every possible game of chess in a mere 10^(10^70) years—long before the final decay of the burnt-out cinders of the stars. What then? Would they come around to George Bernard Shaw's conclusion (reached by him at the age of 92) that the prospect of personal immortality was an "unimaginable horror"? Or would they feel that, subjectively at least, time was passing quickly enough? After all, as Fran Lebowitz pointed out, once you've reached the age of 50, Christmas seems to come every three months.

It was almost with a sense of relief that I spoke to Lawrence Krauss a few days later. Krauss, a boyish fellow in his late 40s who teaches at Case Western Reserve in Cleveland, is one of the physicists who guessed on purely theoretical grounds, even before the astronomical data came in, that the cosmos might be undergoing a runaway expansion. "We appear to be living in the worst of all possible universes," Krauss told me, clearly relishing the note of anti-Leibnizian pessimism he struck. "If the runaway expansion keeps going, our knowledge will actually decrease as time passes. The rest of the universe will be literally disappearing before our very eyes surprisingly soon—in the next ten or twenty billion years. And life is doomed—even Freeman Dyson accepts that. But the good news is that we can't prove we're living in the worst of all possible universes. No finite set of data will ever enable us to predict the fate of the cosmos with certainty. And, in fact, that doesn't really matter. Because, unlike Freeman, I think that we're doomed even if the runaway phase turns out to be only temporary."

What about Dyson's vision of a civilization of sentient dust clouds living forever in an expanding universe, entertaining an infinite number of thoughts on a finite store of energy? "It turns out, basically for mathematical reasons, that there's no way you can have an infinite number of thoughts unless you do a lot of hibernating," Krauss said. "You sleep for longer and longer periods, waking up for short intervals to think—sort of like an old physicist. But what's going to wake you up? I have a teenage daughter, and I know that if I didn't wake her up, she'd sleep forever. The Black Cloud would need an alarm clock that would wake it up an infinite number of times on a finite amount of energy. When a colleague and I pointed this out, Dyson came up with a cute alarm clock that could actually do this, but then we argued that this alarm clock would eventually fall apart because of quantum mechanics."

So, regardless of the fate of the cosmos, things look pretty hopeless for intelligent life in the long run. But I should remember, Krauss said, that the long run is a very long time. He told me about a meeting he attended at the Vatican a few years back on the future of the universe: "There were about 15 people, theologians, a few cosmologists, some biologists. The idea was to find common ground, but after three days it was clear that we had nothing to say to one another. When theologians talk about the 'long term,' raising questions about resurrection and such, they're really thinking about the short term. We weren't even on the same plane. When you talk about 10^50 years, the theologians' eyes glaze over. I told them that it was important that they listen to what I had to say—theology, if it's relevant, has to be consistent with science. At the same time I was thinking, 'It doesn't matter what you have to say, because whatever theology has to say is irrelevant to science.'"

At least one cosmologist I knew of would be quite happy to absorb theology into physics, especially when it came to talking about the end of the universe. That's Frank Tipler, a professor at Tulane University in New Orleans. In 1994 Tipler published a strangely ingenious book called The Physics of Immortality, in which he argued that the Big Crunch would be the happiest possible ending for the cosmos. The final moments before universal annihilation would release an infinite amount of energy, Tipler reasoned, and that could drive an infinite amount of computation, which would produce an infinite number of thoughts—a subjective eternity. Everyone who ever existed would be "resurrected" in an orgy of virtual reality, which would correspond pretty neatly to what religious believers have in mind when they talk about heaven. Thus, while the physical cosmos would come to an abrupt end in the Big Crunch, the mental cosmos would go on forever.

Was Tipler's blissful eschatological scenario—which he called "the Omega Point"—spoiled by the news that the cosmos seemed to be caught up in a runaway expansion? He certainly didn't think so when I talked to him. "The universe has no choice but to expand to a maximum size and then contract to a final singularity," he exclaimed in his thick Southern drawl. (He's a native of Alabama and a self-described "redneck.") Any other cosmic finale, he said, would violate a certain law of quantum mechanics called "unitarity." Moreover, "the known laws of physics require that intelligent life persist until the end of time and gain control of the universe." When I mentioned that Freeman Dyson (among others) could not see why this should be so, Tipler shouted in exasperation, "Ah went up to Princeton last November and ah tode him the argument! Ah tode him!" Then he told me, too. It was long and complicated, but the nub of it was that intelligent beings must be present at the end to sort of massage the Big Crunch in a certain way so that it would not violate another law of quantum mechanics, the "Bekenstein bound." So, our eternal survival is built into the very logic of the cosmos. "If the laws of PHEE-ysics are with us," he roared, "who can be against us?"

Tipler's idea of an infinite frolic just before the Big Crunch was seductive to me—more so, at least, than Dyson's vision of a community of increasingly dilute Black Clouds staving off the cold in an eternal Big Chill. But if the universe is in a runaway expansion, both are pipe dreams. The only way to survive in the long run is to get the hell out. Yet how do you escape a dying universe if—as little Alvy Singer pointed out—the universe is everything?

A man who claims to see an answer to this question is Michio Kaku. A theoretical physicist at City College in New York, Kaku looks and talks a bit like the character Sulu on Star Trek. (He can be seen in the recent Michael Apted film about great scientists, Me and Isaac Newton.) He is not the least bit worried about the fate of this universe. "If your ship is sinking," he said to me, "why not get a lifeboat and leave?" We earthlings can't do this just yet, Kaku observed. That is because we are a mere Type 1 civilization, able to marshal the energy only of a single planet. But eventually, assuming a reasonable rate of economic growth and technological progress, we will graduate to being a Type 2 civilization, commanding the energy of a star, and thence to being a Type 3 civilization, able to summon the energy of an entire galaxy. Then space-time itself will be our plaything. We'll have the power to open up a "wormhole" through which we can slip into a brand new universe.

"Of course," Kaku added, "it may take as long as 100,000 years for such a Type 3 civilization to develop, but the universe won't start getting really cold for trillions of years." There is one other thing that the beings in such a civilization will need, Kaku stressed to me: a unified theory of physics, one that would show them how to stabilize the wormhole so it doesn't disappear before they can make their escape. The closest thing we have to that now, superstring theory, is so difficult that no one (with the possible exception of Ed Witten) knows how to get it to work. Kaku wasn't the least bit gloomy that the universe might be dying. "In fact," he said, "I'm in a state of exhilaration, because this would force us, really force us, to crack superstring theory. People say, 'What has superstring theory done for me lately? Has it given me better cable TV reception?' What I tell them is that superstring theory—or whatever the final, unified theory of physics turns out to be—could be our one and only hope for surviving the death of this universe."

Although other cosmologists were rudely dismissive of Kaku's lifeboat scenario—"a good prop for a science fiction story," said one; "somewhat more fantastical than most of Star Trek," remarked another—it sounded good to me. But then I started thinking. To become a Type 3 civilization, one powerful enough to engineer a stable wormhole leading to a new universe, we would have to gain control of our entire galaxy. That means colonizing something like a billion habitable planets. But if this is what the future is going to look like, then almost all the intelligent observers who will ever exist will live in one of these billion colonies. So, how come we find ourselves sitting on the home planet at the very beginning of the process? The odds against being in such an unusual situation—the very earliest people, the equivalent of Adam and Eve—are a billion to one.



Does the End of the Universe Matter?


My vague qualm about the unlikeliness of Kaku's lifeboat theory was considerably sharpened when I talked to J. Richard Gott III, an astrophysicist at Princeton University. Gott is known for making bold quantitative predictions about the longevity of things—from Broadway shows like Cats to America's space program to intelligent life in the universe. He bases these predictions on what he calls the Copernican Principle, which says, in essence: You're not special. "If life in the universe is going to last a long time, why do we find ourselves living when we do, only 13 billion years after the beginning?" Gott said to me, speaking in an improbable Tennessee accent whose register occasionally leapt up an octave, like Don Knotts'. "And it is a disturbing fact that we as a species have only been around for 200,000 years. If there are going to be many intelligent species descended from us flourishing in epochs far in the future, then why are we so lucky to be the first?" Doing a quick back-of-the-envelope calculation, Gott determined that it was 95 percent likely that humanity would last more than 5,100 years but would die out before 7.8 million years (a longevity that, coincidentally, is quite similar to that of other mammal species, which tend to go extinct around 2 million years after appearing). Gott was not inclined to speculate on what might do us in—biological warfare? asteroid collision? nearby supernova? sheer boredom with existence? But he did leave me feeling that the runaway expansion of our universe, if real, was the least of our worries.
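Gott's back-of-the-envelope figures can be reproduced from the Copernican Principle alone. Assuming we observe humanity at a random moment in its total lifespan, then with 95 percent confidence we are in neither the first nor the last 2.5 percent of it, which turns a past duration of 200,000 years into roughly the 5,100-year and 7.8-million-year bounds he quotes. A hedged sketch of the arithmetic (the function name is my own, not Gott's):

```python
# Gott's "Copernican Principle" estimate: if the present moment is a
# random sample from a phenomenon's total lifetime, then with the given
# confidence we lie outside both tails of that lifetime. The future
# duration is therefore bounded by simple ratios of the past duration.

def gott_interval(past_duration, confidence=0.95):
    """Return (lower, upper) bounds on future longevity."""
    tail = (1.0 - confidence) / 2.0               # mass in each excluded tail
    lower = past_duration * tail / (1.0 - tail)   # case: we're near the end
    upper = past_duration * (1.0 - tail) / tail   # case: we're near the start
    return lower, upper

low, high = gott_interval(200_000)  # humanity's age in years
print(f"{low:,.0f} to {high:,.0f} years")
```

With a 95 percent confidence level the tail ratios work out to 1/39 and 39, so 200,000 years of past existence yields a future of between about 5,100 and 7.8 million years—matching the figures in Gott's calculation above.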

Despite the pessimistic tenor of Gott's line of thought, he was positively chirpy in conversation. In fact, all the cosmologists I had spoken to so far had a certain mirthfulness about them when discussing eschatological matters—even Lawrence Krauss, the one who talked about this being the worst of all possible universes. ("Eschatology—it's a great word," Krauss said. "I had never heard of it until I discovered I was doing it.") Was no one made melancholy or irritable by the prospect of our universe decaying into nothingness? I thought of Steven Weinberg, the Nobel laureate in physics who, in his 1977 book about the birth of the universe, The First Three Minutes, glumly observed, "The more the universe seems comprehensible, the more it also seems pointless." It was Weinberg's pessimistic conclusion in that book—he wrote that civilization faced cosmic extinction from either endless cold or unbearable heat—that had inspired Freeman Dyson to come up with his scenario for eternal life in an expanding cosmos.

I called Weinberg at the University of Texas, where he teaches. "So, you want to hear what old grumpy has to say, eh?" he growled in a deep voice. He began with a dazzling theoretical exposition that led up to a point I had heard before: No one really knows what's causing the current runaway expansion or whether it will continue forever. The most natural assumption, he added, was that it would. But he wasn't really worried about the existential implications. "For me and you and everyone else around today, the universe will be over in less than 10^2 years," he said. In his peculiarly sardonic way, Weinberg seemed as jolly as all the other cosmologists. "The universe will come to an end, and that may be tragic, but it also provides its fill of comedy. Postmodernists and social constructivists, Republicans and socialists and clergymen of all creeds—they're all an endless source of amusement."

It was time to tally up the eschatological results. The cosmos has three possible fates: Big Crunch (eventual collapse), Big Chill (expansion forever at a steady rate), or Big Crackup (expansion forever at an accelerating rate). Humanity, too, has three possible fates: eternal flourishing, endless stagnation, or ultimate extinction. And judging from all the distinguished cosmologists who weighed in with opinions, every combination from Column A and Column B was theoretically open. We could flourish eternally in virtual reality at the Big Crunch or as expanding black clouds in the Big Chill. We could escape the Big Crunch/Chill/Crackup by wormholing our way into a fresh universe. We could face ultimate extinction by being incinerated by the Big Crunch or by being isolated and choked off by the Big Crackup. We could be doomed to endless stagnation—thinking the same patterns of thoughts over and over again, or perhaps sleeping forever because of a faulty alarm clock—in the Big Chill. One distinguished physicist I spoke to, Andrei Linde of Stanford University, even said that we could not rule out the possibility of there being something after the Big Crunch. For all of the fascinating theories and scenarios they spin out, practitioners of cosmic eschatology are in a position very much like that of Hollywood studio heads: Nobody knows anything.

Still, little Alvy Singer is in good company in being soul-sick over the fate of the cosmos, however vaguely it is descried. At the end of the 19th century, figures like Swinburne and Henry Adams expressed similar anguish at what then seemed to be the certain heat-death of the universe from entropy. In 1903 Bertrand Russell described his "unyielding despair" at the thought that "all the labors of the ages, all the devotion, all the inspiration, all the noonday brightness of human genius, are destined to extinction in the vast death of the solar system, and that the whole temple of Man's achievement must inevitably be buried beneath the debris of a universe in ruins." Yet a few decades later, he declared such effusions of cosmic angst to be "nonsense," perhaps an effect of "bad digestion."

Why should we want the universe to last forever, anyway? Look—either the universe has a purpose or it doesn't. If it doesn't, then it is absurd. If it does have a purpose, then there are two possibilities: Either this purpose is eventually achieved, or it is never achieved. If it is never achieved, then the universe is futile. But if it is eventually achieved, then any further existence of the universe is pointless. So, no matter how you slice it, an eternal universe is either a) absurd, b) futile, or c) eventually pointless.

Despite this cast-iron logic, some thinkers believe that the longer the universe goes on, the better it is, ethically speaking. As John Leslie, a cosmological philosopher at the University of Guelph in Canada, told me, "This is true simply on utilitarian grounds: The more intelligent happy beings in the future, the merrier." Philosophers of a more pessimistic kidney, like Schopenhauer, have taken precisely the opposite view: Life is, on the whole, so miserable that a cold and dead universe is preferable to one teeming with conscious beings.

If the current runaway expansion of the cosmos really does portend that our infinitesimal flicker of civilization will be followed by an eternity of bleak emptiness, then that shouldn't make life now any less worth living, should it? It may be true that nothing we do in A.D. 2004 will matter when the burnt-out cinder of our sun is finally swallowed by a galactic black hole in a trillion trillion years. But by the same token, nothing that will happen in a trillion trillion years matters to us now. In particular (as the philosopher Thomas Nagel has observed), it does not matter now that in a trillion trillion years nothing we do now will matter.

Then what is the point of cosmology? It's not going to cure cancer or solve our energy problems or give us a better sex life, obviously enough. Still, it is bracing to realize that we live in the first generation in the history of humanity that might be able to answer the question, How will the universe end? "It amazes me," Lawrence Krauss said, "that, sitting in a place on the edge of nowhere in a not especially interesting time in the history of the universe, we can, on the basis of simple laws of physics, draw conclusions about the future of life and the cosmos. That's something we should relish, regardless of whether we're here for a long time or not."

So, remember the advice offered by Monty Python in their classic "Galaxy Song." When life gets you down, the song says, and you're feeling very small and insecure, turn your mind to the cosmic sublimity of the ever-expanding universe—"because there's bugger-all down here on Earth."

Jim Holt writes the "Egghead" column for Slate. He also writes for The New Yorker and the New York Times Magazine.

Copyright © 2004 Slate Magazine

Paul Burka Is Ambivalent About W; (F&B) R&R Isn't So Sanguine

This piece popped up in the same issue of Texas Monthly as the latest from the Kinkster. Paul Burka makes a lot of sense, but he doesn't take W to task for the humongous deficit in the state budget as W skipped off to DC. A modern state cannot succeed without taxes. The money must come from somewhere. Increasing the sales tax is criminal. Avoiding an income tax is criminal. Tax cuts are a gigantic shell game. Ultimately the money must be found somewhere. In California, the voters just imposed taxes on themselves in terms of issuing state bonds. The steroid-produced governor of California accomplished this feat by avoiding the T-word. Finally, there are a couple of clinkers in Burka's piece: principles when he meant principals and attributing "Every man his own historian" to a Rice University history professor. Carl L. Becker wrote those words in his presidential address to the American Historical Association in 1931. If this is (fair & balanced) quibbling, so be it.



[x Texas Monthly]
The Man Who Isn't There
In 2000 I thought Texas governor George W. Bush was the best person to lead this country. He still would be—except the guy in the White House is a different George W. Bush. And that's why I'm so ambivalent about reelecting him.
by Paul Burka

I NEVER EXPECTED TO KNOW A PRESIDENT of the United States. I had met several presidents-to-be, the first being Richard Nixon during a family vacation in Washington when I was barely a teenager. He was vice president then, and he was eating breakfast in the restaurant of the hotel where we were staying, talking with a senator from California. My mother sent me over to get his autograph. Later, I saw Jimmy Carter campaigning at the Texas Capitol the day before he defeated Lloyd Bentsen in the 1976 Texas primary. Mine was one of many hands he grabbed on his way into the House chamber. In November 1979 I interviewed the elder George Bush as he was gearing up for the 1980 Republican primary race, which he lost to Ronald Reagan. I knew he was doomed when I rode with him to the Houston airport, which now bears his name, and only one person in the terminal recognized him.

Then, for six years, I saw George W. Bush up close. I really didn't have personal contact with him that often—every few months, I would say—but when I did see him, it was quality time. In 35 years of hanging around the Capitol, as a staffer and as a journalist, I have never seen anyone that good at the game of politics. It was impossible to be around the guy and not like him. He filled a room. He was always himself. He said what he thought. He had the ability to let down his guard without losing the dignity of "I am your governor." Not the governor—your governor. I never had a bad interview with him. Once he told me that he was going to beat Al Gore because "I know who I am and he doesn't know who he is." On his last night as governor, he hosted the annual reception for the Capitol press corps. National reporters who were on the Bush beat were still in town because of the long agony over the Florida vote, and let me tell you, nobody missed that party. Even the cameramen showed up. When I went up to him in the reception line, I handed him a note in which I thanked him for being incredibly generous with his time, access, and candor, and I told him that covering him had been the best experience of my professional life.

A lot of people will wince at that anecdote—including my editor, not to mention certain readers and colleagues who thought I was, to put it bluntly, in the tank for Bush. Methinks I protest too much, but I ask you, does this sound in the tank? "The governor sided with insurance companies over doctors, employers over employees, and concerns about the cost of care over concerns about the quality of care." That was my reaction to his misguided veto in 1995 of the Patient Protection Act, which was designed to curb the abuses of managed care. In 1999 I wrote that Bush's proposal for an across-the-board property tax cut was "in deep trouble" because his attention had been diverted by the presidential race. I even quoted a Republican legislator as saying, "You can get in to see him, if you're from Iowa." I encountered the governor at the Capitol on the day the story appeared, and when I tried to shake his hand, he pulled it back.

But did Bush generally come across well in my stories? Sure. When there was something negative to write, I wrote it, but aside from occasional disagreements over issues, there wasn't a lot to be negative about. He had all the qualities of a great governor. He was a strong and popular leader. He had a mesmerizing personality. He was a uniter, not a divider—a centrist who fought the extremists in his own party. He had the courage to tackle the most important issues: public education and the tax structure. He had a great staff. He made appointments based on ability, not litmus tests. He had the decency to stay above petty politics. He was motivated by the public interest, not ideology. It's not "in the tank" if it's the truth. The defensiveness rests.

But I'm speaking of Governor George W. Bush, the man I voted for in 1998 and 2000, not President George W. Bush. They seem to me to be two different people—not entirely so, but enough that there is cause for worry. I don't regret my vote in 2000; if 9/11 had to happen, I'm glad that it happened on his watch. He has captured Saddam Hussein and will never rest until the same fate is ensured for Osama bin Laden. But the sundering of the country along geographical and ideological lines into the political map of Red America and Blue America accelerated on his watch, and it started well before 9/11. I would never have imagined that the person I knew would have been characterized in a Time cover story as the "Great Polarizer." Or that he would kowtow to the extremists in his party. Or that he would allow his vice president to cast a shadow on his administration's integrity by maintaining secrecy on energy planning. Or that his advisers would be at war not just with terrorists but with each other. What happened to Governor George W. Bush? Where is the guy we sent to Washington?

THERE'S A REASON THIS STORY IS WRITTEN in the first person: My purpose here is not to persuade you about how to judge the Bush presidency but to persuade myself. I have no party loyalty or ideological anchor to rely on as a guide; I tend to vote according to which candidate seems to me to be the best match for the spirit of the times. I voted for Ronald Reagan and for Bill Clinton. Even if I hadn't known George W. Bush, I probably would have voted for him, because the times in 2000 demanded an end to the divisiveness of the Clinton years, and Bush had a far better chance to achieve it (or so I thought) than Al Gore did. It even crossed my mind to go to Washington and observe at close range his efforts to change the tone of the city, as he promised. (There was a bad career choice.) Instead, I stayed in Texas and spent the next three years discussing with my colleagues at Texas Monthly how to cover a Texas president from afar, relying for most of our information on the national media, which I don't particularly like or trust. Eventually we decided that the main thing we can contribute to the national debate is to compare Governor Bush with President Bush: How is he the same, how is he different, and why is it significant?

The two characters began to diverge in the summer of 1998, with the appearance of the polls showing him as the front-runner for the Republican nomination in 2000, an election that was then more than two years distant. In the beginning, he was a reluctant candidate, so much so that I had doubts about whether he would be an effective president. He had no hunger for the job, no clue about what he wanted to do with it. There was so much about it he didn't like: the culture of Washington, the national media, the isolation, the feeling of being a specimen under a microscope. That reluctance exposed itself during the early part of his presidential campaign. I saw him speak at the Iowa straw poll in the summer of 1999, and he seemed uncomfortable and unsure of himself. As good as he was in an intimate setting, he wasn't the same person in a big space. A few months later I saw him in New Hampshire, visiting a company that made outdoor clothing, and he was awful. The audience filed into a small auditorium to the sound of country music. I didn't think it was smart to be too Texan in New England. Hell, they didn't even want us to join the Union. He spoke on a low stage with a stand-up microphone, and he had to bend forward to get down to it, making himself look all arms and knees, awkward and unpolished. His speech was so generic it could have been given in Amarillo. An ad-lib about his Texas Rangers' getting beat by the Red Sox drew the only round of applause. His issues were esoteric—missile defense, social security privatization, military reorganization, and tax cuts that the employees who made up most of the audience knew wouldn't help them. Only after John McCain whipped him in the primary did he finally act like he cared. He had gone from wanting not to lose to wanting to win.

I wasn't the only one wondering what had happened to the George W. Bush I thought I knew; Republicans around the Texas Capitol were asking the same thing. His advisers' penchant for keeping him on message and avoiding any mistakes submerged his personality. But I had other concerns. How could the guy who the right wing in Texas had blocked from being the leader of the state's delegation to the 1996 Republican National Convention have gone to Bob Jones University and embraced the very worst elements of the right wing, people who had openly loathed his father? Karen Hughes, his communications director, told me at the time that after their decisive defeat in New Hampshire, they needed an enthusiastic crowd in South Carolina, and where else could they be sure of finding one? I wonder if Bush understood the negative symbolic importance of that appearance to his core constituency in the political center. In retrospect, this was a pivotal moment for him in two ways: It proved that winning mattered to him after all, and it threw him into the clutches of the right. And, given the militancy of that wing of the Republican party and Bush's own belief that he must avoid his father's mistake of alienating them, it meant that he would be locked permanently into reciprocating the embrace for the rest of his candidacy and at least the first term of his presidency.

Governor Bush had all but disappeared, to be replaced by a stiff and scripted fellow called Nominee Bush. I remember having lunch with Hughes in May, after the nomination was wrapped up, and she said, How do we get the country to see what he's really like? I suggested taking up Al Gore's challenge to debate every week; put him next to Gore and the country will like him better. But it wasn't in the script. In the fall, when Bush fell behind Gore, the campaign was still trying to avoid debates. I had more confidence in him than they did; I knew he was going to beat Gore head to head. He is the most competitive person you ever saw; every encounter is a joust. His zest for banter is well known by now, but the first time I experienced it was at a Texas A&M football game. It rained and rained and then it rained harder, and I had to give my raincoat to my two boys, so by halftime I was drenched. My hair was dripping water, my clothes were soaked through, and water squished out of my shoes as I traipsed up the stairs to the concession stands, a route that took me right by the VIP seats. A shout rang out: "Burka!" It was Bush, taking utter delight in my misery. "You're wet! Don't you know it's raining?"

I was quite surprised at the way Bush came to be viewed in the campaign. The Saturday Night Live caricature sums it up: a not-too-bright playboy. It would never have occurred to me (or anyone else who dealt with him at the Capitol) to think of Bush as dumb or lacking gravitas. He was both fluent and knowledgeable about the things a governor needed to know—his issues and Texas politics generally. His real forte was people and the political process. He had an unerring instinct for knowing how others really felt about him and how to win them over.

He had his shortcomings. Who doesn't? Chief among them was his narrow focus; if something wasn't on his radar screen, like higher education or the environment, forget about it. This quality is more of a problem for a president, who is expected to have a position on everything, than for a governor. Bush picked out a few things he was interested in, pressed for his agenda, and seldom interfered with the rest. The attacks on Bush's record by the Democrats and the national media were true but not accurate. Yes, Texas leads the nation in air pollution, executions, and children without health insurance, but we were that way before Bush was governor, and we didn't change under all those Democratic governors, including Ann Richards.

BUSH WAS EXTREMELY LUCKY. RICHARDS faced budget and school-finance crises before him and Rick Perry faces budget and school-finance crises after him, but he faced neither. However, he has not been a lucky president; indeed, history dealt him the worst hand of any incoming president since Lincoln. He took office after an acrimonious election in which he lost the popular vote and was declared president rather than elected. The economy was sinking toward recession. Then, not quite eight months into his presidency, two jets brought down the World Trade Center, killed more than two thousand Americans, and sent the country into shock—at war with one enemy most people didn't know existed and, eventually, with another many didn't think it was necessary to fight.

Even before 9/11, I thought Bush was headed in the wrong direction. I worried that his $1.6 trillion tax cut was excessive, and in one aspect—the repeal of the inheritance tax—a huge mistake. Maybe the amount could be justified. The combination of tax cuts, deficits, and low interest rates is a textbook policy for stimulating the economy. Clinton chose to raise taxes on the wealthy to reduce the deficit, which is a different kind of textbook response: It frees up credit by taking the government out of the borrowing business. I don't know which textbook to believe, and I suspect that it is a matter of faith rather than science. At any rate, no one should be shocked that a Democrat would raise taxes on the wealthy or that a Republican would reduce them. But the repeal of inheritance taxes will do real harm to the country, which is why several Rockefellers and other zillionaires signed an ad in the New York Times opposing it. It would have made so much more sense to raise the ceiling enough to protect family businesses and parents who want to leave money for their grandkids to go to college. Not only will the repeal harm philanthropy by removing the tax-avoidance incentive for people to create foundations, but it will also remove the barrier to the creation of a permanent aristocracy in this country.

But the Bush policy that baffled me the most was, and is, his administration's unrelenting attack on the environment. I understand why he wanted to go easy on dirty refineries and power plants: In a recession, he wasn't going to eliminate a single job. But why did he want to spend his political capital on drilling in the Arctic National Wildlife Refuge? Why identify himself as the oil president, an image that has undercut his Iraq policy—especially since the ANWR reserves are barely a drop in the bucket of our energy needs? Why did the administration suspend the last-minute Clinton rule reducing arsenic in drinking water? Who thought that was a good idea? In the end, the administration restored the Clinton reductions, but the PR damage was done.

Then there was his claim to be "a uniter, not a divider." Right out of the box, the White House got crosswise with Senator Jim Jeffords, a Vermont Republican, over policy and, some say, personal differences; he became an independent and the GOP lost its majority in the Senate. Bush's relationship with Senate majority leader Tom Daschle was frigid, in contrast to the one he had enjoyed with Democratic leaders Bob Bullock and Pete Laney back in Texas. (The administration's version is that Daschle said one thing in their private meetings and something totally different to the media; to Bush, that's tantamount to lying.) I don't want to be naive here: A new sheriff can't expect to ride into town and clean things up overnight. Partisanship is built into the structure of Congress. Still, Bush had talked, both in the campaign and in interviews with me, about wanting to change the political climate of Washington. It seems to me that he didn't try very hard. The House Republicans and their divider-not-a-uniter majority leader, Tom DeLay, were as much opposed to bipartisanship as the Senate Democrats. Unless DeLay could be detoxified, the political climate of Washington would remain the same. But Bush didn't have to take on DeLay to claim the political center. The Democrats let him have it by default by moving to the left, both inside the Beltway, where their House caucus chose Nancy Pelosi, of California, as their minority leader, and outside, where Howard Dean emerged as the front-runner for the party's 2004 presidential nomination.

Another issue I found troubling was the nexus between religion and politics. I'm a big believer in the First Amendment. I think the Bill of Rights—and the First Amendment in particular—represents the United States' greatest contribution to civilization: free speech, a free press, and separation of church and state. I'm not nutty about this. I don't see anything wrong with the display of the Ten Commandments on the Capitol grounds, but if the courts ultimately order it removed, I don't want some headline-grabbing judge to defy the law he has sworn to uphold. I was even willing to accept Bush's plan for faith-based social services; churches have a much better chance of success in dealing with drug and alcohol rehabilitation than government agencies do. If it works, I'm not going to throw away people's lives just because someone might have to listen to a denominational prayer. On the other hand, I was appalled by Bush's decision to limit federal support for stem cell research. The conflict between religion and science is an old one, going back at least to Galileo, and the church has almost always been on the wrong side. I understand why the issue mattered to Bush. He had made a campaign promise to the religious right not to allow federal funding for the research, and as I said, he had learned from his father the cost of alienating his political base. But a lot of Republicans favored federal funding because of its potential for human progress. I fail to understand how anyone who knows a lot about the issue could be against giving science a chance to cure terrible diseases. When the issue got hot, Bush had to back off and find a compromise. Still, in the light of history, he made the wrong choice.

ALL OF THESE CONCERNS LOOMED LARGE at the time, but they faded into the background on the morning of September 11, 2001. In a democracy, even decisions of war and peace must be made within a political framework. The framework for Bush was that his presidency was adrift before the calamity of 9/11; after the early successes of the education bill and the tax cuts, the rest of his legislative program had stalled. His job-approval rating was a lukewarm 50 percent. The political importance of 9/11 for George W. Bush cannot be overstated. It united the nation in tragedy. It provided him with an opportunity for leadership and created the one thing a leader needs most: followers. And it defined his presidency, giving him the sense of purpose that he had previously lacked.

In a way, Bush was repeating as president the evolution he had gone through in his personal life, when he stopped drinking and became a grown-up at age forty. As governor, he was supremely self-confident, a trait that his critics on the national stage would later see as cockiness or arrogance but which to me seemed to be something more profound: the result of having lived most of his life being less than pleased with himself and then turning his life around through faith and force of will. Now he had to prove himself again, this time to the world. Here's what he had to say about the way other world leaders viewed him, in an interview for Bob Woodward's book Bush at War: "I'm the toxic Texan, right? In these people's minds, I'm the new guy. They don't know who I am." They should have seen the letter he wrote in longhand to his father on the night of October 7, 2001, to thank him for all he had done and to tell him that he had ordered the bombing of Afghanistan to begin. I saw it in 2002 at the George Bush presidential library at Texas A&M, as part of an exhibit about the two father-and-son presidents, John and John Quincy Adams and George and George W. Bush. The bullhorn Bush had used when he spoke at the still-smoldering ruins of the World Trade Center was there and so was the New York City firefighter's pullover he had worn when he threw a strike before the first game of the World Series, at Yankee Stadium, but I found the letter to be the most compelling item, especially its conclusion: "I feel no sense of the so-called heavy burden of the office."

There it is: a one-sentence character sketch. Other presidents have agonized over hard choices—think of Lyndon Johnson in the early days of Vietnam, wanting to get out yet knowing that he couldn't—but not Bush. "The best thing he does is make decisions," his longtime political guru, Karl Rove, told me during the gubernatorial years. Rove went on to say that it was more important for a leader to make a decision and stick by it than that the decision be absolutely right. Bush is comfortable with the burdens of the office because he doesn't feel them the way others do: He never looks back, never second-guesses himself, never shows weakness, never admits a mistake, never reverses course. And it drives his critics crazy.

If you're going to stay the course, come what may, you had better be right, especially if the stakes are high and the odds are long. Even as governor, Bush was prone to roll the dice; he liked the big play. Following the old political rule that it is easier to ask for forgiveness than permission, he announced before his second legislative session in 1997, without consulting Bullock or Laney, that he wanted $1 billion of a $3 billion surplus (those were the days!) to be used for property tax relief. Bullock and Laney didn't like his unilateralism, but they went along. Even bolder, he led a drive that session to reform the state's tax structure by shifting property taxes to business taxes. "Now we'll find out: Can government act prior to a crisis?" he told me. Well, I knew the answer to that one: No. After the plan failed in the last days of the session, Bush told me, "One of these days the Legislature will wish they had passed it." (That day is now.) But Bush wasn't going to wait for the crisis to arrive. His reasoning for taking on tax reform was the same as his reasoning for invading Iraq: the preemptive strike. But tax reform is one thing, and regime change in Iraq is quite another.

"THE LOVE HIM, HATE HIM PRESIDENT" was the headline of Time's December 1 cover story about Bush. "He is the man about whom Americans feel little ambivalence," the story said. "People tend to love him or hate him without any complicating shades of gray." Hmmm. Am I all alone out here in a gray area? I certainly don't hate him. I found him to be a good man with decent instincts. Those who follow the national media don't hear this from journalists, but if you read books journalists have written about him—Woodward's Bush at War, Frank Bruni's Ambling Into History—his character and personality break through. I realize that there are millions of people in America, to say nothing of the rest of the world, who think that he deliberately lied about Iraq's weapons of mass destruction and ties to Al Qaeda, but I can't imagine that the person I knew, who campaigned on restoring honor and dignity to the White House, would deliberately lie to the American people. (I can believe it of Dick Cheney, whom I know only through the media, just as others can believe it of Bush, whom they know only in the same way.)

But I don't love him either. For one thing, I gave up loving politicians long ago. Politics is noble in conception but too often ignoble in practice; it puts expedience on public display. For another, I disagree with him about too many things, including the big one of war and peace. You might reasonably ask: Who am I to disagree with the president of the United States? Well, it's a free country. But more than that, I majored in history, and my favorite professor drummed into us the obligation to be judgmental, about the present as well as the past. "Every man his own historian," he would say.

So here's what this individual historian believes. First, when dealing with the rest of the world, America must abide by its basic principles, which, as Lincoln said just before the outbreak of civil war, in 1861, offer "hope to the world for all future time." He added, "If [the country] can't be saved upon that principle, it will be truly awful." Second, the most effective foreign policy for a great power is the one laid down almost one hundred years ago by Theodore Roosevelt: "Speak softly and carry a big stick."

On my personal scorecard, Bush is zero for two. What is especially sad about this is that he had the country overwhelmingly behind him after 9/11. The moment that everything changed came in the 2002 State of the Union address, when Bush identified an "axis of evil": Iraq, Iran, North Korea. So much for speaking softly. He could have said the same thing in many different ways; the way that he chose—boldness, always boldness—committed us to act, for a president does not employ words like "evil" casually. To use that label is to raise the stakes, because how can good (that's us) tolerate evil?

Let me be very clear about this: It's not the label itself I object to; these were all bad regimes. It's the use of it. It raises the basic question of whether America should be the world's moral policeman, ridding the globe of bad guys who pose no imminent threat to us except in what they might do at some future time. In other words, preemptive war; as Shakespeare's Brutus said, in determining to join the assassination plot against a too-ambitious Julius Caesar, "Then, lest he may, prevent." But does brandishing your intentions in public create a safer world—or a more dangerous one? In the case of North Korea, it is likely that the "axis of evil" speech created exactly what we feared, spurring that country to resume its nuclear weapons program. The speech and the policy it produced reopened the political divisions of the 2000 election by forcing each of us to decide what kind of values we expect our country to uphold. I am 100 percent in favor of hunting down terrorists to the ends of the earth and bringing them to justice. But it's not justice if the government can hold suspects indefinitely without charging them. I know the counterargument, that the Constitution is not a suicide pact. But requiring the government to have some evidence against suspects to produce in court is not tantamount to suicide. And it doesn't necessarily advance the war on terrorism to wage war on spec against states; indeed, it may cause us to shift our focus from the greater danger to a lesser one.

These are truly momentous issues. The president did not seek them out; they were forced upon him by 9/11. It's hard to blame him for going to the utmost lengths to protect the nation; that's his sworn duty. What concerns me is whether we can trust the decision-making apparatus around him. In the governor's office, Bush had advisers and top aides who were totally loyal to him. In the White House, he has advisers and top aides who have a long history of intellectual and ideological loyalty to specific policy positions. This is not to say that they are disloyal, just that they think the way ideologues always think—that their interests and the nation's interests (and the president's) are one and the same. One day after 9/11, the National Security Council met to plan the response. According to Bush at War (which is based on interviews with the principles), one of the first comments was by Defense Secretary Donald Rumsfeld, asking why we shouldn't go after Iraq, as his chief deputy, Paul Wolfowitz, famously supported. Colin Powell warned that neither the coalition America was seeking nor the American people wanted a war against Iraq. At first, Woodward writes, Bush worried about diluting the mission against Al Qaeda, but the proponents were relentless—in particular, Cheney, who repeatedly said that Saddam Hussein had weapons of mass destruction and ties to Al Qaeda, two assertions that remain unproved—and they must have known that the boss's proclivity for boldness would work in their favor. Once Bush decides to take a bite of the apple, it's going to be the biggest chunk he can sink his teeth into. The argument that the status quo in the Islamic world would not change unless America did something to change it would have appealed to him. Of all the reasons to oust Saddam, the boldest was to change the paradigm. I admire the play, and I hope it works, even as I doubt its likelihood of success and fear that it targeted a less dangerous enemy than Al Qaeda. Few things in international affairs are more risky than to view the world as you wish it to be, rather than as it is.

A COUPLE OF DAYS BEFORE CHRISTMAS, I went to Washington for an interview at the White House with a senior administration official, who imposed the condition of anonymity. My editor and I had talked by phone earlier in the day, and he had told me to stick to the big stuff. "If you bring up arsenic in drinking water, they'll laugh you out of the office," he said. The SAO and I exchanged pleasantries, and I mentioned that I had written only a couple of stories about Bush since he became president. "I know," the SAO said, tapping a red file folder resting on the table. It contained copies of what I had written: one pre-9/11 article expressing puzzlement over Bush's policies and another, from last fall, about Texas Supreme Court Justice Priscilla Owen and her nomination to the U.S. Fifth Circuit Court of Appeals. "You got it completely wrong," the SAO said. "But you didn't come here to talk about that." "That's okay," I said. "Let's talk about it." And the first thing the SAO brought up was . . . arsenic in drinking water: how they had no choice but to reexamine the scientific evidence or else the rule left for them by the Clinton administration would have been subject to a court challenge.

That's pretty much how the interview went. The SAO had plenty to say, but the old atmosphere, so impressive in my Texas interviews, of open and big-picture discussions was nowhere in evidence. This was all about staying on message—and the message was that the national media have consistently failed to report Bush's policies accurately. Earlier in the day, I had had lunch with Paul Begala, the Democratic consultant and co-host of CNN's Crossfire. He had told me that the Democrats felt lied to about Bush's education bill, which Bush did not fully fund, falling $15 billion short. The SAO's response to my mention of this widely reported issue was that the White House had increased education funding by 60 percent since 2000, the largest increase in history. I turned to Iraq: What about the slow pace of rebuilding? No such thing, was the answer—no food crisis, no breakdown in health care, electricity back to pre-war levels, no refugees.

I confess that this aggressive defense took me by surprise. Perhaps it shouldn't have. This White House is famous for its antipathy to the national media, and the president himself makes no secret that he neither reads nor watches the news. This goes back to the campaign; I remember one Bush aide telling me in 1999, as the media clamored to know Bush's stance on issues, that "we intend to run this campaign on our timetable, not the media's." Still, I was from Texas. They knew me. Didn't that make a difference? Well, those days are gone. When I got home that night, I told my wife about the interview and said, "I felt just like a member of the national media." She gave me her best you-idiot look and said, "You are a member of the national media."

The thing I most wanted to ask about was Bush's desire to change the culture of Washington and what had become of it. The SAO told me that the president never criticizes Democrats directly; he always says something like "some in Congress." The Democratic leadership, on the other hand, wants total war. The SAO told a story about a trade bill, giving more authority to the president, that a number of House Democrats had supported during the Clinton years but opposed when Bush wanted it. Bush met with the D's, but all but a handful rejected his overtures. One of the Democrats, the SAO said, complained that a Republican chairman had been mean to them, shutting them out of conference committees and heaping other indignities on them. The SAO presented this as a silly reason to be against a public policy issue, as if, What is the president supposed to do—call up the chairman and say, "Be nice to the Democrats?" In fact, if Bush is serious about changing the culture of Washington, I think that is exactly what he should do. "Look," he could say, "I'm trying to get reelected, trying to help us keep our majorities in Congress, trying to pass important legislation, trying to unite the country in the war on terrorism, and I don't need you guys screwing things up." But I don't think he's serious about it—not serious enough to do the hard stuff, like take on the petty princes in his own party.

I STARTED THIS ARTICLE BY SAYING that I never expected to know a president of the United States. The truth is, I don't know President Bush. The person I knew was Governor Bush. I really liked him. I still do. But I'm ambivalent about his alter ego. On the one hand, the issue that matters most to me is the safety of my family and my country, and I cannot imagine that anyone, Republican or Democrat, would be more resolute and vigilant than Bush; on the other, I disagree with so many things that he has done.

If I end up voting for him—and I probably will—it will really be Governor Bush who gets my vote. Why? Because hope springs eternal: my hope that in a second term, free from worries about reelection and with an undisputed electoral victory, he will reappear after a four-year sabbatical. I'm betting he's still around; we just haven't seen him for a while. He's a uniter, not a divider. He doesn't kowtow to the extremists in his party. He's serious about wanting to change the political climate. He's vigilant about not letting his team mislead him or taint his administration. He makes appointments based on ability, not litmus tests. You see, I knew that guy.

Copyright © 2004 Texas Monthly Magazine

Turn, Turn, Turn


One of my favorite folk-rock songs is "Turn, Turn, Turn."


To everything, turn, turn, turn
There is a season, turn, turn, turn
And a time to every purpose under heaven
A time to be born, a time to die
A time to plant, a time to reap
A time to kill, a time to heal
A time to laugh, a time to weep
To everything, turn, turn, turn
There is a season, turn, turn, turn
And a time to every purpose under heaven
A time to build up, a time to break down
A time to dance, a time to mourn
A time to cast away stones
A time to gather stones together
To everything, turn, turn, turn
There is a season, turn, turn, turn
And a time to every purpose under heaven
A time of love, a time of hate
A time of war, a time of peace
A time you may embrace
A time to refrain from embracing
To everything, turn, turn, turn
There is a season, turn, turn, turn
And a time to every purpose under heaven
A time to gain, a time to lose
A time to rend, a time to sew
A time to love, a time to hate
A time for peace, I swear it's not too late

words adapted from the Book of Ecclesiastes by Pete Seeger
music by Pete Seeger



Michael Kammen is in season. He has read all of the right stuff. If this is (fair & balanced) nostalgia, so be it.




[x CHE]
Summer, Fall, Winter, and Spring in American Culture
By MICHAEL KAMMEN

Some 25 years ago I began to be intrigued by depictions of the four seasons. They just kept surfacing hither and yon, in museums and public spaces -- all four seasons in suites, not just single pictures with titles like "Summer Haying" or "Winter Landscape." Gradually, my awareness expanded to sculpture, architectural ornaments, and the decorative arts -- works encountered in venues from galleries in Fort Worth, to venerable hôtels de ville in Paris, to seasonal pavilions surrounding elegant ponds in Suzhou, China. Once I was alert to it, the motif seemed to be ubiquitous, so I took notes on my serendipitous discoveries and filed them away.

In the late 1990s, an avocational whimsy became a compulsion. Using the computerized inventory of the Smithsonian Institution's Archives of American Art and the ingenious search engine ArchivesUSA, I was able to pursue my interest more systematically, tracking down sources that had previously seemed inaccessible. That also led me to a considerable literature devoted to the subject, ranging from fiction and poetry to almanacs and, most notably, the writing of naturalists, as well as to music evoking the changing seasons -- far more than just Vivaldi's famous suite of concerti (1726) and Haydn's great oratorio The Seasons (1801), but also works like Verdi's ballet suite "Les Quatre Saisons" (1851) and Joni Mitchell's folk song "The Circle Game" (1970).

When I finally decided to write a book devoted to the history of this motif, I faced an array of problems and issues unlike any I had encountered in previous projects. Because the seasonal motif dates back to antiquity and can be found in the literature and art of most cultures in the temperate zone around the world, I first had to figure out what balance to strike between concentrating upon an American story and taking into account its provenance and parallels elsewhere.

I did not want merely to compile an international encyclopedia of seasonal sequences. Equally important, I needed to see what methodological precedents existed in my own disciplines of history and American studies. Had anyone traced the trajectory of a major motif over time? Did models exist that could offer guidance on traps to avoid as well as paths to follow? What about the history of notions that have profoundly mattered to many Americans -- ones indicative of the values we have held most dear? Those questions led me to revisit an imposing group of benchmark books.

Most obvious were works that traced the history of an important concept: Perry Miller's two-volume classic The New England Mind (1939 and 1953), Albert K. Weinberg's Manifest Destiny: A Study of Nationalist Expansionism in American History (1935), and Arthur A. Ekirch Jr.'s The Idea of Progress in America, 1815-1860 (1944). Consulting such text-based works in intellectual history reminded me that I was working on a visual motif, not just an elusive idea. Miller's "model" was especially striking, however, because in the first of his volumes he offered a topical and synchronic view of Calvinist beliefs in late 16th- and 17th-century England and her colonies, whereas the second volume provided a diachronic, developmental account of how and why Calvinism got transformed by the exigencies of New World conditions. I would need to do something similar, since the American version of the four-seasons motif originated in the Old World but gradually underwent notable alterations here.

Several other classic works that influenced me are routinely identified as progenitors of the "myth and symbol" school of American studies: Henry Nash Smith's Virgin Land: The American West as Symbol and Myth (1950) and Leo Marx's The Machine in the Garden: Technology and the Pastoral Ideal in America (1964). The limitations of their approach began to be widely discussed by the 1970s -- above all, its tendency to stress the significance of certain texts to us rather than what the original authors wished to say. Still, an emphasis common to both books remains important, especially for my purposes: namely, their insistent attention to contradictions in American culture. As Marx observed of the pastoral ideal, "It enabled the nation to continue defining its purpose as the pursuit of rural happiness while devoting itself to productivity, wealth, and power."

Since the late 20th century, a new genre of works has gone further in exploring the relationship between visual and literary "texts," and I revisited Sarah Burns's Pastoral Inventions: Rural Life in Nineteenth-Century American Art and Culture (1989), which extended several of Marx's major insights into the realm of paintings. As Burns observed, "The image of the good farm in the 19th century was the offspring of the marriage of European pastoral traditions with Thomas Jefferson's agrarian politics." That kind of convergence supplied a sensible reminder about cultural fusion. Also pertinent, Merrill Schleier's The Skyscraper in American Art, 1890-1931 (1986) highlighted the critical response to a new and native structural form, but with special reference to the tension between tradition and innovation during the late 19th and early 20th centuries.

Three other interpretive inquiries added more clues to the puzzle I was beginning to piece together. Reading Roderick Nash's Wilderness and the American Mind (first published in 1967, with two revised editions since then) enhanced anew my appreciation for aspects of the American "Gospel of Nature," the title of a widely noted essay by the naturalist John Burroughs, prominent among my emerging dramatis personae. Burroughs's impassioned love of nature and its inspirational gifts offered solace to many Americans at a time when so many were wracked by doubt, during what has been dubbed the spiritual crisis of the gilded age.

Quentin Anderson's The Imperial Self: An Essay in American Literary and Cultural History (1971) highlighted the American flight from society into the "empire of the self." In the writings of Ralph Waldo Emerson and Henry David Thoreau, a heightened sense of self in relation to the natural world became a prominent trope. Finally, The Moving American (1973), George W. Pierson's investigation of mobility, provided a reminder of seasonal variability across the vast diversity of the United States. The obvious but critical fact that the seasons are not experienced in the same way in Maine, Missouri, Texas, and California would have to be emphasized in my study.

The four-seasons motif arrived in America in the 17th century. Before industrialization, it remained largely derivative from European perceptions and rarely innovative. Benjamin Franklin and Thomas Jefferson, for example, were spellbound by "The Seasons," written by the 18th-century Scottish poet James Thomson. Often considered the first book-length poem in English to take nature as its subject, "The Seasons" found in nature evidence of God's plan, helping to reconcile the existence of good and evil.

With the onset of industrialization and urbanization during the 19th century, as fewer Americans lived close to the land, what might be called a "flattening" of the seasons occurred. Seasonal change ceased to affect human lives to the same degree that it once did. The advent of canned foods at the turn of the 20th century transformed the American diet and liberated people from longstanding seasonal constraints. During the 1920s, the Caterpillar company advertised that, with its new snow-removal equipment, "January can be just like July" -- at least in terms of driving automobiles, and especially in American cities. I discovered, nonetheless, that engagement with the four-seasons motif did not diminish. Instead, its appearance in art, music, poetry, and other forms of literature, including the writings of naturalists, actually increased during the 20th century. I have tried to provide a window on the reasons why.

In the 19th century, national chauvinism mattered a great deal: Americans earnestly believed that the seasons and seasonal change were more spectacular in their country than anywhere else. The leading cultural journal at mid-century, The Crayon, quoted a representative response to Jasper F. Cropsey's dramatic painting, "Autumn on the Hudson": "They who know the aspect of Nature in the autumn in England only, have no notion of the glorious garb she elsewhere puts on at that time. In America, the woods are all ablaze." On the eve of the worst sectional struggle in American history, it was not surprising that national chauvinism enjoyed a great surge.

In an echo of the contradictory impulses that Smith and Marx had noted, nostalgia, and a sense of American loss, also mattered, because so many people felt increasingly out of touch with the natural world and with the lifestyle of preceding generations. Ambivalence about modernity played an important part in the attraction to seasonal invocations. Consider the pictorial splendors of Victorians like Cropsey (who created at least a dozen sets of seasonal images, more than any other American painter) and the beloved lithographs of Currier and Ives. The earliest of Currier's images, in 1855, did not emphasize seasonal distinctiveness in the United States to an unusual degree, but within a dozen years new ones did, highlighting romanticized scenes of rustic work and play.

Consider, too, the seasonal essays of such Brahmins as Oliver Wendell Holmes Sr. and James Russell Lowell, as well as the enormous popularity of books and essays by Burroughs and John Muir, founder of the Sierra Club. When Holmes lamented, in his 1868 essay on "The Seasons," that too few people paid adequate attention to seasonal change, he was primarily addressing urban dwellers like himself.

The advent of cultural modernism in America early in the 20th century offered fresh challenges and new ways to think about the seasons in metaphorical and allegorical terms. Whereas creative people had previously been most intrigued by the presentation of seasonal peaks, increasing attention came to be devoted to seasonal transitions. Allusions to change became a prominent metaphor. The pace of life seemed to accelerate as the 20th century progressed, and so did the production of books, essays, and art of every type depicting the seasons, quite often as a metaphor for the human life cycle. As in earlier periods, the changes were manifest in high as well as mass culture: in Charles Burchfield's vivid watercolors and the seasonal calendars of Norman Rockwell, in the challenging but moving poetry of Wallace Stevens and W.D. Snodgrass, in the nature writings of Hal Borland, Edwin Way Teale, and Joseph Wood Krutch.

When Jasper Johns completed his famous suite "The Seasons" in 1987, he allowed his dealer, Leo Castelli, to sell three of them. He retained "Autumn" for his personal collection, however, because he felt he had been in the autumn of his life when he created the series. In 1964 the magic realist Peter Blume began his seasonal suite, and in 1972 when Marc Chagall was persuaded to accept a commission to make a monumental mosaic block as public art for downtown Chicago, he selected the four seasons because of the universal appeal of the theme.

By the close of the 20th century, new developments in the biological sciences began to provide information about the impact of seasonal change on human physiology. Women are more notably susceptible than men to Seasonal Affective Disorder during the dark months from November until March, and schoolboys are thought to be more likely to do well on standardized tests in the spring than in the fall because of seasonal variations in their testosterone levels. Those findings have practical implications for understanding gendered differences in the human condition and the ways in which such differences vary over the course of the annual cycle. Seasonal art by women artists like Anne Abrons, Jennifer Bartlett, and Lisa Zwerling is fairly explicit about the link between sex and seasonality.

Thoreau, who loved both seasonal peaks and transitions, organized his great classic, Walden (1854), according to the cycle of the seasons, beginning with spring, when he built his cabin, and then moving on to the sociability that came with curious visitors in summer, the growing isolation of fall, which also provided him with occasions for ecstasy about the stunning beauty of North American foliage, and the quiet of winter, when he could enjoy solitude for productive writing as well as long walks. Eventually, Walden came full cycle with the renewal of life at the advent of spring.

Thoreau's justly revered Journals, published in their entirety just a century ago, are jampacked with seasonal observations, including an entry on June 11, 1851: "No one, to my knowledge, has observed the minute differences in the seasons." A book that did so, he wrote, would be "a Book of the seasons, each page of which should be written in its own season & out-of-doors, or in its own locality wherever it may be." That is just what I have tried to do, except for Thoreau's admonition to write out-of-doors. My work required electrical outlets as well as aesthetic ones.

Michael Kammen is a professor of American history and culture at Cornell University. His most recent book is A Time to Every Purpose: The Four Seasons in American Culture, to be published this month by the University of North Carolina Press.

Copyright © 2004 by The Chronicle of Higher Education