Tuesday, May 31, 2005

The Plotz Thickens!

Give me a break! Some NPR listener from Georgia wants Plotz to sample Alabama barbecue, Georgia barbecue, and South Carolina barbecue. Hell, why doesn't Plotz do all 50 states? Talk about heartburn! If this is (fair & balanced) gourmandry, so be it.

[x Slate]
Well-Traveled: Texas BBQ? Not So Fast ...
On Monday, NPR's Alex Chadwick ("Day To Day") spoke with Slate's David Plotz, who, after exhaustive research, declared Texas the home of the nation's best barbecue. Chadwick reads a response from a listener who begs to differ. Listen to the segment here.

Copyright © 2005 Slate

Copyright © 2005 National Public Radio

Sunday, May 29, 2005

Ground Nada, Not Zero

Frank Rich nails the key issue of the day. I tormented my students at the Collegium Excellens by asking them — after 9/11 — if they were a 9/10 person (unaffected by the terrorist attacks) or a 9/11 person (changed by the terrorist attacks). Almost from the beginning, most students admitted that they were 9/10 people pursuing business as usual. If this is (fair & balanced) frankness, so be it.

[x NYTimes]
Ground Zero Is So Over
By Frank Rich
In its not-so-brief and thoroughly unhappy life, ground zero has been a site for many things: tragedy and grief, political campaigns and protests, battling architects and warring cultural institutions, TV commercials and souvenir hustlers. Perhaps it was inevitable we'd end up at pure unadulterated farce.

That's where we are as of this Memorial Day weekend. A 1,776-foot Freedom Tower with no tenants - and no prospect of tenants - has been abruptly sent back to the drawing board after the Marx Brothers-like officials presiding over the chaos acknowledged troubling security concerns about truck bombs. But truck bombs may be the least of the demons scaring away prospective occupants. The simple question that no one could answer the day after 9/11 remains unanswered today: What sane person would want to work in a skyscraper destined to be the most tempting target for aerial assault in the Western world? As if to accentuate this obvious, if frequently suppressed, psychological bottom line, news of the Freedom Tower's latest delay was followed like clockwork by a Cessna's easy penetration of supposedly secure air space near the White House, prompting panicky evacuation scenes out of the 50's horror classic "The Day the Earth Stood Still."

And so ground zero remains a pit, a hole, a void. As The New York Post has noticed, more time has passed since George Pataki first unveiled the "final design" of the Freedom Tower than it took to build the Empire State Building. For New Yorkers this saga is a raucous political narrative whose cast of characters includes a rapacious real-estate developer, a seriously irritating architect with even more irritating designer eyeglasses, a governor with self-delusional presidential ambitions and a mayor obsessed with bringing New York the only target that may rival the Freedom Tower as terrorist bait, the Olympics.

But there is another, national narrative here, too. Bothered as New Yorkers may be by what Charles Schumer has termed the "culture of inertia" surrounding ground zero, that stagnation may accurately reflect most of America's view about the war on terror that began with the slaughter of more than 2,700 at the World Trade Center almost four years ago. Though the vacant site is a poor memorial for those who died there, it's an all too apt symbol for a war on which the country is turning its back.

This is a dramatic change from just a year ago. In the heat of election season, the Bush-Cheney campaign set off a melee by broadcasting ads that featured the shell of the World Trade Center and shrouded remains being borne away by firefighters. Ground zero was hallowed ground, and the outcry against its political exploitation was so fierce that the ensuing Republican National Convention went nowhere near the site that had made New York its cynical choice of venue in the first place. Instead, the prospect of terror and the hot-button-pushing invocations of 9/11 were shoveled into the oratory at Madison Square Garden, where Rudolph Giuliani had a star turn. All the post-election talk of "moral values" notwithstanding, the terrorism card proved the decisive factor in the defeat of John Kerry, a character whose genius for equivocating on just about any issue rendered him a pantywaist against an opponent who had stood with a bullhorn in the smoky wreckage and had promised to round up the bad guys "dead or alive."

But once the election was over, ground zero was tossed aside like a fading mistress. The only time it has figured in national public discourse since was when the president nominated Bernard Kerik director of homeland security. The most damaging of the subsequent allegations against this 9/11 hero - that he had used an apartment for rescue workers overlooking the site as a hot-sheets motel for an extramarital tryst - didn't just end his government career; it effectively downsized ground zero from sacred ground into crude comic fodder for late-night comics. The fallen cultural status of the site in the months since is epitomized by the recent news conference at which Donald Trump thought nothing of showcasing his own stunt plan for ground zero (building replicas of the twin towers, only a story higher) as a promotional tie-in to the season finale of his reality show, "The Apprentice." Though there was some outrage among the 9/11 families, everyone else either giggled or shrugged (and "The Apprentice" was still eviscerated by "CSI").

Such lassitude about the day that was supposed to change everything is visible everywhere. Tom Ridge, now retired as homeland security czar, recently went on "The Daily Show" and joined in the yuks about the color-coded alerts. (He also told USA Today this month that orange alerts were sometimes ordered by the administration - as election year approached, anyway - on flimsy grounds and over his objections.) In February, the Office of Management and Budget found that "only four of the 33 homeland security programs it examined were 'effective,' " according to The Washington Post. The prospect of nuclear terrorism remains minimally addressed; instead we must take heart from Kiefer Sutherland's ability to thwart a nuclear missile hurtling toward Los Angeles in the season finale of "24." The penetration of the capital's most restricted air space by that errant Cessna - though deemed a "red alert" - was considered such a nonurgent event by the Secret Service that it didn't bother to tell the president, bicycling in Maryland, until after the coast was clear.

But what has most separated America from the old exigencies of 9/11 - and therefore from the fate of ground zero - is, at long last, the decoupling of the war on terror from the war on Iraq. The myth fostered by the administration that Saddam Hussein conspired in the 9/11 attacks is finally dead and so, apparently, is the parallel myth that Iraqis were among that day's hijackers. Our initial, post-9/11 war against Al Qaeda - the swift and decisive victory over the Taliban - is now seen as both a discrete event and ancient history (as is the hope of nailing Osama bin Laden dead or alive); Afghanistan itself has fallen off the American radar screen except as a site for burgeoning poppy production and the deaths of detainees in American custody. In its place stands only the war in Iraq, which is increasingly seen as an add-on to the war provoked by 9/11 and whose unpopularity grows by the day.

Take a look at any recent poll you choose - NBC/Wall Street Journal, Harris, CNN/Gallup/USA Today - and you find comparable figures of rising majority disapproval of the war. Or ignore the polls and look at those voting with their feet: the Army has missed its recruiting goals three months in a row, and the Marines every month since January, despite reports of scandalous ethical violations including the forging of high-school diplomas and the hoodwinking of the mentally ill by unscrupulous recruiters. Speaking bitterly about the Army's strenuous effort to cover up his son's death by friendly fire, Pat Tillman's father crystallized the crisis in an interview with The Washington Post last week: "They realized that their recruiting efforts were going to go to hell in a handbasket if the truth about this death got out. They blew up their poster boy."

The cost of the war is rapidly becoming the routine stuff of mainstream popular culture. July 27 will bring the debut of "Over There," a powerful new weekly TV drama by Steven Bochco ("NYPD Blue") and Chris Gerolmo ("Mississippi Burning") that takes no political stand on the war but dramatizes the ripped torsos, broken homefront lives and unknown expiration date of our Iraq adventure in the unsparing detail that has often been absent from network news. The show is being presented not by some liberal cabal but by the rising cable network that "Nip/Tuck" built - FX - a franchise of Rupert Murdoch. On June 21 FX is also bringing back Denis Leary's jaundiced look at post-9/11 firefighters, "Rescue Me." In the first new episode, the hero throws a bag of "twin-tower cookies" back at the vendor selling them, heaving in anger that those who died that fateful morning have been usurped by kitsch.

Tomorrow, Memorial Day itself, will bring another "Nightline" reading of the names of the fallen: the more than 900 Americans who have died in Iraq and Afghanistan since Ted Koppel's previous recitation. When he read 721 names in April 2004, Mr. Koppel was labeled a traitor by the right for daring to call attention to the casualties, and some affiliates even refused to broadcast the show. This time the prospect of a televised roll call of the dead has caused little notice at all. Like the latest setbacks at ground zero, it is a troubling but increasingly distant event to those Americans who, unlike the families and neighbors of the fallen, can and have turned the page.

Frank Rich is an Op-Ed columnist for The New York Times. His weekly 1500-word essay on the intersection of culture and news helped inaugurate the expanded opinion pages that the paper introduced in the Sunday Week in Review section in April 2005.

Copyright 2005 The New York Times Company

Saturday, May 28, 2005

The Price Of Fame

I dislike People and its imitators (both print and TV). Cable news is consumed by endless speculation about celebrities: Jacko, Paris Hilton, the Murderer of the Week, the Scoundrel of the Week, ad nauseam. P. J. O'Rourke is the rarest of creatures: a Republican with a sense of humor. If this is (fair & balanced) wit, so be it.


[x The Weekly Standard]
Here's a Tax We Can All Agree On: Soak the celebrities
by P.J. O'Rourke

The greatest pleasure of running a country (although no politician will admit it) is getting to tax people. We Republicans decry exactions and imposts and espouse minimal outlay by the sovereign power. But we control all three branches of government. This won't last forever. Let's have some fun while we can. Moreover, the federal deficit is--contrary to all Republican principles--huge. Even the most spending-averse among us wouldn't mind additional revenue.

America's media and entertainment industry has a gross (as it were) revenue of $316.8 billion a year. If we subtract the income derived from worthy journalism and the publishing of serious books, that leaves $316.8 billion. Surely this money can be put to a more socially useful purpose than reportage on the going forth and multiplying of Britney Spears.

What is the least damaging way to tax the media and entertainment industry? The first response that comes to mind is "Who cares?" Everybody in this business hates us except Rupert Murdoch, the Wall Street Journal editorial page editors, and Bruce Willis. Private bills in Congress having to do with Bermuda incorporation can take care of that. Still, we don't want to tax profits. After all we're Republicans. And as that great Republican think tank, the Bible, puts it, "For what shall it profit a man, if he shall gain the whole world and lose" . . . the next election. An indirect tax is best, being proportional in its effects and producing "flat tax" outcomes. I propose a tax on raw materials.

The raw material of the media and entertainment industry is fame. Of course we wouldn't want to tax the well-earned and justly deserved fame of a Jonas Salk or a Ronald Reagan. However, now that Reagan and Salk have gone to their reward and America has recovered from its 9/11 adulation of New York City firemen, there are no such famous people extant in the United States. I did a Lexis/Nexis search.

Actually the resource upon which the media and entertainment industry depends is not fame but its toxic run-off, celebrity. America has vast proven reserves. I bought the May 23 issue of a magazine devoted to vulgar public notice. Its contents suggest that Sartre was ever so slightly misquoted on the nature of perdition: Hell is People. What have I ever done to deserve being exposed to Paris Hilton's Chihuahua, Tinkerbell, wearing four designer outfits? This was in a photo spread titled "Dogs Are Children Too!" Also featured was Tori Spelling's pug dressed as Little Orphan Annie and a quote from Oprah Winfrey about her cocker spaniel, Sophie: "I have a daughter." (Named, no doubt, with an eye to using the William Styron novel in a forthcoming Oprah's Book Club segment.)

I suggest, therefore, a Celebrity Tax with a low-end base rate of, mmm, 100 percent. Furthermore, let's make the tax progressive to get some Democrats on board. (Probably not including Hillary, Ted, and Barney Frank. They'll be working nights and weekends to pay up.) Given the modest talent of current celebrities and the immodest example they set for impressionable youth, we'll call it a "Value Subtracted Tax," or, better, a "Family Value Subtracted Tax." And it will be calculated on the celebrity's net worth.

We can quit worrying about the federal deficit, at least for this year. Forbes estimates that Oprah alone has assets of $1 billion. True, we need another $411 billion to close the budget gap. But optimism is kindled by a flip through People. I had no idea there were so many notoriety nuisances. Among the boldface names in the "Insider" gossip column I find Kimberly Stewart, Scarlett Johansson, and Michelle Trachtenberg--all, I'm given to understand, possessed of anonymity manqué. And who on earth is Wilmer Valderrama? Why am I being informed that she (or, as far as I know, he) was dancing with someone called Ryan Seacrest at the Spider Club in West Hollywood? The Spider Club is not, I am guessing, exactly Chasen's or the Brown Derby.

There will be difficulties levying the Celebrity Tax. People and its print and broadcast ilk treat certain better types of human beings as at least nominal celebrities. For example, in my May 23 issue there is an article about a young man who is blind and severely crippled but an accomplished pianist. Another article concerns a 78-year-old nun doing good works in a Tijuana jail (although, rather tabloidishly for a nun, she has been divorced twice and has seven children). We can't tax handicapped piano players and elderly, contrite sisters of mercy. An expanded IRS will be needed to determine who is rightly acclaimed and who is merely egregiously overexposed.

Republicans aren't supposed to grow the bureaucracy. But, being honest with ourselves as Republicans, creating more patronage jobs isn't always a bad thing. The GOP includes large numbers of earnest, morally committed social conservatives, not to say cranks. We need their fundraising and get-out-the-vote skills. Here is a perfect place for them between elections, with civil service benefits and plenty to keep them busy.

A second problem with an excise on infamy is the possible economic effect. The media and entertainment industry is an important factor in America's GDP. Our best economists tell us that increasing the taxes on any enterprise decreases the enterprise's productivity. But in this case--and this case only--I'll argue against Milton Friedman. Everything (by "everything" I mean Reality TV) indicates that the business of being a celebrity does not respond to the usual positive and negative economic stimuli.

People (and by "people" I mean contestants on "American Idol") are willing to invest all that they have in the faint hope they'll receive a fleeting and worthless moment as the center of attention for an audience of bored idiots. (If you doubt me, compel yourself to watch an episode, regrettably available on DVD and video, of "Jackass.") Tax the media and entertainment industry at a million percent and it will continue to produce a surplus of celebrities with Stakhanovite labor heroism.

Of course it's possible that I'm wrong. My proposed Celebrity Tax might create wide-ranging economic dislocations. The media and entertainment industry could be bankrupted. This would result in the demise of Top 40 radio, blockbuster movies, hit television shows, and People. If I am wrong, send the bottles of Veuve Clicquot in care of this magazine.

P.J. O'Rourke is a contributing editor to The Weekly Standard and author, most recently, of Peace Kills (Atlantic Monthly Press).

© Copyright 2005, News Corporation, Weekly Standard, All Rights Reserved.

Friday, May 27, 2005

The Great American Barbecue Pilgrimage Ends In Texas (Naturally)

Slate's deputy editor — David Plotz — just completed a 5-day pilgrimage in pursuit of great barbecue.

Plotz began his odyssey in Kansas City, MO, where he ate the first of his 15 barbecue meals at Oklahoma Joe's and then went on to Arthur Bryant's (immortalized by Calvin Trillin's piece in The New Yorker), Fiorella's Jack Stack, and Gates Bar-B-Q.

From KC, Plotz went to Memphis, TN, and ate barbecue at Charlie Vergos' Rendezvous, Corky's, and Jim Neely's Interstate.

Plotz then went to Texas to cleanse his palate (his words, not mine) and spent his next-to-last day in Lockhart, TX where he ate at both Smitty's Market and the Kreuz (pronounced "critts") Market. Like me, Plotz liked Smitty's the best.

Plotz finished his odyssey in Llano, TX (west of Geezerville by about 65 miles) at Cooper's Old Time Pit Bar-B-Q. Cooper's is unique in that diners enter via the roofed pit area and choose their meat selection(s) before they go into the building. The pitmaster says, "You pick it and I'll stick it." Like the joints in Lockhart, Cooper's meat goes on butcher paper (no plates) and is sold by the pound.

Plotz mentions Rudy's, but neglects to locate that joint. Rudy's is in Leon Springs (west of San Antonio). George Strait supposedly lives in a palace on a hill looking down on Leon Springs. I'll bet George's mouth waters when he gets a whiff of Rudy's pit smoke.

If this is (fair & balanced) gastronomy, so be it.



[x Slate]
The Greatest Barbecue Restaurant in the World (In Two Parts)
By David Plotz

I.

MEMPHIS, Tenn. to LOCKHART, Texas—Sunday is a sabbath day for many barbecue restaurants, so that's when I made my monster 600-mile drive from Memphis to Houston. It took me past Hope, Ark., right around lunchtime, so I pit-stopped in Bill Clinton's hometown to hunt for sustenance. Clinton's birthplace, sandwiched between the railroad tracks and a grim strip of cash-advance and fast-food places, was closed, but just outside of town I found Uncle Henry's Smokehouse open for lunch. Arkansas styles itself very pure about its 'cue, and owner Bobby Redman made me a totally unadorned sandwich: a pile of fresh chopped pork on a bun, with no slaw and no sauce. It was good, if a little dry and shy on smoke for my taste. Fleetwood Mac was singing "Don't stop thinking about tomorrow" on the Uncle Henry's radio, which seemed only fitting. That was Clinton's 1992 theme song and my theme song for Sunday, because Monday was when I would make my hajj to barbecue's most holy city: Lockhart, Texas.

I had persuaded my carnivorous father to join me for the Texas leg. He met me in Houston on Sunday night, and on Monday morning we raced west, first on highways, then on farm roads, toward Lockhart, which is 150 miles from Houston and 30 miles south of Austin. It's in the heart of the Texas Barbecue Belt. Start in Austin and drive 15 or 30 or 70 miles in practically any direction, and you are liable to find yourself at a world-class barbecue shop. (It will probably be advertising "hot guts." Do not be alarmed. This is Texan for sausage.) When I was 19 years old, I drove through this part of Texas with a friend. Knowing nothing about the Barbecue Belt, we stopped at a roadside stand and ordered a few slices of brisket. That meal burned in my memory as the Platonic ideal of barbecue. It is my barbecue Rosebud. It is why I came back.

Texas barbecue is like Texas itself: brash, arrogant, and beefy. In the Barbecue Belt, meat is seasoned with only salt, pepper, and a little cayenne, then smoked quickly over mesquite or post oak. It is cut in huge slabs in front of you and served on butcher paper with a pile of saltines or white bread. The best places serve no sauce. Some don't even have forks. It's pure longhorn showmanship: They are so sure of their meat, they don't think you should eat anything else.

The Texas idealism produces extraordinary barbecue fealty. Barbecue: A Texas Love Story, a charming new documentary, captures the cultlike nature of it, cruising with the University of Texas student barbecue club and worshipping at the New Mount Zion Missionary Baptist Church in Huntsville, whose barbecue side business is so beloved it has earned the nickname "Church of the Holy BBQ." Every few years, Texas Monthly magazine rates the best barbecue restaurants in the state, an announcement that is to Austin almost what the Academy Awards are to Los Angeles.

When the latest Monthly rankings came out, two of its five "best of the best" were in Lockhart. A town of 11,000, Lockhart became Texas' barbecue capital for three reasons. First, Germans and Czechs settled in this part of Texas starting in the mid-19th century, bringing the central European butchering and smoking techniques that made Texas barbecue. Second, Lockhart is where the Schmidt family settled. And third, the Schmidt family can't get along.

In 1948, Edgar Schmidt bought a German meat store in Lockhart from the Kreuz family. Over the next half-century, Schmidt's Kreuz Market became the most beloved barbecue restaurant in the state. In 1999, nine years after Edgar's death, his children squabbled. Son Rick Schmidt was running Kreuz Market, while daughter Nina Schmidt Sells owned the building. Nina wouldn't renew the lease, so Rick took the coals out of the pits and hauled them five blocks down the road to the massive new Kreuz Market—a "barbefeud" that made the newspapers and even got a segment on 48 Hours. Nina and her son kept the old Kreuz and renamed it Smitty's Market—thus turning the greatest barbecue restaurant in the world into the two greatest barbecue restaurants in the world.

My father and I stopped at Smitty's first. Entering feels like walking into an ancient shrine. You cross the threshold from the bright parking lot into a smoky darkness. The air smells indescribably delicious, smoke that you want to eat. As your eyes adjust, you can make out the men in white butcher coats hacking off huge slices of brisket on wooden blocks. Two walls are lined with the pits, long, waist-high brick boxes. Metal grates inside hold briskets, shoulders, sausages. At one end of the pit is an opening, and a fire of post oak logs burns on the floor next to it. It's a simple but effective method. The smoke and heat of the fire are drawn through the opening into the pit.

I tracked down Nina Sells' son, who runs Smitty's. His name is John Fullilove; a more perfectly named pitmaster could not be found. John is 31 years old, and wide, with a red face that is both fierce and incredibly sweet. He was a joy to be with, funny, friendly, hospitable, and passionate about his work. We asked him about the cuts of meat he uses, and John—who's a butcher, too—demonstrated on his own body which parts of the cow we would eat.

He piled up butcher papers with sausage, brisket, and shoulder—about 20 bucks' worth, an enormous amount—and directed us out to the cheery dining room. (On Saturdays, this dining room and the overflow room would be jammed, with lines way out the door.) He grabbed himself a slice of prime rib, an avocado, and some Doritos, and joined us for lunch. There are no forks and no sauce at Smitty's. You hack your meat up with a plastic knife and eat it off the knife or with your hands. (The beans and slaw you can eat with a spoon.) In the old days of Kreuz Market, before plastic cutlery and health inspectors, customers ate with communal knives that were chained to the wall. You can still sit at the old wood benches and see the chains.

Smitty's barbecue was unbelievably good, divinely good. The brisket, black and almost crunchy outside, was moist inside—a perfect mix of fat and salt and meat. The sausage—made with nothing more than beef, pork, salt, pepper, cayenne, and smoke—was incredible, so good that my father and I jury-rigged an improvised ice chest in order to buy a dozen links to bring home. Smitty's meat didn't need sauce or sides or even bread. It was perfect.

I felt honored to be eating there with John, a man who loves his job and does it better than anyone, in a place that bears the burden of tradition so magnificently. I couldn't imagine a better meal.

We headed down the street—past Lockhart's charming downtown, with a gorgeous library and confectionery courthouse—to the new Kreuz Market. We chatted for a minute with Keith Schmidt, who's the general manager and the son of owner Rick. He was doleful and unwelcoming, a stark contrast to his cousin John at Smitty's. We ordered a second lunch. The new Kreuz is cavernous—it can seat several times as many people as Smitty's—and it has a USDA-approved kitchen so it can ship its meats nationwide.

Unlike Smitty's, it's modern and sterile, and I don't mean that as a compliment. The menu is essentially identical to Smitty's, except Kreuz has sauerkraut and potato salad and costs a little more. The food was wonderful—fantastic brisket and ribs, a great sausage. Technically, it was probably just as good as the meal I had eaten 15 minutes earlier up the street, but the atmosphere—antiseptic and unfriendly—suppressed my enthusiasm. I would much rather have eaten twice at Smitty's.

II.

AUSTIN and LLANO, Texas—Here's the amazing thing about Texas barbecue. Even a run-of-the-mill place around here is better than the best barbecue anywhere else. On Monday night in Austin, my father and I ate our third barbecue meal of the day at the Iron Works, a downtown joint with a modest reputation. It was great!

There was a bit too much forced funkiness in Austin for my taste. We spent the night in the funky Austin Motel ("So Close Yet So Far Out," read the sign), ate dessert at the funky ice cream shop across the street, read the paper the next morning in the funky coffee shop next door (but we didn't get a funky tattoo at the funky tattoo parlor). After a tour of the fantastic Museum of Texas—where there was a lot of talk of longhorns, but none of barbecue—we headed west through the Hill Country to hunt for lunch. The barbecue in the Hill Country west of Austin is slightly different than in towns east of Austin such as Lockhart. Some Texans claim that West Texas—and thus the whole American West—starts in the Hill Country. The barbecue west of Austin has a slightly more cowboy feel. (It's cooked over mesquite rather than post oak, for example.)

In the Hill Country, the bluebonnets and other wildflowers were in bloom, and the sun finally decided to come out. It was a perfect day for driving. We cruised 100 miles through ranches and scrub land to the small town of Llano. We pulled up at Cooper's Old Time Pit Bar-B-Q, a place recommended by several of my barbecue rabbis. I realized, as we stepped out of the car, that Cooper's was where I first tasted Texas barbecue on my road trip 16 years ago. I was so glad to return.

At Cooper's, you step right up to the outdoor pit and point at the meat you want. The pitmaster grabs it; slices off as much as you ask for; slaps it on a tray; pours a tiny bit of thin, vinegary sauce on it; and hands it to you. Then you take it inside and hand it to a cashier, who weighs it and dumps it on butcher paper—your plate. We ordered ribs, brisket, and two kinds of sausage, then returned for seconds of brisket and prime rib. Customers sit family-style inside, helping themselves from the buckets of jalapeños and loaves of Butterkrust bread on the tables. The place is less charming than Smitty's—the walls are cinderblock and ceilings are low—but it's friendly. The barbecue was superb. The brisket was stellar, and the ribs may be the best I've tasted. (It's criminal that Memphis is recognized as the city of great ribs, because every rib I ate in Texas was vastly superior.) We also ate a mesmerizingly delicious blackberry cobbler. It was the first dessert I ate at a barbecue restaurant on the whole trip, and it made me wonder what I missed elsewhere.

On our way out, we discovered that my father and President Bush, who don't agree about very much, agree about Cooper's ribs. A testimonial letter from Bush to the ribs hangs on the wall. He ate here when he was governor, and during the vote-counting after the 2000 election, Cooper's catered a picnic at Bush's Crawford ranch.

We made our way back to Austin, sated. I had driven 1,800 miles in seven days, eaten 15 barbecue meals in a row, and finally found bliss in Texas. The four Texas barbecue meals I ate in 24 hours were better than any other barbecue I ever had in my life (save my one meal at Cooper's in 1989). I had found my barbecue bliss, and I was done. My lower intestine had ground to a complete stop, and I had a slight pain in my chest. It was time to go home.

At the Austin airport, I was singled out for a special security screening. The TSA agent fingering through my bag pulled out a jar of barbecue sauce I had bought at Gates in Kansas City. "What's this?" she asked.

"It's barbecue sauce," I said.

"I know it's barbecue sauce. I mean, what kind of sauce is it? I've never seen this kind before."

"It's from Kansas City."

She grimaced at this. Holding the jar like it contained C-4 explosive, she showed it to another screener. "Look, this guy has some kind of barbecue sauce from New York City or something," she told the other screener derisively.

"Kansas City," I weakly interrupted.

She waved me off, then said in an ominous voice, "Now, why would you have that?"

"I was on a barbecue tour," I answered. "I started in Kansas City, and finished here."

"Did you go to Rudy's?" she asked.

I shook my head.

"You came to Texas for barbecue, and you didn't go to Rudy's?" She turned to her partner. "He came to Texas, and he didn't go to Rudy's!" The partner shook his head.

"What about the Salt Lick?" she asked. I shook my head no again. She made a face.

The partner continued the interrogation. "How about the County Line?"

I shook my head.

"Well, where did you go?" the screener asked in an exasperated voice.

"I went to Cooper's in Llano. And I went to Smitty's and Kreuz Market in Lockhart."

She lit up. "Well, why didn't you say that to begin with?" She nudged her partner. "He went to Lockhart." The partner nodded. The agent turned back to me, and handed me the bag and the sauce. "You can go ahead now."

David Plotz is Slate's deputy editor. He is the author of The Genius Factory: The Curious History of the Nobel Prize Sperm Bank. You can e-mail him at plotzd@slate.com.

Copyright © 2005 Washingtonpost.Newsweek Interactive Co. LLC

Monday, May 23, 2005

Wild Bill's 10 (actually 26) Commandments

My recollection about my course in southern history as an undergraduate brought me back to William B. (Wild Bill) Hesseltine, the author of the core text in the course. Later, I encountered people who had been at the University of Wisconsin during the Golden Age of its history department. Among the giants who strode the shores of Lake Mendota was a great figure in southern history: William B. Hesseltine. Little did I know that Hesseltine promulgated a decalogue for Wisconsin history grad students. Woe to the violator of one of these commandments. If this is (fair & balanced) pedagogy, so be it.

THE HISTORIAN'S TEN COMMANDMENTS
(Revised and Enlarged)
Original attributed to
WILLIAM B. HESSELTINE (1902-1963)
Department of History
THE UNIVERSITY OF WISCONSIN


  1. Thou shalt not use the passive voice.


  2. Thou shalt not use the present tense.


  3. Thou shalt not quote from secondary sources.


  4. Thou shalt not quote more than three lines--and never shalt thou use "block" quotations.


  5. Thou shalt not quote anything thou canst say better thyself.


  6. Thou shalt not quote to carry thy story.


  7. Thou shalt, in short, quote only to "season" thy story.


  8. Thou shalt not quote to establish a fact except when the word itself is the fact.


  9. Thou shalt not pass judgments on mankind in general nor shalt thou pardon anyone for anything.


  10. Thou shalt cite material to the standard source.


  11. Thou shalt combine footnotes whenever possible.


  12. Thou shalt not write the history of a wheat field regardless of how "naturally" it develops.


  13. Thou shalt not designate persons by their last names only.


  14. Thou shalt clearly identify any personality mentioned in the text--be it Jesus Christ or Abraham Lincoln.


  15. Thou shalt not mention secondary sources or writers in the main body of thy text.


  16. Thou shalt fight all thy battles in thy footnotes.


  17. Thou shalt write about thy subject and not about the documents concerning thy subject.


  18. Thou shalt not discuss thy methodology.


  19. Thou shalt strike thy reader hard with thy first sentence.


  20. Thou shalt not use slang--nor split thy infinitives.


  21. Thou shalt place thy time clauses first.


  22. Thou shalt not use the personal pronoun either explicitly or implicitly.


  23. Thou shalt not use the rhetorical question to avoid an intelligent transition.
  24. .

  25. Thou shalt set down things as they happen; thou shalt have no references later in time than the subject thou'rt dealing with.


  26. Thou shalt be neither a "no-er" nor a "not-er"--i.e., thou shalt avoid negations whenever possible.


  27. Thou shalt never use THIS for THE, nor THE for A.



(reserved for future revelations from the prophet)

That's What I Like About The South?

One of my favorite courses as an undergraduate in the early 60s was "History of the South." Professor Theodore Crane introduced me to several things: William B. Hesseltine, W. J. Cash, and green book bags. At the time that I was taking this course, I was asked — "Why are you taking a southern history course?" — and I answered that the South was going to be where the action was in this country before the end of the century. Before the end of the twentieth century, Atlanta's Hartsfield International Airport surpassed O'Hare, LAX, and Kennedy as this country's busiest airport. Atlanta, New Orleans, and Houston moved to the forefront as business centers by 2000. I have lived in Texas for more than half my life. I can't imagine living anywhere else. If this is (fair & balanced) prescience, so be it.

[x Washington Post]
Rethinking the Confederacy: New takes on the losing side of the nation's bloodiest war
By Joseph Crespino

Capture the Flag

Abraham Lincoln's second inaugural address remains such an important document in American history because it was, in essence, the first history of America's Civil War, delivered a month before Gen. Robert E. Lee surrendered at Appomattox. Americans all knew, Lincoln pronounced, that slavery was "somehow the cause of the war." Exactly how "the peculiar institution" sparked the bloodiest conflict on U.S. soil has been argued over ever since. In the years following Reconstruction, the desire for reconciliation among white Northerners and white Southerners overshadowed notions that the war had been about ending slavery and giving political rights to African Americans. By the turn of the century, white Southerners enacted (and white Northerners condoned) a system of legalized inequality and political exclusion that persisted until the 1960s. Conflicts that erupted during the nation's Second Reconstruction -- the civil rights era -- revived the fight over the meaning of the Civil War. And even today, some 40 years past America's Second Reconstruction, fights over the symbols of the war continue to divide Americans, white and black.

No symbol in the past few decades has been more divisive than the Confederate battle flag. In his important new book, The Confederate Battle Flag: America's Most Embattled Emblem (Belknap/Harvard, $29.95), John M. Coski shows how it got that way. The battle flag, though not the official banner of the Confederacy, emerged over the course of the war as the sentimental favorite among Confederate soldiers and civilians alike. Coski takes the story forward from there, but his most important contribution is his recounting of the tumultuous story of the flag in the second half of the 20th century, when the civil rights movement emerged, setting loose a variety of groups that made competing claims over the meaning of the flag -- and the meaning of the war.

Since the 1960s, Americans have fought over the public display of the Confederate flag. For its opponents, the flag is a symbol of both the 19th-century society that enslaved African Americans and the 20th-century reactionaries who opposed the civil rights movement. The flag's defenders have made any number of arguments -- that the flag represented not slavery but the defense of states' rights, that a banner that represented the South's noble heritage was corrupted in the 1950s by opportunistic racists, that a minority of the population should not be able to dictate the meaning of a historical symbol. Coski's book will speak to the flag's opponents as well as its defenders, but his most inspired message is aimed at those cheerleaders who insist that the flag has one, unchanging, fundamentally benign meaning. He shows that the history of the flag is simply too complicated for anybody to reach such simplistic conclusions.

The Confederate Battle Flag includes a mountain of research. Coski, historian of the Museum of the Confederacy, seems to have culled every single public reference to the Confederate flag in the past 55 years, which is the book's strength and its weakness. The depth and breadth of his research give his book real authority, and future disputants on both sides will have to reckon with his clear, reliable conclusions. But the final two sections of the book are repetitive, particularly as the author recounts the general history of the flag's use since World War II only to then go into more detail on individual flag controversies.

The Goat of the South

Jefferson Davis's reputation has been almost as controversial as the Confederate battle flag's. As Donald E. Collins points out in The Death and Resurrection of Jefferson Davis (Rowman & Littlefield, $22.95), at war's end the former Confederate president was persona non grata even in the South, the scapegoat for Confederates who felt that his heavy-handed wartime leadership made him the Southern leader most responsible for their defeat. But in subsequent decades, Davis's stock rose, and he became one of the most vigorous and beloved defenders of the lost cause. So when exactly did this resurrection happen?

According to Collins, a history professor at East Carolina University, Davis's tours through the South in 1886 and 1887 -- as well as the public sympathy surrounding his death and burial in 1889 -- were the key moments. Collins dutifully recreates these events from newspaper records, but he breezes past the first and most important moment in Davis's resurrection: his two-year imprisonment at the end of the war. Stories of a manacled Davis being starved by Union soldiers stoked the fires of Confederate sympathy -- so much so that U.S. officials decided to release him lest they turn him into a martyr. By that point, Davis may have been a chump to many white Southerners, but he was their chump, and they chafed at news of his mistreatment at Yankee hands. The resuscitation of Davis's career, then, both reflected and contributed to the revival of Southern nationalism in the late 1860s -- a process intimately connected with the overthrow of Reconstruction, the reestablishment of white Southern rule and the Jim Crow era.

What If?

Perhaps more than any other Confederate, Jefferson Davis reaffirmed white Southerners' sense that they had been involved in a noble struggle. Davis helped spawn a generation of white Southern men who sat outside the county courthouse, a stone's throw from the Confederate monument, and argued over how different things might have been if only Maj. Gen. James Longstreet had broken through on Little Round Top, or if Lee had never ordered Maj. Gen. George Pickett to launch his hopeless charge at Union lines.

No historian has thought through such "what if" questions as seriously as Roger L. Ransom in The Confederate States of America: What Might Have Been (Norton, $25.95). The book begins with Ransom's "Recipe for Counterfactual History Pudding," mixing two parts historical plausibility with one part common sense and another part imagination. The culinary metaphor is apt; Americans have displayed an almost insatiable appetite for this sort of speculative fare, from the series of historical novels by Harry Turtledove to Hank Williams Jr.'s pro-Southern anthem, "If the South Woulda Won (We Woulda Had It Made)."

While Ransom, a historian at the University of California, Riverside, stops well short of Williams's conclusion, most historians have a word for counterfactual history: fiction. But there's a method to Ransom's madness. The best way to understand how profoundly the Civil War altered American and world history, he argues, is to try to imagine what would have happened if the war had gone the other way. We know that the Union's victory unleashed the forces that ultimately transformed the United States into the political and economic behemoth of the 20th-century world. But what if the South had somehow fought its way to a stalemate?

Ransom takes the reader through the individual battles that swung the Confederates' way and led to an imaginary, counterfactual truce in November 1864. As he dips back and forth between his counterfactual narrative and historical analysis, Ransom sheds light on a number of surprising places. He asserts, for example, that had the Confederacy emerged as an independent nation, the history of the American corporation might have looked very different; a defeated United States would never have enacted a 14th Amendment, which was passed to defend the rights of Southern freedmen but unintentionally became the pathway through which corporate America received a range of legal protections under federal law.

Unfortunately, some of Ransom's other analyses fall flat. He speculates that the South may have lost because it failed "to find commanders comparable to those who eventually emerged in the North." But earlier Ransom recounts the conventional wisdom that the performance of Lee's army early in the war was nothing short of miraculous. Also, the shift between Ransom's analysis and his fictional Civil War can be jarring -- as though, in the short space of a few lines, he hopped from the subdued aura of a university seminar room to the frontlines of a Civil War reenactment.

Ultimately, however, Ransom brings the reader back around to a safe, if familiar, conclusion: Lincoln was right when he said that the fight to preserve the Union was a struggle "not altogether for today." Above all, it was a fight for "a vast future" to be enjoyed by "the whole family of man." That's a lesson well worth remembering today.

Joseph Crespino teaches American history at Emory University and is the author of The Last Days of Jim Crow, forthcoming next year.

© 2005 The Washington Post Company

Thursday, May 19, 2005

Three Films And You're Out!

For some reason, I never saw "East of Eden." I don't know why, either. I did see "Rebel Without A Cause" and "Giant." Strange: James Dean and Paul Newman were contemporaries, yet Dean was no Paul Newman in any of his three films. Paul Newman became a national treasure. In addition, Newman is able to drive at high speed and not toast himself. Newman's salad dressing is better than anything Dean might have concocted. If this is (fair & balanced) semi-hagiography, so be it.

[x Slate]
James Dean: What would he have grown up to be?
By John Swansburg

Fifty years ago, James Dean died in a violent car accident on his way to Salinas, CA. His mangled Porsche was dispatched on a tour of the country soon after to scare kids into driving safely, only to be picked clean by fans lusting after a tangible piece of the Dean legend. None of this seems to have inspired a wave of judicious driving, but Dean's highly publicized death did make a lot of people want a Porsche. In a nod to that fateful plug, Porsche recently issued a special edition roadster designed to pay homage to Dean's 550 Spyder, unveiling it at the site of his grisly crash.

A morbid stunt, perhaps, but morbidity has always been at the heart of the James Dean cult. Fans are keener to celebrate the anniversary of his death than his birth, and for the 50th they'll get to choose from the Porsche, a new book of photography, a new documentary, and his collected works on DVD. In June, 150,000 devotees are expected to make a pilgrimage to Marion, Ind., near Dean's rural birthplace, for the James Dean Fest. Tour buses will go to Dean's mortuary, the church that held his funeral, and to his grave, billed as "a pink granite headstone often covered in red lipstick from his fans wanting to leave something behind."

The unnatural deaths of other popular icons have inspired bizarre conspiracy theories—and the occasional Elton John ballad—but Dean's followers take their obsession in another direction. They consider Dean's death part of his allure: By dying young, he preserved himself in amber, a Peter Pan in jeans and a red windbreaker. "Dean died before he could fail, before he lost his hair or his boyish figure, before he grew up," Donald Spoto writes in his admiring biography. Fair enough. But what lurks behind the celebration of the star's unsullied youth is a fear that, had he lived, Dean couldn't have topped what he'd accomplished by the age of 24 and might even have tarnished those feats. Those of us not making our way to Marion in June, however, may see promise rather than perfection in Dean's short career—and wonder where a full career might have taken him.

The pessimism about the future that Dean didn't have is pervasive and is shared by film critics along with the fans. David Thomson, whose entry on Dean in his indispensable New Biographical Dictionary of Film is one of the book's most passionate, attributes Dean's singularity to his eternal youth. Like others, he evokes the specter of Marlon Brando as the alternative. Brando "went from beauty to wreck," Thomson told the Orlando Sentinel recently. "Dean stays the same." Even Brando's biographer seems to pine after Dean's fate. "There is much to be said for dying young in circumstances melodramatically appropriate to your public image," writes Time's Richard Schickel. "There is very little to be said for living long and burying that image in silence, suet and apparent cynicism."

It's true that Dean never had the chance to get fat and make Brando's mistakes (paging Dr. Moreau!), but he also never had the chance to achieve Brando's later successes ("Last Tango in Paris," "The Godfather"). Or to improve upon his own. It's not that Dean made only three films—Joyce wrote only three novels. It's that when you sit down and watch his pictures, you notice that they betray Dean's limitations and show the promise, more than the realization, of greatness. Before his death, Dean had proved only that he could play an angsty, estranged teenager—himself, in other words. His two most famous roles, Cal Trask in "East of Eden" and Jim Stark in "Rebel Without a Cause," are variations on that theme. Dean played them with an unabashed inwardness and awkwardness that was new in the mid-1950s. His intensity riveted the era's moviegoers. But there's a "you had to be there" aspect to Dean appreciations that leaves some of us who weren't cold.

If Dean deserves credit, nonetheless, for inventing the troubled teenager, that accomplishment was as much a triumph of casting as acting. Elia Kazan, Eden's director, gave Dean the part of Cal because he knew Dean had lived it. Dean's mother, who nurtured his creative energies, died of cancer when her son was 8, after the family had moved to Los Angeles. His father promptly shipped James back to the Indiana home of the boy's aunt and uncle—on the same train as his mother's body. Dean eventually returned to L.A. but never reconciled with his father. Kazan saw the two together before taking the actor to his screen test for Eden, sensed a deep tension between them, and knew that if he could sneak the unrefined Dean past the studio execs, he had his Cal. "There was no point in trying to cast it better or nicer," the director told Dean biographer David Dalton. "Jimmy was it. He had a grudge against all fathers."

In "East," Kazan stoked that grudge by placing Dean opposite a father (played by the upright Raymond Massey) who was more concerned with perfecting a method of freezing lettuce than figuring out his troubled son. Dean turned in a performance that launched his star. Then he basically reprised the role in "Rebel." Massey was replaced by Jim Backus, the doddering voice of Mr. Magoo, and portrayed a dad who is less distant than plain weak. Backus parades around the house in an apron, lives in fear of his overbearing wife, and drives his son crazy by never standing up for himself. Near the end of the film, Dean's Jim Stark just can't take it any more. He throws his father to the floor and starts strangling the life out of him.

There's nothing wrong, of course, with channeling personal experience to play a part—without that where would Eminem, for example, have been in "8 Mile"? But when Dean didn't have his biography to fall back on, he fell flat. The last film he appeared in, "Giant," George Stevens' epic about Texas, exposed the limits of the actor's talent. He plays a wildcatter-turned-oil baron named Jett Rink, and there's no father figure out in the oil fields for as far as the eye can see. Dean mumbles his way through the part, getting less and less believable and intelligible as the movie progresses and Rink's character ages. By the end, Dean looks like a kid playing grown up. Or as Kazan put it: "He looked like what he was: a beginner."

What would Dean have been like as a seasoned veteran? It's possible that he would have deteriorated physically and professionally, as Brando did. But it's also possible he would have honed his craft and become something greater than an actor who could play a version of himself. Early poster-boy success need not be a harbinger for subsequent failure. Think of another Dean contemporary, Paul Newman.

Before his car crash, Dean talked of two projects he dreamed of doing. One, a screen adaptation of his favorite book, Antoine de Saint-Exupéry's The Little Prince, probably wouldn't have stretched him—it would have been another film about children whose parents don't understand them. The other project Dean wanted to do was also a story about a mixed-up son who doesn't know how to please his father. But to have been a convincing Hamlet, Dean would have had to draw on more than his travails with his own father—there's a difference between teenage angst (who am I?) and existential angst (to be, or not to be?). Hamlet might have been the perfect stepping stone toward something bigger for Dean. His signature torment could have changed the role (Hamlet to Ghost: "You're tearing me apart!"), but the role might have taught him something as well. Something, perhaps, about growing up. His fans should wish he'd lived to play it.

John Swansburg is a senior editor at Legal Affairs magazine.

Copyright © 2005 Slate

Want To Know Why Obesity Is A Major Public Health Problem?



Don't blame Twinkies for our supersized population. At 150 calories and 5 grams of fat apiece, they're less hazardous than a lot of other desserts. If this is (fair & balanced) gourmandise, so be it.

[x The Christian Science Monitor]
Twinkies at 75: munch 'em, fry 'em, save 'em for years
By Judy Mandell

The Twinkie just turned 75. Considering that 500 million of them are sold yearly, it seems obvious that Americans are crazy for these sweet, spongy, cream-filled snacks. The question is - why?

"The reason behind my loving Twinkies is obvious - they taste so darn good," says Debbie Rizzo, a publicist in San Francisco..

"Twinkies are simply my favorite food group," says Denise Dorman, a Twinkie connoisseur in Florida. "I craved Twinkies during my recent pregnancy, and we're having my newborn son's christening cake made of Twinkies."

OK, so some people think Twinkies taste great. But why have the squeezable yellow cakes endured as an American cultural icon?

"Great brands live on because of the emotional response they evoke as part of our [long-term] memory," says Tom Collinger, associate professor of integrated marketing communications at Northwestern University in Chicago.

Professor Collinger once thought of a Twinkie as the perfect food: "You could hold it in one hand. You didn't get crumbs on your fingers or your mouth. There were options to get at the filling inside - biting, licking, and slurping or sucking."

Phil Delaplane, a 50-something professor of American cuisine at the Culinary Institute of America (CIA) in Hyde Park, N.Y., grew up eating Twinkies.

"Loving Twinkies is a nostalgia thing," says Mr. Delaplane, who recently made his wedding cake with Twinkies. "Where I work, there are pastry chefs all around. My colleagues told me a substantial wedding cake for 150 guests would cost $1,200."

So he decided to go with a Twinkie cake instead. Saving money wasn't really the motivation, he insists. "We wanted something fun that would bring back our childhood - not just for ourselves but for our guests as well."

The choice of cake has also had a continuing effect on Delaplane's life. "On our monthly 'anniversary,' I stop off at the convenience store and pick up a piece of 'wedding cake' to bring home to my wife, Pam. With Twinkies, you don't have to freeze the wedding cake" to enjoy it together later.

These days, you might think Twinkies would be a big no-no among nutritionists. But some don't condemn them at all.

"It's portable, individually wrapped, and has a lot of flavor and satisfaction if you are looking for a portion-controlled treat," says Madelyn Fernstrom, associate professor and director of the University of Pittsburgh Medical Center's Weight Management Center.

With 150 calories per Twinkie, Dr. Fernstrom says, it's a great choice for those seeking a "real dessert" without a lot of extra fat and calories - a Twinkie contains only 5 grams of fat, although it is high in sugar. She's not arguing that nibbling on a Twinkie is better than eating an apple. But if the choice is a piece of cheesecake or pie versus a Twinkie, she recommends the latter.

Chef Delaplane admits to watching what he eats, but he isn't concerned about Twinkies: "There are too many other things for me to worry about in this world - not necessarily what's in my Twinkie."

And what's in a Twinkie that causes its phenomenally long shelf life (rumored to range from years to decades, although officially it's 25 days)? Fernstrom attributes it simply to the absence of dairy products.

Some adults who loved Twinkies as kids and try them again as adults wonder what in the world ever attracted them to the snacks. Have their tastes changed, they wonder, or are Twinkies different now?

The ingredients for the Twinkie are the same today as they were when introduced, except for the filling, says a spokesman for Interstate Bakeries Corp. The original Twinkie contained banana creme filling. When bananas were in short supply during World War II, the company changed to vanilla creme filling.

Twinkies trivia

• Twinkies have been featured in major movies, including "Ghostbusters," "Grease," and "Sleepless in Seattle."

• In the TV series "All In The Family," Edith put a Twinkie in Archie's lunchbox each day.

• In 1999, President Clinton and the White House Millennium Council selected the Twinkie to be included in the nation's Millennium Time Capsule, representing "an object of enduring American symbolism."

• Chicago consumes more Twinkies per capita than any other city in the US.

• It takes 10 minutes to bake a Twinkie.

• Interstate Bakeries Corp. bakeries can produce 1,000 Twinkies in a minute.

• When Twinkies were first introduced, the price was two for a nickel. In 1951, a package of two cost 10 cents; in 1966, 12 cents. Today, the price ranges from two for 99 cents to two for $1.29.

Copyright © 2005 The Christian Science Monitor



Wednesday, May 18, 2005

Shirts Or Skins?

I played pickup basketball at the best place in Denver in the late 1950s: the 20th Street Recreation Center. Located at the east edge of downtown Denver within the attendance area of arguably the best basketball school in Denver — Manual (Manual Arts & Trades) High School — the second-floor gym at 20th Street saw some of the best hoopsters in Denver in those years. Most of the games were full-court affairs, but the teams were divided into shirts and skins. As I recall, we shot free throws for teams. The first five to make a free throw were shirts and the next five were skins (shirtless). The remainder went to the sideline to await the winner of the game, which went to the first team to 20 points (10 goals). And so it went, weekend afternoons and weekday evenings, in the spring and summer. Flash forward: the best two books on the pickup basketball culture both treat New York City playground basketball. The better of the two, Heaven Is A Playground by Rick Telander, recounts the author's summer in the toughest part of Brooklyn: Bedford-Stuyvesant. The runner-up is The City Game by Pete Axthelm. Axthelm focuses on the Rucker Tournament as the premier pickup basketball event in New York City. Holcombe Rucker was a playground supervisor who died prematurely, and the tournament bearing his name has become a playground institution. The level of play in Austin playgrounds or Amarillo playgrounds, while spirited, does not approach the basketball played in New York City. If this is (fair & balanced) dunking, so be it.

[x Austin Fishwrap]
Westenfield Park: Court of kings gets a face lift
By John Maher

Word was out not long after the paint dried and the nets were hung back up this month. Westenfield Park, the court that for decades has been known as the site of probably the best pickup basketball in town, was open for action and ready to add to its reputation.

"I'm on my way down there now," said Bick Brown, owner of Hyde Park Bar and Grill, who has played there for 25 years. "It's become a big part of people's lives. It's a relief that it's done."

The single court, known simply as Enfield to hoop junkies and located just off MoPac Boulevard (Loop 1) in West Austin, has even received some national attention in books such as Chris Ballard's Hoops Nation: A Guide to America's Best Pickup Basketball.

In 2004, the Austin Chronicle named the slab Austin's Best Basketball Court, but by then Enfield had been living off its reputation for years. The court, which had been slick from grit for at least two decades, had gradually fallen into disrepair. The paint job was mostly a memory, and chunks of the surface were missing, making dribbling a misadventure. Austin Parks and Recreation Department employees say the court had not been resurfaced in at least 15 years.

When workers did start repairs, they received a surprise: a water leak under the concrete. The water line to the park's swimming pool had ruptured, and water was coming up under the slab.

"You had a wet, slick slab," said Stuart Strong, assistant director for Austin's Parks and Recreation Department. "It was buckling the coating."

Repairing the leak was not a quick fix. Eventually the pipe had to be routed around the court, which took a couple of weeks. Strong estimated that rerouting the pipe cost $5,000 and that the resurfacing was another $3,500.

But now the court looks better than it has in a long time.

For decades the procedure at the court has been pretty much the same. A group that sometimes includes former college players will usually gather at 3:30 or 4 in the afternoon, with games shifting to evening and early morning in the heat of the summer.

The court draws a wide mix of players with regard to age, race and occupation.

"You're judged on if you can guard your man and if you can score," Brown said. "Nobody really cares what else you've got going on in your life."

He gave the new paint job a big thumbs up. "I was concerned they were going to move the three-point line back and that would put me out of the game," a relieved Brown said.

Copyright © 2005 Cox Texas Newspapers, L.P. All rights reserved.

Tuesday, May 17, 2005

How Old Time Is "Old Time Religion"?

I got two things from reading Perry Miller on the settlers of New England. First, Miller — long deceased — never allowed his atheist bias to intrude on his dense, magisterial works in religious history. Second, my students never really got it that the New England settlers weren't "Puritans." While the Reformed folk commonly called "Puritans" wanted to purify the Church of England of the taint of the Whore of Babylon (the Roman Catholic Church), these people were devout believers in the Reformed faith of John Calvin and Ulrich Zwingli and wanted that version of Protestantism for godly Englishmen like themselves. A splinter group (a universal phenomenon among religious sects) wanted to separate from the Anglican church because they believed that reform was hopeless for those in England who were tainted with Romanism. The Separatists (aka Pilgrims) ultimately came back to the fold when their colony in Plymouth was subsumed by the larger Massachusetts Bay Colony. The Reformed folk of New England wanted no truck with the Society of Friends (aka Quakers) or Jews or atheists. Woe to the Friend who wandered into Massachusetts or Connecticut. Friends were hanged routinely. Today, supposedly born-again types like Tom (The Hammer — of God?) DeLay routinely threaten a lynching for apostate judges. Alan Wolfe gets more right than wrong in this dual review of books that confront today's duel in the pews. If this is (fair & balanced) heresy, so be it.

[x The New Republic Online]
What God Owes Jefferson
Reviewed by Alan Wolfe

God's Politics: Why the Right Gets It Wrong and the Left Doesn't Get It. Jim Wallis (Harper, 384 pp., $24.95)

Taking Faith Seriously. Edited by Mary Jo Bane, Brent Coffin, and Richard Higgins (Harvard University Press, 381 pp., $29.95)

I.

The phenomenon of martyrdom demonstrates that political success and personal salvation do not generally go together. The faithful find grace not in building winning coalitions, but in worshipping God's glory. Gazing toward heaven means stumbling on earth, a small price to pay for the rewards that await.

For a deeply religious society, the United States has had little or no culture of martyrdom. There are those--bless them--who share the suffering of the downtrodden. But America's voluble religious rightists, those who are most visible in their insistence that they embody the true faith, are too busy celebrating their victories to have much time for defeat. They have rendered unto Caesar what is Caesar's: themselves, as it happens, and all the political power that comes with them. They dwell not in the house of the Lord, but in the House of Representatives. Their prayer breakfasts are strategy sessions, their churches are auxiliaries of political parties, their pastors are political bosses. Their God must be great: look at the clout of his constituency.

Once confined to the margins of American politics, the religious right seems to be everywhere these days, rallying to the cause of Terri Schiavo or lobbying intently for conservative judges. No wonder that activists on the left of the political spectrum find themselves filled with wonder. Surely, they believe, it ought to be possible to remind Americans that Jesus was a man of compassion who turned swords into plowshares. On theological grounds alone, the left's case to rally God to its side ought to be stronger than the right's. "It's time to take back faith in the public square," writes Jim Wallis, America's leading evangelist for progressive causes. In the presidential campaign last year, Howard Dean asserted that he belonged to the Democratic wing of the Democratic Party. Jim Wallis insists that he belongs to the Christian wing of Christianity.

Wallis is not the only prominent believer with progressive instincts to challenge the religious right's influence in American politics. Taking Faith Seriously offers a collection of policy papers written by scholars associated with Harvard's Hauser Center for Nonprofit Organizations. Arguing that religious organizations are a crucial aspect of America's network of ever-spawning voluntary associations, these writers aim to use the insights of social science to better understand the role that religion plays in American public life. Although Jim Wallis worked with them from time to time, their book is not written in his prophetic yet strangely inside-the-Beltway tone. Still, even if they write with greater analytical precision, there is no masking their larger political point: liberal democracy has a place for religious believers, in no small measure because religious believers constitute a pluralistic, diverse, and basically reasonable group of Americans. As sensible as that claim may be to most people, it would be considered hostile and inflammatory to the new breed of Christian Republicans.

Here, then, are two books that together make the best case yet for the notion that the left can no longer allow the right to claim a monopoly on religion's involvement in politics. Yet in making the case so well, they also expose some of its flaws. Of course the left should not let the right monopolize religion. The politicization of the religious right has done great damage to both religion and politics. But everything depends on how religion and politics meet. There are good ways and bad ways to bring them together, and for all the sensible suggestions in these books, neither avoids the bad ways completely. If liberals are to return to power, it must not happen by appropriating some of the rotten ideas of conservatives.

II.

In keeping with their literalist disposition, with their theological conviction that the Bible is literally true, right-wing Christians believe that the more conservative the society, the better matters will be for religious conservatives. This is the premise of their politics; but the premise is false. It is, in fact, a spectacular mistake. For conservative religion has always flourished best in liberal societies, and the more liberal, the better.

Of all liberal society's great innovations, none has been more important to the rise of conservative religion than the commitment to free exercise embodied in one of the First Amendment's two clauses. Today's religious conservatives live off the accomplishments of previous generations of religious radicals, whose willingness to challenge received doctrine, to confront established authority, to dispense with encrusted tradition, to develop their own vernacular, and to insist on the dignity of the individual believer pierced the heart of everything conservative around them. Had not a love of liberty accompanied the rise of evangelical religion in seventeenth- and eighteenth-century Europe and North America, there would never have occurred the awakenings that inspire today's religious activists. Free exercise requires a free society. This should have been obvious. It is, indeed, a literal reading of the constitutional language. About the relation of free exercise to a free society, ask only the Jehovah's Witnesses. No other organization has done more to expand the quintessentially liberal idea of allowing dissenting voices to be heard than this decidedly conservative faith.

The First Amendment's other clause--the one separating church and state--is another liberal idea without which conservative religion could not exist. Written just two decades after the publication of The Wealth of Nations, the First Amendment essentially created a free market in the salvation of souls. Left without the financial guarantees offered by state monopolies--especially, in the early American Protestant imagination, the decadent and reactionary Catholic Church--congregations would live or die by their own efforts in recruiting members, tithing their purses, and kindling their enthusiasm.

By the time Tocqueville arrived in the 1830s, the pattern had been set. The itinerant preacher, the bewildering variety of new denominations, the determination to evangelize: all took advantage of America's remarkable experiment in church-state separationism to organize on behalf of God. European religion--not only in its Catholic form, but also in its various Calvinist and Episcopal manifestations, burdened by the privileges secured by established churches--withered and, in the opinion of many, died. But American religion, banned from the state, infused the culture. The more it was kept out of politics, the deeper would be its reach into every other area of life.

If the close links between established religions and European monarchies spurred the growth of religious liberty in the United States, their hierarchical organization and their affinity with privilege convinced many an American believer of the necessity for equality. There would be a clergy, of course; but it need not be a learned one speaking a language foreign to its flock, nor need it wear the finery associated with decadent aristocracies. Compared with the old regime, liberal societies were leveling societies, and nothing leveled more than a faith that, in insisting on the priesthood of all believers, held everyone to be equal in the eyes of God. It surprises us today that America's most famous symbol of conservative religious reaction, William Jennings Bryan, was also a great advocate for equality. It should not surprise us at all.

America's free air and free soil worked to the benefit of all American religions, but its truly special blessings flowed to conservative Protestantism. Protestantism's greatest source of strength has been its capacity to re-invent itself. As older modes of worship lost their power to attract, new modes rushed in to fill the gap. In the nineteenth century, the urban revival hall and the rural camp meeting drew crowds away from the staid chapels of the more upper-class faiths. In the twenty-first century, the megachurch brings in those more exposed to Oprah than to Amos, as organ music and hymns give way to contemporary Christian rock, and the diet book is studied more carefully than the Bible, and Sunday attire is replaced by aisle-rolling and spirit possession. Listen to the sermons in the sprawling, dynamic, and theologically incoherent world of conservative Protestantism, and you may hear liberalism denounced from the pulpit; but these jeremiads are philosophically and historically blind, since they are oblivious to the fact that without liberalism, there would not exist the vibrant voluntary sector, the responsiveness to popular taste, or even the freedom to attack the Democratic Party that serve as the homily's backdrop.

Some religious conservatives are aware of the debt that they owe to liberal ideas. Among them has been the largest Protestant denomination in America, the Southern Baptist Convention. Baptists cut their teeth on liberal principles. Their most famous leaders across the centuries--Roger Williams (1603-1683), John Leland (1754-1841), George W. Truett (1867-1944)--preached on behalf of religious liberty and church-state separation. True, their commitments were not the same as those of Enlightenment skeptics; Williams sought to protect the garden of faith from the wilderness of government, not the other way around. But their significant presence among American believers helps to explain why we never developed a tradition of clericalism and thus were spared nosy inquisitors relying on the police power of the state to enforce the creedal orthodoxies of one particular sect. As odd as it may sound in these days of conservative ecumenicalism, when all that matters on the right is whether your beliefs are conservative and not what your beliefs actually are, Baptists were once closer to Jews in their commitment to religious liberty than they were to establishment-oriented Catholics (against whom, it must also be said, they were once deeply prejudiced).

It is a matter of considerable political importance that so many of today's Baptist leaders have opted to neglect the history and the traditions of their own denomination. Sensing an opportunity to wield considerable political influence, leaders such as Richard Land, president of the SBC's Ethics and Religious Liberty Commission, Albert Mohler, president of the Southern Baptist Theological Seminary, and Adrian Rogers, of Memphis's Bellevue Baptist Church, developed close connections with the Republican Party, and with the religious right that was fueling its growth. In March 1998, after meeting with the conservative political strategists Paul Weyrich, Gary Bauer, and James Dobson, Land said of his relationship to them: "The go-along, get-along strategy is dead. No more engagement. We want a wedding ring, we want a ceremony, we want a consummation of the marriage."

They got one. Land sometimes criticizes the Republican Party, as he did when it tried to use church mailing lists to bring out voters in 2004, and he goes to great lengths to try to prove that his political activism can be reconciled with the Baptist tradition of religious liberty. (He argues, for example, that the state can "accommodate" religion without violating the First Amendment.) But he is fooling no one, not least other Baptists. "Though not ready to jettison separation," writes the historian Barry Hankins in Uneasy in Babylon: Southern Baptist Conservatives and American Culture, "he knows his views on this aspect of church-state relations are different enough from previous considerations that he needs to use a different term, hence his use of 'accommodation' to encapsulate his principles."

Liberals worry that the religious right, by failing to respect the proper boundary between church and state, will move the United States in a theocratic direction insufficiently appreciative of the blessings of human freedom. That may well be true, but among the first to suffer would be religious conservatives. If it could, the religious right would create a society in which Christianity's place of privilege would be supported by special access to public funds, police powers charged with enforcing its conception of morality, restrictions on the free speech of atheists and non-Christians, and a foreign policy designed to spread its word. Yet by sucking up the air of human liberty, such a society would leave a vacuum in which new religious movements--tomorrow's evangelical establishment--would find no place of nourishment.

Religions that draw too close to politics lose their capacity to innovate. If they receive government money, they become bureaucracies, which is one reason that at least some deeply conservative believers are skeptical of President Bush's talk about faith-based initiatives. If their objective is to get out the vote, they will want an obedient flock, not disputatious otherworldly saints. Their clergy will become more comfortable in the country club than on the street corner; worried about their standing, the status of their capital campaign, and their ties to local business, they will lose their ability to speak to the economically marginalized, from whom conservative religion has always drawn the bulk of its new recruits. (Whatever its commitment to real Darwinism, the Bush administration's unblinking support for social Darwinism ensures that untold numbers of potential conservative Christians will never live to adulthood.) Their churches may grow, to be sure; indeed, they will develop management objectives, best practices, and consumer surveys to ensure their growth. But the resources of the spirit are limited, and the more that goes into committee work, the less there is for God. Evangelicals did well in America not least because they had little or no political power.

The leaders of the religious right have evidently decided that none of this matters to them. After all, how many times in a millennium does one get the opportunity to change the culture? The age of Bush is their opportunity, and they are not going to let it pass by. They may think that religion--the spiritual, otherworldly, transcendental kind--is for idealists, and that politics is too exciting and too profitable for them to stay stringently behind the pulpit. Right there before their very eyes is the prospect of a Supreme Court with an unshakable majority in favor of returning the United States to its Christian roots. God would not want them to take their eyes off the prize, and the least they can do is to work on his behalf.

Yet for all its political muscle, the religious right is an easy roll. Republican leaders are happy to throw symbolic crumbs to them--Terri Schiavo's life, constitutional amendments banning gay marriage that are designed never to pass--and they lick them up gratefully. Of course they may someday get their Supreme Court, and perhaps the struggle will have been worth it; but to this point it seems correct to say that they have sacrificed the future of their faith for pretty thin gruel. Given the degree to which Americans distrust politicians, it boggles the mind that religious leaders would consign themselves to that particular circle of hell. But they have chosen to do so, and they, more than anyone, will reap the whirlwind.

III.

"We all need, in a very real and dire sense, to get some old-time Religion! Politics can't save itself. Nor can it rely for its salvation on the debased spirituality so prevalent in the culture." These are the words of the left-wing firebrand Jim Wallis, not of the right-wing activists Pat Robertson or James Dobson. God's Politics is filled with words such as these. "We've lost the social, unifying, and liberating aspects of biblical faith," Wallis laments. "What is needed is nothing less than a renovation of our souls and the soul of our politics." And lest any reader fail to grasp the point, Wallis leaves no doubt that his faith, the evangelical one, is the one that our country needs most. "Without a personal God, there is no personal dimension to belief.... In today's world, there is one overriding and key distinction in all of the religion that is growing--a God who desires relationship with each person." That is Jesus talk, even if Jesus is not mentioned.

Politically speaking, Wallis's book is long overdue. He is correct to point out that the Jesus invoked by so many conservative religious believers has little or no relationship to the Jesus who preached on behalf of the outcasts and was widely admired for his humility. When it comes to interpreting Scripture, at least in its moral and social aspects, Wallis has it all over Robertson and Falwell, and if he had his way, they would go the way of Jimmy Swaggart and Tammy Faye Bakker. But in adopting much of the language of Christian evangelicalism, Wallis brings along its problems. Its participation in politics has led the religious right to a position in which its politics have driven out its faith. God's Politics is proposing the same degradation for the left. For the left would certainly suffer a similar fate if it adopted the prophetic stance that Wallis urges.

Religion and politics have such a difficult time mixing because the uncompromising faith required for the one defies the brutal realism of the other. Wallis is in many ways a very worldly man, an active politician; he fills his book with letters and memos that he has written to the leading political figures of our times, and accounts of his meetings with very important people, and glimpses of his life on the road as he speaks across the country on a schedule that only a presidential candidate could love. This Jim Wallis is a policy wonk. He has positions on the war on terrorism, Millennium Development Goals, the future of the Gaza Strip, the global campaign against HIV/AIDS, and the proper role of the IMF and World Bank. His Jesus would need a staff of economists and planners to make his presence felt among us.

In another sense, though, Wallis does not seem to have a political bone in his body. "Many of the president's critics make the mistake of charging that his faith is insincere at best, a hypocrisy at worst, and mostly a cover for his right-wing agenda," he writes. "I don't doubt that George W. Bush's faith is sincere and deeply held." Well, I do: I can imagine no scenario in which the president dissuades Karl Rove from a political strategy because his religion forbids it. But what if it is? Bush is no victim of "bad theology," as Wallis informs his readers. He is a politician striving to achieve real-world objectives that would have devastating consequences for millions of people; and whatever religion he may have surely is secondary to that. I admire Jim Wallis the man of God for his faith in all people, even the lamentable George W. Bush. I do not find Jim Wallis the political analyst particularly trustworthy about the struggle that it will take to stop Bushism and all that it represents.

The best example of Wallis's political myopia is his insistence that he has no political agenda. Religious values, he claims, are inclusive ones, and as a religious leader he does not want anyone to feel left out of the conversation. At various points in his book, Wallis even identifies himself as conservative, especially "on issues of personal responsibility, the sacredness of human life, the reality of evil in our world, and the critical importance of individual character, parenting, and strong 'family values.'" Yet as he goes through his list of what Jesus would do today, there can be no doubting that Wallis's heart leans left. "Of course, God is not a partisan; God is not a Republican or a Democrat," Wallis writes. But people are Republicans or Democrats, and if they were to vote for the agenda in which Wallis believes--a healthier environment, greater equality of incomes, a repudiation of foreign policy unilateralism, gay civil unions--they would be voting Democratic. No Republican activist would take seriously Wallis's claim to be above politics, and with good reason. Progressive beliefs are not made less progressive by claiming that they embody what God wants.

To position himself as non-political, Wallis creates a dichotomy between the militant Christian right and those who belong to the ACLU and the Anti-Defamation League. The latter--"secular fundamentalists," Wallis calls them--fail to appreciate religion's role in American history and "attack all political figures who dare to speak from their religious convictions." These people "make a fundamental mistake" by insisting "that the separation of church and state ought to mean the separation of faith from public life." God, in Wallis's account, may be personal, but he is never private. Our public life needs him because otherwise we would have no sense of the common good, no prophetic vision needed to realize it, and no individuals such as Martin Luther King Jr. willing to hold us up to our highest ideals. (Leftists who invoke King on behalf of religious intervention in politics ought at least to remember that King was a close student of Reinhold Niebuhr, who, more than any other American theologian, warned of the dangers of too easy an identification of faith with power.)

One need not agree with the values of the ACLU and the ADL to find Wallis's dichotomy bizarre. Secular liberalism and religious fundamentalism do not have even remotely similar political or psychological dispositions. The former allows room for religion, including religious fundamentalism, if only in the private sphere; but the latter allows no room for liberalism, in the private or the public sphere. Wallis may be on the left, but like his fellow believers on the right, he is not much of a liberal. If there were legions of progressive Christians in the land willing to vote for his prophetic stance--I wish that there were--they would need the same protections of free expression, church-state separationism, and the right to organize that have sustained the religious right.

If God is a source of our common morality, he is not the only source. From him we derive lessons in the meaning of life, but we may do the same from moral philosophers, Founding Fathers, writers and artists, and even an occasional social scientist. We could use some more God talk, I think, in our public life, but we could also use a lot more history and reasoned argument. The advantage of the latter is that those who have used them have rarely tried to exclude anyone else from the conversation. I would not want to live in either a society composed solely of secular liberals or a society composed solely of religious fundamentalists, but at least the former lives and lets live.

"The best response to bad religion is better religion, not secularism," Wallis writes. That may be true of religion in the private sphere, in the soul; but from the standpoint of a pluralistic society, we are better off with bad religion than with theological single-mindedness. Affirmations of "God" or "the creator," platitudinous and "badly" religious as they may be, exclude fewer Americans than pledges to Jesus or the Prophet. "The best religion to counter the Religious Right," Wallis claims, "is prophetic faith--the religion of the prophets and, of course, Jesus." Count me out, because Jesus is not my God. When it comes to politics, Jim Wallis is on the other end of the spectrum from Jerry Falwell. When it comes to Jesus, they stand in the same corner.

IV.

Religion needs secularism to thrive. But it is also true that secular society needs religion to grow. This idea, which liberals do not like to hear, is made with exceptional clarity by Peter Dobkin Hall and Ronald F. Thiemann, two of the contributors to Taking Faith Seriously. Hall demonstrates how the important and now quite secular idea of moral agency--the belief that we are captains of our own fate--had its origin in the revolt against strict Calvinism led by preachers such as Jonathan Edwards, Timothy Dwight, and Lyman Beecher. The problem is easily stated. If we are all tainted with original sin, our salvation or our damnation is in the hands of a capricious God; we have no autonomy and, without it, our love of God is cheapened. It was a short step, in Hall's account, from this theological point to the development of the American voluntarism so celebrated by Tocqueville.

Searching for a role for human beings to play in the process of their own salvation, Beecher discovered the importance of voluntary associations, and the free air of the West (he moved to Cincinnati in 1832), and the notion of a common good. Although Hall does not say so, it nicely fits his story that Beecher's daughter would play such a vital role in exposing the this-worldly evils of slavery. "The historical record indicates that the proliferation of voluntary associations in nineteenth-century America involved groups whose theological convictions and religious practices led them to see secular civil society as the most promising arena for exercising moral agency," Hall concludes. Without its frequent religious awakenings, America would not have become a great secular nation.

Thiemann's focus is different. He is concerned with somewhat arcane developments within the world of Lutheran social service provision. Yet what he tells us puts to shame any simple notion that the religious and the secular constitute two different worlds. No doubt to the surprise of many a contemporary conservative Christian, Martin Luther did not believe that only Christians could carry out good works, or that doing so would be an essential step in their salvation. Good work, Luther told his flock, is good because it helps people, not because it proves the sincerity of a person's faith. Christians ought to be guided by the ideal of a vocation, but this is not in itself solely a religious duty to be carried out by the faithful for the faithful. Once launched into the world, Lutheran orphanages met the same problems of limited budgets, diverse clienteles, and staff professionalization as any other kind of organization; as Thiemann writes, "their missions became shaped more by external public demands than by a clearly stated internal theological rationale." The lesson for our times is plain: provide public funds for religiously motivated social services if you wish, but do not expect them to remain "religious" if you do. Religion is an excellent incubator for such secular ideals as the modern welfare state.

To the editors of Taking Faith Seriously, the mutual interdependence of the religious and the secular allows us to avoid the extremes of "faith-based boosterism" and "dogmatic secularism." (Yes, it is the same dichotomy perceived by Jim Wallis.) Were they to avoid this coarsening polarization, Americans could find common ground on some of the most contentious issues of the day. One example is offered by the debate over faith-based initiatives, Bush's proposal to allow religious organizations to play a greater role in social service provision. Religious boosters are mistaken, Bane, Coffin, and Higgins point out, when they insist that faith-based initiatives can replace the welfare state, for there are many sources of compassion, not just religious ones. At the same time, secularists are wrong to oppose faith-based initiatives for fear that they will become sectarian, since religious organizations have many purposes, not only salvational ones. Once we "adopt a stance of critical openness toward religion's place in public life," we can recognize that there is a middle ground out there. Secularists ought to extend rights of recognition to believers, and believers ought to understand that liberal pluralism includes a place for them.

Taking Faith Seriously offers a series of case studies designed to demonstrate that the always tricky and sometimes unsolvable puzzles posed by faith in a liberal democracy can best be addressed by appreciating the role that religion actually plays in concrete circumstances. The decision to look at specific cases was a good one; people who believe in different gods, as well as people who believe in none of them, are more likely to find common ground when they know one another personally than when they shout abstractions at one another over the airwaves or disagree with one another through their political leaders. Social science cannot tell us what is morally the right thing to do; but it can tell us what, practically speaking, from the standpoint of a compassionate democratic society, is the smart thing to do.

Some of the case studies are outstanding, such as Bane's analysis of the gap between Catholic teachings on the common good and the often dismally low rates of civic participation among ordinary Catholics, and Omar McRoberts's unflinching portrait of how difficult it can be for inner-city churches to evangelize the culture when the culture around them is crime-ridden and drug-drenched. But alas, the bulk of the material in Taking Faith Seriously either repeats research that has already been published or fails to address the issues raised by the editors. And the one contribution that forthrightly takes on the issue of religion in liberal democracy--Coffin's account of the way churches in Lexington, Massachusetts dealt with conflicts over homosexuality--is hardly reassuring.

Coffin's story is organized around a town forum that took place in October 2000 called "Respecting Differences: Creating Safer Schools and a More Inclusive Community for Gay and Lesbian People and Their Families." This was not the kind of title chosen to leave attendees in suspense over the outcome. Every word in the title screams liberalism, and indeed the forum was the brainchild of one of the town's Unitarian Universalist churches, First Parish, surely the most liberal congregation in this very liberal suburb. Still, Helen Cohen, First Parish's minister, believed that the forum could not be taken seriously unless evangelical Christians attended, and so she invited Chris Haydon, pastor of Trinity Covenant Church. Eventually nine religious leaders (as well as Representative Barney Frank) addressed the forum; Haydon was the only evangelical among them. Coffin does not offer many details on what actually took place at the forum, but he does say that "an open, respectful exchange developed" and that the discussion was characterized by "searching questions, personal experiences, moral convictions." The whole experience left Coffin persuaded that churches can play a role in "recognizing religious pluralism, disagreeing without dividing, and articulating democratic values of tolerance and mutual respect."

But a closer look at his case suggests a different conclusion. "While they tend to be affluent," Coffin writes, "Lexington's congregations mirror broader trends." This is simply not true. Lexington is not affluent, it is rich; its median housing price approaches seven figures. It is also a disproportionately academic town, its culture shaped by the fact that Massachusetts Avenue, which runs through Harvard, continues directly to Lexington, eight miles away. And its congregations do not mirror national trends at all. The fact that two rabbis and five mainline Protestant ministers addressed the October forum, with only one priest and one evangelical pastor, suggests just how far away Lexington stands from the increasingly megachurch-inspired American religious landscape. There may be an argument available to Coffin to suggest why his hometown offers an appropriate case study, but his failure to provide one--along with the fact that three of the four congregationally based case studies in Taking Faith Seriously use Boston as their focus--makes the reader worry about the generalizability of his findings.

Even more importantly, Coffin's own findings are not what he claims. Haydon, while an evangelical, is about as far from what is typical in American evangelicalism as it is possible to get. He is part of something called the Evangelical Covenant Church. Quoting from the denomination's description of itself, Coffin describes the ECC as having "roots in historic Christianity as it emerged in the Protestant Reformation, in the biblical instruction of the Lutheran Church of Sweden, and in the great spiritual awakenings of the nineteenth century." You would be surprised to discover, if you relied only on that description, that to Jim Wallis the ECC is "the most interesting church in America today," the very model of what a progressive evangelicalism would resemble. From Wallis we learn that this originally Swedish-inspired denomination is poised to grow among African Americans, and that "all the denomination's pastors now make a pilgrimage to many of the historic sites of the civil rights movement and to forgotten places of extreme hunger still present in America today." So you could not find an evangelical church in the United States more likely to find common cause with Unitarians than this one. "Out of the Covenant's emerging multicultural identity is coming a powerful and prophetic commitment to social justice and peace," Wallis writes, playing music to a liberal's ear.

And astonishingly enough, even Lexington's Trinity Covenant Church was too conservative to find much common ground with the town's omnipresent liberals. Trinity did have an internal conflict over family values, but it had nothing to do with gay and lesbian inclusion, because, as Coffin writes, "Trinity's evangelical culture did not allow the matter of sexuality to be debated in terms of gay and lesbian inclusion." What divided Trinity Covenant Church was the question of whether the church could accept a couple who had children but were not married. (The congregation eventually refused to do so.)

On matters involving homosexuality across denominational lines, moreover, no agreements were reached in Lexington, because no discussions actually took place. The book's three editors, including Coffin, praise the fact that "liberal and conservative churches in Lexington were able to structure a civilized dialogue on an important public issue, the inclusion of gays and lesbians in the community." In reality, however, Reverend Haydon, on his way to speak at the forum, had to walk past protesters holding signs saying "What Next?: Pedophiles in Our Schools?" and "Do Not Turn Your Back on God," and inside he heard Barney Frank speak in what Coffin describes as "adversarial, take-no-prisoners rhetoric," which "seemed to mirror the polemics outside." And even more damning to the cause of cross-denominational civility, it turns out that there were two forums on homosexuality in Lexington, not one. Five months after the October event, the town's conservative believers organized one of their own called "Respecting Differences, Part II: A Revolutionary Response to Kinsey-Based Sex Education and Culture." Two people spoke at that meeting, both of them African American. Not a single leader of any of Lexington's liberal congregations bothered to show up.

Social science is an unpredictable horse to ride; once you mount, you have to go where it takes you. Coffin may think that his "thick description" leads to a way to bring religion and politics closer together, but as I read his tale, there exist warnings aplenty to keep them apart. Unlike Lyman Beecher and other nineteenth-century clergymen, today's religious conservatives, even the prophetic sort so much admired by Jim Wallis, are not quite ready to participate in American civic life if doing so means losing their status as a self-enclosed community unwilling to have its values challenged by the larger culture. More remarkably, even Beecher's direct descendants, today's mainline religious liberals, are more comfortable hanging out with people like themselves than treating the views of evangelicals as worthy of debate. If there is dogmatism in this tale, it comes not from the secularists, but from the Unitarians. (For a Christian rightist, of course, they are the same thing.)

In America's past, religion helped to create a society committed to secular liberalism, which in turn helped to spur the growth of religion. Today, if Lexington is any indication, conservative believers want nothing to do with secularism, and liberal religion's advocates are not especially liberal. Both need considerably more training in the need--no, the duty--to live together with people whose views are different from their own before politics and religion can help each other out.

V.

In the wake of John Kerry's defeat, Democrats have been pondering the question of whether they should engage in more God talk. Jim Wallis's book has become Exhibit A for the case that they should do so. And while the editors and the contributors to Taking Faith Seriously are not political activists, their message is one that clearly urges greater openness to religion on the part of liberal elites.

I agree with both books: there is no reason, save that of political martyrdom, for the Democratic Party to turn its back on people who not only believe in God but also look to him for guidance about the nature of the good society. Americans want leaders to protect them against foreign threats and to improve the economy, but they also want people who stand for values, and any party that does not respond to the latter as well as the former will be in the minority. But if Democrats seem unwilling to address questions of religious morality, it may not be a fear of faith that inhibits them but a fear of big ideas, even secular ones. They shy away from identifying with liberal and Enlightenment philosophy just as they do from Christian or any other kind of theology. Twenty million Christians have purchased a book that tells them--wrongly, in my view--that they need Christ to lead purpose-driven lives. But liberals rarely speak of purpose at all.

Moral agency, human dignity, freedom of thought and expression, respect for other ways of life, the right to organize: these are moral values, every one of them, and they have all emerged out of the combination of secularly liberal principle and evangelically inspired faith that have mixed together so well throughout American history. That they have stopped mixing in contemporary America is no reason to ignore them. Quite the contrary. The United States now more than ever needs constant reminders of the benefits that liberalism offers to faith--not just, or even especially, to liberal faith. If religious rightists are unwilling to make the case for liberty that enabled Baptist and other conservative churches to grow, let the case be picked up by liberal secularists who, in defending the rights of non-believers, will also protect those who worship in their own way. Our conservative president speaks of the need to protect and to expand freedom, including religious freedom, abroad. It is time for the liberal opposition to make an equally strong case for the same objectives at home.

Alan Wolfe is a contributing editor at TNR and Professor of Political Science and Director of the Boisi Center for Religion and American Public Life at Boston College.

Copyright © 2005, The New Republic