Sunday, February 29, 2004

Daniel Boorstin, RIP

Daniel Boorstin was a Commie in the '30s. He found the Party boring. I wonder what he would have thought of the anti-Communist zealots in the Texas Panhandle in the '60s and '70s? Talk about boring! If this is (fair & balanced) ideology, so be it.



[x NYTimes]
Daniel Boorstin, 89, Former Librarian of Congress Who Won Pulitzer in History, Dies
By ROBERT D. McFADDEN

Daniel J. Boorstin, a Pulitzer Prize-winning author and social historian who was the librarian of Congress for 12 years, died yesterday at Sibley Memorial Hospital in Washington, D.C. He was 89 and lived in Washington.

The cause was pneumonia, said his son David.

Dr. Boorstin, who was also a lawyer and for 25 years a faculty member at the University of Chicago, wrote more than a score of books, including two major trilogies, one on the American experience and the other on world intellectual history viewed through prisms of scientific and geographic discovery, the work of creative artists and the ideas of prophets and philosophers.

As the librarian of Congress from 1975 to 1987, Dr. Boorstin literally brought drafts of fresh air into a stodgy, forbidding institution whose 550 miles of shelves and 19 reading rooms were all but terra incognita to the public and even to many scholars. He ordered the majestic bronze doors of the world's largest library kept open, installed picnic tables and benches out front, established a center to encourage reading and arranged get-togethers for scholars and midday concerts and multimedia events for all.

Recalling his directive to keep the doors open, he remarked: "They said it would create a draft, and I replied, 'Great — that's just what we need.'"

Dr. Boorstin, a man of prodigious energy who wrote almost every day, almost all the time, ran into a slight hitch at his Senate confirmation hearings. Several senators demanded that he not write while serving as the Congressional librarian. He refused to stop writing but promised to do it on his own time. And he did — on weekends, in the evenings and on weekdays from 4 a.m. to 9 a.m., when he left for work.

Witty, informal, a politically conservative thinker who favored bow ties and unconventional ideas, Dr. Boorstin provided America four decades ago with a glimpse of its reality-show and photo-op future, introducing the notion of the "pseudo-event" to describe occurrences, like news conferences and television debates, that are staged to get news coverage and shape public perceptions.

In his 1962 book, "The Image: Or What Happened to the American Dream," Dr. Boorstin deplored the "programming of our experiences," saying "they have no peaks and valleys, no surprises." He cited the 1960 Kennedy-Nixon debates, which he said reduced national issues to trivial theatrics. "I think you have to be willing to settle for the messiness of experience," he said.

Dr. Boorstin developed his social theories in a steady stream of books that were popular with many readers and critics, though not always with other historians. His first trilogy — "The Americans," with the subtitles "The Colonial Experience" (1958), "The National Experience" (1965) and "The Democratic Experience" (1973) — won many awards.

The first volume won the Bancroft Prize, the second won the Francis Parkman Prize and the last, which focused on the entrepreneurs and inventions of the century after the Civil War, received the 1973 Pulitzer Prize in history. Dr. Boorstin also won the National Book Award for distinguished contributions to American letters in 1989.

The professor, who received a doctorate in juridical science at Yale University in 1940, advanced the theories of Frederick Jackson Turner, who postulated that democracy followed the frontier. Dr. Boorstin broadened the concept, contending that the American experience was shaped by the efforts of a people to tame the continent.

This struggle, he believed, had led Americans to value practicality and pragmatism over theory and dogma, action over thought, and experience over tradition. He maintained that this outlook made American institutions resilient and versatile.

The second trilogy — a vast edifice of scholarship and words devoted to the world's intellectual history but aimed at general readers — was composed of "The Discoverers" (1983), which focused on geographic and scientific explorers; "The Creators" (1992), about artists and their contributions; and "The Seekers" (1995), which examined the ideas and lives of religious leaders and philosophers.

While the scope of his work was sweeping, his historical focus was typically down to earth: the lives of people, their daily concerns, the implements they used, the way they solved everyday problems. His eye for the telling detail often led to insights: the invention of the pocket watch so time could be known anywhere, the likelihood that Houston could not have become a great city without air conditioning.

Reviewers praised Dr. Boorstin for a lively, inventive style, unconventional and bold approaches, intriguing perceptions and for placing familiar information in fresh contexts to generate unexpected conclusions. Admirers also praised him for shaping great stores of evidence into well-ordered, vigorous narratives and for producing original and provocative observations.

Detractors charged that his work was "popular" history, more superficial than overarching, focusing unduly on goods, services and processes at the expense of ideas and ideologies. Some critics viewed him as too conservative, morally complacent, content with the status quo.

Kenneth S. Lynn, a professor of history at Harvard and Johns Hopkins, quoted in "Contemporary Authors," accused Dr. Boorstin of philosophical bias and blatant myth-making, but still hailed the third volume of "The Americans" as "a path-breaking and important book" that reflected great zest for research and contained brilliant analyses delivered with a supple style.

Dr. Boorstin's curiosity, mental agility and inclination not to suffer fools led some associates to call him arrogant and elitist. In the late 1960's, when antiwar protests swept the nation, he was a target of student radicals whom he denounced as "incoherent kooks" and "barbarians."

Many black leaders denounced his opposition to affirmative-action quotas and open admissions as well as his description of black studies as "racist trash." Dr. Boorstin responded that he was strongly against racism and believed in "equal opportunity, mobility and nondiscrimination," but said that he opposed "single-minded solutions."

In a world of rapid change, Dr. Boorstin championed books as the key to enduring values. He once described the book as mankind's "single greatest technical advance," and noted: "For each of us, reading remains a private, uniquely qualitative nook of our life. As readers, then, we are refugees from the flood of contemporaneous mathematicized homogeneity. With a book, we are at home with ourselves."

Daniel Joseph Boorstin was born Oct. 1, 1914, in Atlanta, to Samuel Aaron Boorstin, a lawyer, and the former Dora Olsan, both children of Russian-Jewish immigrants. His father took part in the defense of Leo Frank, a Jewish factory superintendent who was falsely accused of the rape and murder of a teenage gentile and was lynched by a mob after the governor commuted his death sentence to life in prison.

The case generated surges of anti-Semitism and Ku Klux Klan activity throughout the South, forcing the exodus from Georgia of many Jews, including the Boorstins. Mr. Boorstin grew up and attended schools in Tulsa, Okla., and majored in English history and literature at Harvard, where he was elected to Phi Beta Kappa and graduated summa cum laude.

It appeared that the young man was headed for a career in the law. As a Rhodes scholar, he graduated from Balliol College at Oxford with highest honors, passed the British bar examinations and became one of the few Americans to become a British barrister-at-law. He then completed advanced studies as a fellow at the Yale Law School and taught at Harvard, Radcliffe and Swarthmore. He was admitted to the Massachusetts bar. His first book, "The Mysterious Science of the Law," was published by the Harvard University Press in 1941.

At Harvard, he was swept up in left-wing radicalism. He later explained, "Nearly everybody I knew in these days who was interesting humanly or intellectually was 'leftist' and thought they had a duty to 'do' something about the state of the world."

He belonged to a Communist Party cell in 1938-39, but resigned in revulsion over Stalinist repression and the 1939 Soviet-German nonaggression pact. He later described his membership in the Communist cell as "boringly instructive."

Dr. Boorstin joined the University of Chicago faculty in 1944, rising over the years to become the Preston and Sterling Morton Distinguished Service Professor of American History. He also lectured at a dozen universities around the world.

In the late 1960's, his outspoken opposition to student radicalism, militancy and violent protests made him a lightning rod for protesters. Many boycotted his classes and circulated leaflets publicizing his friendly testimony before the House Committee on Un-American Activities in 1953, when he identified other members of the Communist cell.

In 1969, Dr. Boorstin left Chicago for Washington, where he was the director of the Smithsonian Institution's National Museum of History and Technology until 1973 and then the senior historian there for two years. After a dozen years as the librarian of Congress, he resigned in 1987 to continue writing full time and to become editor-at-large for Doubleday, where he specialized in acquiring books on history, reference and biography and recommended reissuing earlier titles.

A collection of his essays, "Hidden History," was published in 1987. A second volume of essays, "Cleopatra's Nose: Essays on the Unexpected," appeared in 1994. Dr. Boorstin in recent years served on the editorial board of the Modern Library, a Random House imprint that publishes classics for a less expensive market. He also was a former president of the American Studies Association and a trustee of Colonial Williamsburg, the Kennedy Center for the Performing Arts and the Woodrow Wilson Center.

Most of Dr. Boorstin's major works were published by Random House, whose senior editor, Robert Loomis, worked on the manuscripts. But Dr. Boorstin often credited his wife, the former Ruth Carolyn Frankel, whom he married in 1941, with crucial editing contributions. "Without her, I think my works would have been twice as long and half as readable," he said.

Besides his wife and his son David, of New York City, Dr. Boorstin is survived by two other sons, Paul and Jonathan, both of Los Angeles, and six grandchildren.

Copyright © 2004 The New York Times Company

Doonesbury Endorses A Presidential Candidate!

The Sunday fishwrap in Amarillo doesn't carry Doonesbury in the comics section (aka the funnies). Weekdays, Doonesbury appears in the Op-Ed section (deemed not appropriate for the comics/funnies in Amarillo). During the week, Garry Trudeau has been offering $10K to any Alabama Air Guard vet who can verify that W showed up for drill in the early '70s. So far, no takers. It's kinda like offering a gazillion bucks for information leading to the arrest of Osama bin Laden. If this is (fair & balanced) spoofery, so be it.


Thursday, February 26, 2004

Tracy Kidder Makes Sense (As Usual) On Haiti

Tracy Kidder—in my humble opinion—is the finest writer of our time. He makes sense. This time, he shares his perspective on Haiti. When Frantz Fanon coined the phrase "the wretched of the earth," he could have described Haiti and its people. The wretched of the earth are in Haiti, in Somalia, on the West Bank, and we have no understanding of wretchedness. If this is (fair & balanced) despair, so be it.



[x NYTimes]
Why Aristide Should Stay
By TRACY KIDDER

NORTHAMPTON, Mass.

In Haiti, a paramilitary group has been making coordinated attacks on towns and cities, overwhelming understaffed, underequipped and ill-trained members of the national police force. The group has been burning police stations and setting free prisoners, both ordinary criminals and people convicted of involvement in massacres. It has been looting and rounding up supporters of the elected government and, apparently, killing anyone who tries to oppose it.

This group seems to be operating with the tacit approval of some of the politicians who oppose Haiti's government. But many of these rebels, as news reports call them, have unsavory records. Some are former soldiers from the disbanded Haitian Army, which in 1991 deposed Haiti's first democratically elected president, Jean-Bertrand Aristide, and ruled the country with cruelty and corruption for three years. Another was a ranking member of an organization that aided the army in terrorizing the country during that period. This rebel group seems to enjoy sanctuary within the Dominican Republic and free passage across the border between that country and Haiti.

For several years, the rebels have been making raids into Haiti, including a commando-style assault on the presidential palace in 2001 and, in 2003, an attack on a hydroelectric dam, during which they burned the control station, murdered two security guards and stole an ambulance. Clearly, they were just getting warmed up. Their leaders now boast that they will soon be in control of the entire country.

I first went to Haiti in 1994, for research on an article about some of the American soldiers sent to restore the country's elected government. I have spent parts of the past several years there, working on a book about an American doctor and a public health system that he helped to create in an impoverished rural region. The Haiti that I experienced was very different from the Haiti that I had read about back in the United States, and this disconnection is even stronger for me today.

Recent news reports, for example, perhaps in laudable pursuit of evenhandedness, have taken pains to assert that President Aristide and his Lavalas Party have been using armed thugs of their own to enforce their will on the country. The articles imply that the current crisis in Haiti is an incipient war between two factions roughly equal in illegitimacy. But I have interviewed leaders of the opposition, and can say with certainty that theirs is an extremely disparate group, which includes members of the disbanded army and former officials of the repressive regime of Jean-Claude Duvalier — and also people who were persecuted by both these groups.

This is an opposition that has so far shown itself unable to agree on much of anything except its determination to get rid of Mr. Aristide. Most important, the various leaders of this opposition have enjoyed little in the way of electoral success, the true measure of legitimacy in any country that calls itself a democracy. Mr. Aristide, by contrast, has been elected president twice, by overwhelming margins, and his party won the vast majority of seats in Parliament in the last legislative elections, held in May 2000.

Press reports generally date the current crisis to those elections, which they describe as flawed. In fact, they were flawed, but less flawed than we have been led to believe. Eight candidates, seven of them from Lavalas, were awarded seats in the Senate, even though they had won only pluralities. Consequently, many foreign diplomats expressed concern, and some went so far as to call the election "fraudulent."

But to a great extent, the proceedings were financed, managed and overseen by foreigners, and in the immediate aftermath many monitors declared a victory for Haiti's nascent democracy. Sixty percent of the country's eligible voters went to polling stations, many trudging for miles along mountain paths, then waiting for hours in the hot sun to vote. Moreover, those eight contested Senate seats didn't affect the balance of power in Parliament. Even if it had lost them all, Mr. Aristide's party would still have had a clear majority.

Citing the flaws in those elections, the United States and other foreign governments refused to monitor the presidential election that followed, later in 2000, which Mr. Aristide won handily. The opposition boycotted the affair and still claims that the election was illegitimate, but it does so against the weight of the evidence. This includes a Gallup poll commissioned by the United States government but never made public. (I obtained a copy last year.) It shows that as of 2002 Mr. Aristide remained far and away the most popular political figure in Haiti.

Again citing the flawed elections as its reason, the Bush administration also led a near total embargo on foreign aid to the Haitian government — even blocking loans from the Inter-American Development Bank for improvements in education, roads, health care and water supplies. Meanwhile, the administration has supported the political opposition. This is hardly a destructive act, unless, as Mr. Aristide's supporters believe, the aim has been to make room for an opposition by weakening the elected government.

They have a point. Over the past several years, the United States and the Organization of American States have placed increasingly onerous demands on Mr. Aristide. Foreign diplomats insisted that the senators in the contested seats resign; all did so several months after Mr. Aristide's re-election. Though Mr. Aristide called for new elections, the opposition demanded that he himself step down before it would cooperate. Last year, a State Department official in Haiti, speaking on condition of anonymity, told me that the United States wouldn't tolerate that kind of intransigence but also said that no support for new elections would be forthcoming until President Aristide improved "security." And yet by the time the diplomat said this, the administration had long since withdrawn support from Haiti's fledgling police force, with predictable and now obvious results.

Mr. Aristide has been accused of many things. A few days ago, a news report described him as "uncompromising." For more than a week now, American and other diplomats have been trying to broker a deal whereby the president would appoint a new prime minister acceptable to the opposition. Mr. Aristide has agreed. So far the opposition has refused, insisting again that the president resign.

It was the United States that restored Mr. Aristide to power in 1994, but since his re-election our government has made rather brazen attempts to undermine his presidency. One could speculate endlessly on American motives, but the plain fact is that American policy in Haiti has not served American interests, not if those include the establishment of democracy in Haiti, or the prevention of the kind of chaos and bloodletting that has led in the past to boatloads of refugees heading for Florida.

One could also argue about the failings and sins of all the quarreling factions inside Haiti. But there are more important considerations. Haitians have endured centuries of horror: first slavery under the French, and then, since their revolution, nearly two centuries of corrupt, repressive misrule, aided and abetted by foreign powers, including the United States. All this has helped to make Haiti one of the world's poorest countries, and its people, according to the World Bank, among the most malnourished on earth.

The majority of Haitians have been struggling for nearly two decades to establish a democratic political system. It is important to this effort that Haiti's current elected president leave office constitutionally, not through what would be the country's 33rd coup d'état. Progress toward this difficult goal may still be possible, if the warring politicians within the country and the various foreign nations that have involved themselves in Haiti's affairs pull together now and put a stop to the growing incursions of terrorists. If this does not happen, there is little hope for Haiti. The result, I fear, will be a new civil war, one that will likely lead back to dictatorship and spill enough blood to cover all hands.

Tracy Kidder is the author, most recently, of Mountains Beyond Mountains.

Copyright © 2004 The New York Times Company

Maureen Dowd's View Of Mel & W

The Cobra (as she was dubbed by W) strikes again. The yoking of Mel & W is on target (as usual for Dowd): hate the Jews (Mel) and hate the Queers (W). If this is (fair & balanced) righteous indignation at opportunism, so be it.



[x NYTimes]
Stations of the Crass
By MAUREEN DOWD

Father, forgive them, for they know not what they do.

Mel Gibson and George W. Bush are courting bigotry in the name of sanctity.

The moviemaker wants to promote "The Passion of the Christ" and the president wants to prevent the passion of the gays.

Opening on two screens: W.'s stigmatizing as political strategy and Mel's stigmata as marketing strategy.

Mr. Gibson, who told Diane Sawyer that he was inspired to make the movie after suffering through addictions, found the ultimate 12-step program: the Stations of the Cross.

I went to the first show of "The Passion" at the Loews on 84th Street and Broadway; it was about a quarter filled. This is not, as you may have read, a popcorn movie. In Latin and Aramaic with English subtitles, it's two gory hours of Jesus getting flayed by brutish Romans at the behest of heartless Jews.

Perhaps fittingly for a production that licensed a jeweler to sell $12.99 nail necklaces (what's next? crown-of-thorns prom tiaras?), "The Passion" has the cartoonish violence of a Sergio Leone Western. You might even call it a spaghetti crucifixion, "A Fistful of Nails."

Writing in The New Republic, Leon Wieseltier, the literary editor, scorns it as "a repulsive, masochistic fantasy, a sacred snuff film" that uses "classically anti-Semitic images."

I went with a Jewish pal, who tried to stay sanguine. "The Jews may have killed Jesus," he said. "But they also gave us 'Easter Parade.'"

The movie's message, as Jesus says, is that you must love not only those who love you, but more importantly those who hate you.

So presumably you should come out of the theater suffused with charity toward your fellow man.

But this is a Mel Gibson film, so you come out wanting to kick somebody's teeth in.

In "Braveheart" and "The Patriot," his other emotionally manipulative historical epics, you came out wanting to swing an ax into the skull of the nearest Englishman. Here, you want to kick in some Jewish and Roman teeth. And since the Romans have melted into history . . .

Like Mr. Gibson, Mr. Bush is whipping up intolerance but calling it a sacred cause.

At first, the preacher-in-chief resisted conservative calls for a constitutional ban on gay marriage. He felt, as Jesus put it in the Gibson script (otherwise known as the Gospels), "If it is possible, let this chalice pass from me."

But under pressure from the Christian right, he grabbed the chalice with both hands and swigged — seeking to set a precedent in codifying discrimination in the Constitution, a document that in the past has been amended to correct discrimination by giving fuller citizenship rights to blacks, women and young people.

If the president is truly concerned about preserving the sanctity of marriage, as one of my readers suggested, why not make divorce illegal and stone adulterers?

Our soldiers are being killed in Iraq; Osama's still on the loose; jobs are being exported all over the world; the deficit has reached biblical proportions.

And our president is worrying about Mars and marriage?

When reporters tried to pin down White House spokesman Scott McClellan yesterday on why gay marriage is threatening, he spouted a bunch of gobbledygook about "the fabric of society" and civilization.

The pols keep arguing that institutions can't be changed when, in fact, they change all the time. Haven't they ever heard of the institution of slavery?

The government should not be trying to legislate what's sacred.

When Bushes get in trouble, they look around for a politically advantageous bogeyman. Lee Atwater tried to make Americans shudder over the prospect of Willie Horton arriving on their doorstep; and now Karl Rove wants Americans to shudder at the prospect of a lesbian — Dick Cheney's daughter Mary, say — setting up housekeeping next door with her "wife."

When it comes to the Bushes' willingness to stir up base instincts of the base, it is as it was.

As the Max von Sydow character said in Woody Allen's "Hannah and Her Sisters," while watching a TV evangelist appealing for money: "If Jesus came back and saw what's going on in his name, he'd never stop throwing up."

Maureen Dowd's e-mail: liberties@nytimes.com

Copyright © 2004 The New York Times Company

Tuesday, February 17, 2004

The FINAL Word On The Passion

It is bewildering. U.S. foreign policy is tied to the Middle East. Zionism. Ba'athism. Shi'ites. Sunnis. How to make sense of the bewildering maelstrom? And Mel Gibson steps in with another salvo in the culture wars. If this is (fair & balanced) disgust, so be it.



[x Newsweek]
Who Killed Jesus?
by Jon Meacham

It is night, in a quiet, nearly deserted garden in Jerusalem. A figure is praying; his friends sleep a short distance away. We are in the last hours of the life of Jesus of Nazareth, in the spring of roughly the year 30, at the time of the Jewish feast of Passover. The country—first-century Judea, the early 21st century's Israel—is part of the Roman Empire. The prefect, Pontius Pilate, is Caesar's ranking representative in the province, a place riven with fierce religious disputes. Jesus comes from Galilee, a kind of backwater; as a Jewish healer and teacher, he has attracted great notice in the years, months and days leading up to this hour.

His popularity seemed to be surging among at least some of the thousands of pilgrims gathered in the city for Passover. Crowds cheered him, proclaiming him the Messiah, which to first-century Jewish ears meant he was the "king of the Jews" who heralded the coming of the Kingdom of God, a time in which the yoke of Roman rule would be thrown off, ushering in an age of light for Israel. Hungry for liberation and deliverance, some of those in the teeming city were apparently flocking to Jesus, threatening to upset the delicate balance of power in Jerusalem.

The priests responsible for the Temple had an understanding with the Romans: the Jewish establishment would do what it could to keep the peace, or else Pilate would strike. And so the high priest, Caiaphas, dispatches a party to arrest Jesus. Guided by Judas, they find him in Gethsemane. In the language of the Revised Standard Version of the Bible, there is this exchange: "Whom do you seek?" Jesus asks. "Jesus of Nazareth." The answer comes quickly. "I am he." ...

As moving as many moments in the film are, though, two NEWSWEEK screenings of a rough cut of the movie raise important historical issues about how Gibson chose to portray the Jewish people and the Romans. To take the film's account of the Passion literally will give most audiences a misleading picture of what probably happened in those epochal hours so long ago. The Jewish priests and their followers are the villains, demanding the death of Jesus again and again; Pilate is a malleable governor forced into handing down the death sentence.

In fact, in the age of Roman domination, only Rome crucified. The crime was sedition, not blasphemy—a civil crime, not a religious one. The two men who were killed along with Jesus are identified in some translations as "thieves," but the word can also mean "insurgents," supporting the idea that crucifixion was a political weapon used to send a message to those still living: beware of revolution or riot, or Rome will do this to you, too. The two earliest and most reliable extra-Biblical references to Jesus—those of the historians Josephus and Tacitus—say Jesus was executed by Pilate. The Roman prefect was Caiaphas' political superior and even controlled when the Jewish priests could wear their vestments and thus conduct Jewish rites in the Temple. Pilate was not the humane figure Gibson depicts. According to Philo of Alexandria, the prefect was of "inflexible, stubborn, and cruel disposition," and known to execute troublemakers without trial.

So why was the Gospel story—the story Gibson has drawn on—told in a way that makes "the Jews" look worse than the Romans? The Bible did not descend from heaven fully formed and edged in gilt. The writers of Matthew, Mark, Luke and John shaped their narratives several decades after Jesus' death to attract converts and make their young religion—understood by many Christians to be a faction of Judaism—attractive to as broad an audience as possible.

The historical problem of dealing with the various players in the Passion narratives is complicated by the exact meaning of the Greek words usually translated "the Jews." The phrase does not include the entire Jewish population of Jesus' day—to the writers, Jesus and his followers were certainly not included—and seems to refer mostly to the Temple elite. The Jewish people were divided into numerous sects and parties, each believing itself to be the true or authentic representative of the ancestral faith and each generally hostile to the others.

Given these rivalries, we can begin to understand the origins of the unflattering Gospel image of the Temple establishment: the elite looked down on Jesus' followers, so the New Testament authors portrayed the priests in a negative light. We can also see why the writers downplayed the role of the ruling Romans in Jesus' death. The advocates of Christianity—then a new, struggling faith—understandably chose to placate, not antagonize, the powers that were. Why remind the world that the earthly empire which still ran the Mediterranean had executed your hero as a revolutionary?

Copyright © 2004 Newsweek Magazine

From The Left: Paula Fredriksen (Again) On The Gibson Film

Mel Gibson's films are routinely violent. There is a pornographic quality to all of the violence: senseless, sensational violence. If this is (fair & balanced) point counter-point, so be it; both sides have been heard.



[x The Responsive Community]
Responsibility for Gibson’s Passion of Christ
by Paula Fredriksen

Mel Gibson’s The Passion of Christ came into my life last April. It was then that Dr. Gene Fisher, the ecumenical officer for the United States Conference of Catholic Bishops, convened a small group of scholars to offer an ad hoc assessment of Gibson’s script. Fisher asked us to attend to a variety of issues: the script’s historical fidelity, its use of New Testament materials, and its consonance with Catholic magisterial instruction.

Why did Fisher care? This was, after all, just a movie. The answer, in part, lay with Gibson’s own publicity efforts. In numerous interviews, Gibson had presented his movie as an act of God. (“The Holy Ghost was working through me on this film,” he repeatedly claimed, adducing on-set miracles in support of his view.) He insisted that it was the most historically accurate depiction of Christ’s passion ever filmed. (“This is what really happened at the time.”) He paraded his own Catholic piety as some sort of authentication of his movie. (“We heard Mass every day. We had to be squeaky clean for this.”)

But in the course of these same interviews to publicize his film, Gibson had revealed some of its significant historical gaffes. Further, one of Gibson’s sources for his story came not from the first-century Gospels, but from the revelations of Anne Catherine Emmerich (1774-1824), a stigmatic nun whose visions enunciate an anti-Semitism typical of her time and place. (She believed that Jews used the blood of Christian babies for their rituals.) And, finally, website stills of the movie paraded images marked with Gibson’s signature Hollywood gore: what he thought of as “realism” had less to do with history than with celluloid violence.

All this was cause for concern to Fisher, and to his counterpart at the Anti-Defamation League, Rabbi Dr. Eugene Korn. And it was of concern to us as scholars who work to promote interfaith dialogue and good relations between Christians and Jews. We volunteered our time and our professional expertise to compose for Gibson a confidential report. We concisely reviewed the problems, historical as well as (from a Catholic point of view) doctrinal, with his script. And we framed our presentation by naming one precise source of our concern, specifically, the long and toxic Christian tradition that Jews were (or are) particularly responsible for the death of Jesus, and the ways that this had led to anti-Jewish violence. I quote from the introduction of our report:

We begin this task with an awareness of the tragic impact of Christian “passion plays” on Jews over the centuries. We know that their dramatic presentation of Jews as “Christ killers” triggered pogroms against Jews…and contributed to the environment that made the Shoah [the Holocaust] possible. Given this history, and given the power of film to shape minds and hearts, both Catholics and Jews in this ad hoc group are gravely concerned about the potential dangers of presenting a passion play in movie theaters.

The rest, as they say, is history. Icon Productions leaked our report to the press, presented our assessment as an “attack,” and has worked hard to keep the controversy alive until the movie’s release in February 2004. Icon and its supporters have proclaimed that criticism of the movie is tantamount to an attack on Christianity itself (check out www.seethepassion.com). Right-wing Jewish pundits have been lined up to report that they see no problems with the movie, and that criticisms of it “lack moral legitimacy.” Catholic concern has been deemphasized, Jewish concern emphasized, to enhance the idea that the controversy is a Christians vs. Jews argument. Free speech, freedom of expression, freedom of religion: Gibson’s critics, say Passion apologists, attack Gibson’s rights, and thus the rights of all citizens. To voice concern about this movie is virtually un-American.

Let us be clear. We are talking about an action flick here. Aficionados of the genre, and of Gibson’s stellar contributions to it, know that realism is not one of its (or his) hallmarks. Actors routinely “bleed” in visually striking, medically remarkable ways, thanks to the makeup artist’s skill. Moral subtlety is also in short supply. Bad guys are very bad, good guys good: anything more complex would risk interfering with the story line.

Sensationalized violence substitutes for much else, from character development to plot. Gibson has taken the skills honed in Lethal Weapon, Conspiracy Theory, and Payback, and used them to construct his take on the last 12 hours of Jesus’ life.

Anyone who has seen the final half-hour of Braveheart (a medieval action flick) has essentially seen The Passion already. This time, Caiaphas is Longshanks.

Again, so what? It’s just a movie. But this movie—unlike, say, The Last Temptation of Christ, or Texas Chainsaw Massacre—risks more than religious offensiveness, and does more than simply entertain with senseless, sensational violence. The Passion stands in the echo chamber of deeply traditional Christian anti-Judaism. That tradition at its most benign has excused, and at its most malicious has occasioned, anti-Jewish violence for as long as Western culture has been Christian, from the fourth century to the twenty-first. Jews viewing the Scorsese movie were hardly going to feel enraged at Christians. Someone overstimulated by Massacre, if tempted to act out, would act out on his own. Christians enraged at the supposed Jewish treatment of Jesus—such as that anachronistically and luridly featured in Gibson’s first-century action flick—have often acted out against the Jewish neighbors in their midst, and felt morally and theologically justified in doing so.

Will The Passion of Christ, once released, have a negative effect on society? Might it promote anti-Jewish violence? I hope not, but I think it well might, for the reasons I sketch above. Long cultural habits die hard. The debate around the film, made public and promoted by Icon, has already occasioned ugly anti-Semitic slurs. My colleagues and I, via email, have received them. Both I and my university have received ominous threats from a furious Christian Passion-fan (“I am telling you now that if this woman continues to be employed as a professor, you will be putting your university at risk, with major problems to come…I speak with a powerful voice and with strength that comes from our Heavenly Father,” from an email of November 10, 2003). If the contrived, publicity-oriented “debate” stirs such feelings, will the movie stir fewer, once true public debate can ensue? I do not know, but I doubt it.

Gibson just re-shot some scenes a few months ago, in the wake of the pre-release attention that he has sought. Will he actually follow some of the scholars’ suggestions? Will he make his presentation of his Bad Guys—in this movie, the Jewish high priest, most of his council, and most of Jerusalem’s Jews—less extreme? Again, I do not know. Perhaps, perhaps not.

Will the anti-Semitism, which Gibson’s movie has already enabled, lead to violence? Despite the violence of American culture, I think not. Anti-Semitism just has not had the defining role here, historically, that it has had elsewhere. What about violence elsewhere? I do not know. But the long respectability of anti-Jewish violence in European culture, and the current climate of violence against Jews—in Istanbul, South America, Great Britain, and especially in France in the course of the past several years—inclines me to be much less sanguine about the effects of Gibson’s Passion with foreign-language subtitles.

In the past several years, in Europe, violence against Jews—if those Jews are Israelis—has been explicitly excused by appeal to the toxic tradition that “the Jews killed Christ.” Horrific suicide bombings during the current intifada inspired a church in Edinburgh, over Easter 2001, to display a large oil painting of the Crucifixion with Roman centurions and officers of the Israeli Defense Force (IDF) depicted at the foot of the Cross. The Italian newspaper La Stampa commented on the IDF’s cordon around armed Palestinian gunmen holed up in Bethlehem’s Church of the Nativity with a political cartoon: baby Jesus, crouching in his manger at the sight of an Israeli tank, crying out, “Oh, no. They don’t want to kill me again?!?” I could cite 20 more examples. My point, simply, is that the Toxic Tradition—The Jews killed Jesus; all Jews everywhere are culpable; when something bad happens to them, it is no less than they deserve—is still very much alive, very current, very powerful.

I do not know Mel Gibson. I have read his script, and it seemed to me then a combination of enthusiastic piety, historical ignorance, poor reading of New Testament texts, and action-flick idioms. His response to the confidential report that my colleagues and I sent to him was belligerent and self-serving. (He and Paul Lauer, his marketing executive, have both commented appreciatively on what terrific publicity they have derived from all the flap.) The film, if unaltered, is in my view inflammatory, and therefore potentially dangerous. How Gibson lives with his responsibility for this affair is ultimately his own business.

My responsibility, meanwhile, is to speak up and speak out—not against the film so much as against the ignorance, and the unselfconscious anti-Judaism, that it so dramatically embodies and presents. Gibson has given me and numberless colleagues in colleges, universities, and seminaries across the nation a priceless opportunity for public education. Out of the ivory tower, past the Cineplex, into the churches and interfaith communities that have asked us all to come to speak. This teachable moment now serves as the silver lining that shines within the looming dark cloud of Gibson’s Passion.

Copyright © 2004 The Responsive Community

From The Right: Michael Medved On The Gibson Film

Crucifixion was a Roman execution procedure. Stoning was a Jewish execution procedure. Who was crucified? Who was stoned? If this is (fair & balanced) point counter-point, so be it; Paula Fredriksen is next.


[x The Responsive Community]
The Right to The Passion of Christ
by Michael Medved

Any honest discussion of Mel Gibson’s movie The Passion of Christ (he recently changed the title from The Passion due to a copyright dispute) must begin with unflinching recognition of a few undeniable facts: the movie has been made, and other than minor adjustments in editing, it will be released in its current form. A distribution deal (involving Newmarket Entertainment) has been secured, and the movie will play in thousands of theaters around the world in February 2004. The film will draw eager audiences and will become a substantial box office hit—due in part to all the pre-release controversy, the “must see” factor has reached an almost unprecedented level of intensity among both committed Christians and the cinematically curious.

Most importantly, mainstream Christian leaders of every denomination will embrace the film as the most artistically ambitious and accomplished treatment of the Crucifixion ever committed to film. Some critics and scholars will criticize Gibson for his cinematic and theological choices in shaping the film, but any attempt to boycott or discredit the movie will, inevitably and unquestionably, fail.

No one who has actually seen the movie (as I have) would seriously challenge any of these conclusions. This means that all the debate about allegedly anti-Semitic overtones misses the point: the organized Jewish community and its allies in interfaith dialogue may not welcome The Passion of Christ, but hysterical overreaction to the film’s release will provoke far more anti-Semitism than the movie itself.

Gibson financed the film on his own (to the tune of $25 million) precisely due to his determination to realize his own vision of the Gospel story, without compromise. He could have involved a major studio (obviously, his star power remains potent and undiminished) but he wanted to avoid the need to adjust his Catholic traditionalism to suit the sensitivities of profit-oriented accountants or enthusiasts of other religious perspectives. Jewish leaders feel wounded that Gibson never consulted them in writing his script or re-creating historical details, but he also left out contributors from the Protestant or Eastern Orthodox tradition.

In the context of the forthcoming film, the focus (by the New York Times and other influential voices) on alleged Holocaust denial by Gibson’s 85-year-old father stands as both irrelevant and unfair. Hutton Gibson, an aging curmudgeon and crackpot, played no creative or consultative role in The Passion of Christ.

Meanwhile, the possibility of anti-Jewish violence in response to the film has been irresponsibly emphasized and has become, in a sense, a self-fulfilling prophecy. In parts of Europe and the Islamic world, anti-Semitic vandalism and violent attacks occur every day, and hardly need a film by a Hollywood superstar to encourage them.

In this context, Jewish denunciations of the movie only increase the likelihood that those who hate us will seize on the movie as an excuse for more spasms of hatred.

The problem with traditional “Passion Plays” was always the unmistakable association of contemporary Jews with the oppressive Judean religious authorities depicted on stage. The high priest and his cohorts often appeared with anachronistic costumes including European prayer shawls, skull caps, and side curls. Gibson pointedly avoids such imagery in his film—the costumes and ethnicity of the persecutors make them look far less recognizable as Jews than do the faces and practices of Jesus and his disciples in the film. The words “Jew” or “Jewish” scarcely appear in the subtitles to his movie (the dialogue is spoken in Aramaic and Latin). By agonizing so publicly about the purportedly anti-Semitic elements in the story (which closely follows the Gospel account), the Anti-Defamation League and its cohorts make it vastly more likely that moviegoers will connect the corrupt, exotic first-century figures on screen with Jewish leaders of 2004.

Of course, rabbis and teachers will feel an almost irresistible urge to respond to the explosion of public interest inevitably inspired by The Passion of Christ, and will comment on ways in which the Gospel story (particularly the Gospel of John, which heavily influenced Gibson) probably distorted the history of the execution of Jesus.

Many Jews understand that the canonized accounts came into existence at a time when early Christians had begun to despair concerning conversion of the Jews, and instead focused their attention on proselytizing Romans—hence, orthodox Jews come out looking very bad, while Pilate and other Roman authorities receive reduced blame.

Putting the New Testament account into this perspective may make sense with Jewish audiences, but insisting on this approach with our Christian neighbors represents outrageous arrogance. We may not welcome the stories told by Matthew, Mark, Luke, and John, but Christians have cherished that record for nearly two thousand years. The fact that anti-Semites through history have used these accounts as the inspiration for their depredations may prove that those stories can be dangerous, but does not prove that they are untrue. In any event, Jewish organizations must not attempt to take responsibility for deciding what Christians can and cannot believe. If those community agencies insist that Christian traditionalists must disavow their own sacred texts because of the shameful persecutions of the past, then they force a choice between faithfulness to scripture and amiable relations with Jews. The notion that committed Christians cannot have one without spurning the other does no service to Jewish communal interests, nor to the harmony of the larger community.

Does it truly contribute to interreligious understanding for Jewish leaders to insist that they know more about the truth of the Gospels than do Christians? Do we feel comfortable when some evangelical observers insist that they know more about the real symbolism of our rituals (emphasizing their supposed anticipation of Jesus the Messiah) than we do? I enjoyed a stimulating interchange with a pastor in Michigan who emphatically argued that the details of the Passover seder all related to Jesus of Nazareth—with the three matzos representing the Holy Trinity, the broken middle matzo symbolizing the broken body of Jesus Christ, and the Afikoman (half of the broken matzo) eaten at the end of the banquet indicating the second, triumphal coming of the Messiah. In our pluralistic society, this pastor enjoys perfect freedom to teach his own unhistorical and eccentric interpretation of Jewish ritual, but he makes no attempt to insist that we include such versions in our homes, synagogues, or public explanations of our holiday. In other words, he offers a Christian understanding of Judaism without demanding that our own teaching must be accordingly adjusted.

By the same token, we remain free to teach a Jewish understanding of the New Testament story but we should make no effort to suppress or attack Christians who put forward their own traditionalist interpretations of their scripture. That’s especially true for Christians like Mel Gibson who, despite his personal involvement in a dissenting, traditionalist Catholic sect, provides in The Passion of Christ a vision of the Crucifixion that falls unequivocally within the Christian mainstream.

In fact, from a Jewish perspective, the most unfortunate aspect of the entire dispute regarding Gibson’s project involves the renewed focus on Christian scripture at a time when most Americans—emphatically including most American Jews—remain painfully ignorant of even the most fundamental Jewish teachings. Other than a general sense that Jews respect Moses and refuse to accept Jesus as Messiah, what do most members of the Jewish or general communities know of the essentials of our faith? The interests of Jewish continuity and vitality can hardly be served by a huge battle over a movie which will succeed with the public regardless of our discomfort. Rather than wasting energy and good will over a doomed, misguided effort to discredit an artful and ambitious film, we would do more for the cause of Judaism in America to emphasize the positive and productive aspects of our own sacred tradition.

Copyright © 2004 The Responsive Community

Monday, February 16, 2004

The Ultimate Spectator Sport

What makes a composer great? What makes a writer great? What makes an artist great? What makes a philosopher great? Charles Murray—W.H. Brady Scholar in Culture and Freedom at the American Enterprise Institute—has distilled the four essential elements in producing excellence in human achievement: (1) purpose, (2) personal autonomy, (3) organizing structure, and (4) transcendental goods. Straightforwardly and undogmatically, Charles Murray takes on some controversial questions: Why has accomplishment been so concentrated in Europe? Among men? Since 1400? He presents evidence that the rate of great accomplishment has been declining in the last century, asks what it means, and offers a rich framework for thinking about the conditions under which the human spirit has expressed itself most gloriously. If this is (fair & balanced) polemicism, so be it.



[x The New Criterion]
Of human accomplishment
by Denis Dutton

Readers familiar with Charles Murray’s work (Losing Ground: American Social Policy, 1950–1980, The Bell Curve: Intelligence and Class Structure in American Life) know that he is not a man to shy away from controversy or bold opinions. In Human Accomplishment, Murray’s aim is nothing less than to determine the geographic and chronological distribution of creative genius in science and the arts across the whole of the world during twenty-eight centuries, from 800 BC through 1950. It’s a tall order. Murray assembles histories, surveys, and encyclopedias of the arts and sciences, 163 of them, and records their treatment of significant figures. Using an initial cut-off that leaves only individuals who are mentioned in at least 50 percent of his sources, he comes up with a list of 4002 writers, philosophers, mathematicians, musicians, poets, astronomers, painters, physicists, biologists, and innovators in technology. These are then rated in a system devised and refined in order to provide an objective assessment of high achievement across cultures and over the ages.

The leading names are predictable: Galileo, Newton, Einstein, and Darwin in the sciences, Beethoven and Bach in music, Shakespeare and Schiller in literature, Michelangelo in painting, Euler in mathematics, and so forth. Murray is at pains to eliminate Eurocentrism in his analysis: there is separate coverage of Chinese and of Indian philosophy to match Western philosophy, Chinese painting, Japanese art, Japanese literature, Arabic literature, and Chinese literature. These include, at a level he views as comparable to Aristotle and Mozart, such names as Gu Kaizhi, Basho, al-Mutanabbi, and Kalidasa. Murray’s goal is not, however, merely to make a list of 4002 all-time greats. He wants to build up a general view of the historical conditions that allow for the flourishing of artistic and scientific innovation and discovery.

One of Murray’s favorite ideas is contained in a quip he credits to his late colleague Richard J. Herrnstein: “It is easy to lie with statistics, but it’s a lot easier to lie without them.” It’s a notion worth remembering in light of the sour reactions to his book from critics who don’t like the idea of quantifying greatness. In The New York Post, Sam Munson wrote that “there is something disturbingly petty about creating indexes, tables, and rates of human accomplishment.” Murray’s “page-counting,” Judith Shulevitz sniffed in The New York Times, seems the kind of thing that normally “would interest only those who find that sort of thing interesting,” were it not for the fact that his conclusions seem as “fantastical” as something out of Borges. Along with other critics, Shulevitz tends to find the book typical of old-time social science: when it’s right, it’s bleeding obvious; when it’s not obvious, it’s wrong.

Granted, in some respects Murray’s statistics drive him to conclusions not all that different from those of purely qualitative historians of genius and culture, such as Jacques Barzun and Harold Bloom. And when he does resort to humanistic sermons extolling great men and their masterworks, they come accompanied by tables and statistics lessons that many readers will find too tedious and wearying to study. This is a shame, for they will miss the heart of an impressively well-argued account of the magnificent achievements of human history. For all his statistics, Murray does not promise or deliver certainty on the conditions for human accomplishment. Rather, he supplies data on which informed opinions, by him or by his critics, might be based.

Murray’s method for identifying eminence by reputation follows a form set by Francis Galton’s 1869 Hereditary Genius, which also used biographical literature quantitatively, as a platform for research. But the spirit of Murray’s endeavor stretches back farther, to David Hume. In his essay “Of the Standard of Taste” (1757), Hume formulated the problem of evaluating artistic achievement. There is “a species of opinion,” Hume observed, which holds that taste is subjective, beauty in the eye of the beholder, and that there can be no point in searching for standards in the arts. Many people find this an easy opinion to adopt, Hume writes, until someone forces it to its conclusion, declaring (say) that John Ogilby is as great a poet as John Milton. “The principle of the natural equality of tastes,” Hume remarks, “is then totally forgot, and while we admit it on some occasions, where the objects seem near an equality, it appears an extravagant paradox, or rather a palpable absurdity, where objects so disproportioned are compared together.”

Hume recognized works of art that pass what he called the Test of Time, works with the capacity to engage deep, permanent features of the human personality and thus to be appreciated over the ages. (So the same Homer who pleased “at Athens and at Rome two thousand years ago, is still admired at Paris and at London.”) This shows, for Hume as for Murray, that objective achievements in the arts are demonstrable—and if they can be historically established for the arts, then they are even more clearly identifiable for the sciences. These two spheres of human endeavor represent two kinds of potential objectivity: there is as little chance of the human race giving up Homer or the Beethoven symphonies as there is that it will give up the notion that the earth is a sphere. Over time, achievement in the arts and the sciences is seen as not merely an invention of scholarship, a product of fickle fashion, or a general social construction. It is an objective fact about the real world.

Nevertheless, Murray does not glibly assume the reality of lasting values: he argues for them, scrappily, provocatively, with energy and conviction. In his introduction, he quotes St. Augustine in City of God, sounding, as Murray remarks, like a Victorian triumphalist: “What varieties has man found out in buildings, attires, husbandry, navigation, sculpture, and painting! … What million of invention has he [in] arms, engines, stratagems, and the like! … How large is the capacity of man!” The idea of progress, the notion that each generation builds a better world—by its own invention and discovery, standing on the shoulders of its forebears—is old indeed. It was prevalent for the last 600 years and has only recently fallen out of fashion among postmodernist intellectuals.

Intellectuals who dismiss progress—or who find attractive the idea of the Noble Savage, a person uncorrupted by civilization who lives a more deeply creative or authentic life than we can understand—have Murray’s contempt. Nor has he patience with people who complain about technology and economic growth over their cellphones on the way to the airport. A little question does the trick for him on the issue of whether there is progress in science and technology: “Would you be willing to live your life at any time before the invention of antibiotics?”

Murray is more patient with circumspect objections to his argument. Throughout the book, he anticipates plausible objections to show how they can be addressed. For instance, he notes that his index scores for literature suffer from problems of chauvinism: there is a tendency for encyclopedia editors to give too much attention to their own national literatures. Murray overcomes this problem by eliminating from his figures coverage by encyclopedias of the national literatures of the encyclopedist. Shakespeare is at the top of the literary heap only because non-English sources placed him there. Goethe comes in second after eliminating all German sources. This is, again, a cleverly Humean strategy for countering local bias: “Authority or prejudice may give a temporary vogue to a bad poet or orator,” Hume said, “but his reputation will never be durable or general. When his compositions are examined by posterity or by foreigners, the enchantment is dissipated, and his faults appear in their true colours.”
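The arithmetic of this correction is simple enough to sketch. Below is a toy illustration in Python (the page counts are invented, and this is only a gloss on Murray's procedure, not his actual data or weighting): each writer is credited only with the attention paid by sources of a different nationality, and the top figure in the domain is then scaled to 100, the convention Murray's rosters appear to follow.

    # Toy version of the chauvinism correction. All figures are invented.
    coverage = {
        # writer: pages devoted to the writer, keyed by source nationality
        "Shakespeare": {"English": 40, "German": 25, "French": 22},
        "Goethe":      {"English": 20, "German": 45, "French": 18},
        "Racine":      {"English": 6,  "German": 8,  "French": 30},
    }
    nationality = {"Shakespeare": "English", "Goethe": "German", "Racine": "French"}

    def foreign_attention(writer):
        # Drop every source that shares the writer's own nationality.
        return sum(pages for source_nat, pages in coverage[writer].items()
                   if source_nat != nationality[writer])

    raw = {w: foreign_attention(w) for w in coverage}
    top = max(raw.values())
    index = {w: round(100 * score / top) for w, score in raw.items()}
    print(index)  # {'Shakespeare': 100, 'Goethe': 81, 'Racine': 30}

On these made-up numbers Shakespeare still tops the roster even with every English page thrown out, which is exactly the point of the exercise.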

Some readers will nonetheless not feel entirely happy with Murray’s derived rankings, which can be odd despite his efforts to correct anomalies. His approach puts Michelangelo at the top of the list for Western art—but Picasso, mirabile dictu, comes in at number two, above Raphael, Leonardo, Titian, Dürer, and Rembrandt. Even Picasso’s most generous admirers would have to see this as an artifact of relying on encyclopedias written during Picasso’s own age. Murray attributes this absurdly high placement not only to Picasso’s art, but also to his “seminal role in several phases of the break with classicism” that took place in the late nineteenth and early twentieth centuries.

The anomalous ranking of Picasso highlights a potential source of error that Murray does not adequately acknowledge. Dictionaries of biography and encyclopedias tend to feature entries written by historians of art or science. The academic mindset of such scholars concentrates on historical importance in the arts, but it is also prone to confuse historical importance with aesthetic achievement. This problem shows itself clearly in Murray’s inventory of accomplishment in Western music: Arnold Schoenberg stands above Brahms, Chopin, and Verdi. For good or ill, Schoenberg has affected the course of music just as Picasso has the history of painting, but to place his achievement above Brahms’s makes as much sense as to place Picasso’s above Rembrandt’s. Schoenberg’s nearly exact contemporary Rachmaninoff had no effect whatsoever on music history in the twentieth century; my guess is that his music will be performed much more frequently in the future than Schoenberg’s. So which composer’s “accomplishment” is the greater? Murray’s methodology gives one answer; history will probably supply another.

Nevertheless, readers who tune out of Murray’s analysis because of occasional weirdnesses and anomalies will miss much that is worthwhile. He shows that for a few hundred years central Europe has been the main scene of progress and innovation in the arts and sciences. Whatever their earlier or later contributions, China and India are outliers in the sciences, while the United States’s contributions to both the arts and the sciences are comparatively slight. There are women in the inventories—Sappho, Hypatia, Lady Murasaki, Jane Austen, Madame Curie, and others—but Dead White Males, as Murray likes to call them, do most of the heavy lifting in his specified fields of intellectual accomplishment. (Why? Murray considers the usual roster of arguments—child-bearing, lack of female access to the professions, possible innate biological differences between the sexes.)

There is much for later readers and researchers to ponder in Murray’s data. One arresting aspect of the book is the inclusion of European maps showing the geographical distribution of achievement. The big four contributors to progress are Britain, France, Germany, and Italy; together they vastly exceed the combined contributions of all other European countries. Taken together, Murray’s dotted and shaded European maps for science, art, music, and literature are wonderful to study. The music map for 1800–1950, for example, shows shadings in northern Italy and France, but is dominated by dots and a huge dark blob extending through the German-speaking parts of Europe. The science maps are more uniform, but dark areas in France and especially Britain counterbalance a densely speckled Germany. In the art map and the early science map, Italy makes all other countries literally pale by comparison. There is much to contemplate in these maps, not least the historically variable borders of the land we would today call Germany: the Germans have spread enormous intellectual influence all over Europe at various times, and in the last two centuries significantly into the United States.

With his facts, graphs, and tables as support, Murray tackles the question of the general decline in the arts and sciences. The fall-off in the accomplishment rate (significant figures per unit of population) is severe, especially since around 1800. And yet conditions that promote creativity—wealth, cities and their cultural endowments, communication, and political freedom—have not declined but improved in recent centuries. How is this possible? What does ignite the blaze of human creativity and achievement? Why does it die out?

The fundamental principle of human achievement is expressed by Aristotle in the Nicomachean Ethics and accepted by philosophers since, and more recently even by psychologists: that human beings derive pleasure from the just exercise of their skills and capacities. From crossword puzzles and rock climbing to painting, composing music, playing a musical instrument, or solving equations, Murray says, “The pursuit of excellence is as natural as the pursuit of happiness.” For the creative geniuses who are the subject of his book, I prefer to say that achieved excellence simply is happiness.

There are, according to Murray, four conditions for the highest realization of this innate impulse toward excellence. The sources of energy for accomplishment are a sense of (1) purpose and (2) personal autonomy. The sources for what he calls the content of accomplishment are (3) organizing structure and (4) transcendental goods. Here Murray moves, by his own admission, into an area beyond statistical demonstration; his data are relevant to, and he thinks supportive of, these conclusions, but they are not decisive.

Purpose. Accomplishment “is fostered in a culture in which the most talented people believe that life has a purpose and that the function of life is to fulfill that purpose.” Murray calls people who doubt or deny that life has a purpose “nihilists.” Since accomplishment at the level Murray specifies requires enormous amounts of work, nihilists are at a disadvantage. They lack a sense of vocation, either in the form of an idea that God has called them to a life of scientific or artistic endeavor, or, if they are not religious, in having a sense that they were put on earth to accomplish great things.

Autonomy. A culture that “encourages the belief that individuals can act efficaciously as individuals” will do the most to foster human accomplishment. Freedom for the individual and tolerance of nonconformity are positive contributors to a climate of achievement. It is not only formal political control (e.g., dictatorship) that discourages initiative, but also the strictures of familism (which presumably helps explain relatively lower levels of scientific and technical accomplishment in cultures of east Asia and the relative lack of innovation in Asian art). A culture that fosters individualism stands a higher chance of producing significant creative individuals in art and science.

Organizing structure. “The magnitude and content of a stream of accomplishment in a given domain varies according to the richness and age of the organizing structure.” In the sciences, “the structure from the Renaissance onward has been an evolving scientific method.” In the arts, structures present themselves differently: sonata form, haiku, Pointillism, the novel, and the motion picture are all organizing structures. All can be richly elaborated. Some structures are checkers-like in allowing a limited range of elaboration; others are more chess-like in their vast potential. Historical bursts of creative activity are initiated by theories, styles, and techniques (including the development of instruments, such as the spectroscope in physics or the grand piano in music) that open rich research or aesthetic possibilities. The age of an organizing structure is important: structures are born, can have a vigorous youth, and then enter senescence, losing their potential to yield insights.

Transcendental goods. Accomplishment requires “a well-articulated vision of, and use of, the transcendental good relevant to that domain.” These values are the true, the good, and the beautiful—the first central to science, the last to art, and the second to both science and art. Without a coherent sense of these values to underpin them, science and art may rise to “the highest rungs of craft,” but they will not achieve exalted heights. A culture without a sense that science can reveal truth will never develop a stream of scientific accomplishment; a culture without a sense that beauty is real will never enjoy a great epoch of art, literature, or music: such artistic cultures are likely, as Murray puts it, to be “arid and ephemeral.”

Of these four conditions, Murray places the most weight on the last. Taking into statistical account (1) the tendency of everyone to overrate more recent accomplishments, and (2) the worldwide rise in skilled, educated populations that might produce great scientists and artists, Murray finds it inescapable that accomplishment has been slumping since around the beginning of the nineteenth century. In this respect, anyway, he resembles such gloomy cultural observers as Nietzsche, Spengler, and Toynbee. Though he does not accept their versions of laws of history, he does present us with one last grand generalization. This idea is so important to him that it preoccupies him up to the last page of the book’s main text (before the nearly 200 pages of tables, appendices, and notes). “Human beings,” he claims, “have been most magnificently productive and reached their highest cultural peaks in the times and places where humans have thought most deeply about their place in the universe and been most convinced they have one.” This for Murray helps explain the preponderance of achievement in the arts and sciences in Europe during the centuries when Christianity was regnant.

Truth, regarded as a goal guiding inquiry, may be considered a transcendental value for science. Art faces a different challenge. An ironic, detached culture in which artists have lost faith in ultimate values is not likely to rival the greatness of the past. In terms of freedom, wealth, creature comforts, and health, Murray says, we may be doing better than our forebears, but that does not mean that our artists will achieve more than “shiny, craftsmanlike entertainments.” He quotes Gibbon’s observation that even at the apex of their empire, the Romans, who looked back at the achievements of old Greece much as we do those of old Europe, were “a race of pygmies.”

Religion, Murray argues, “is indispensable for igniting great accomplishment in the arts.” Religious believers and philosophers of a traditionally idealist stamp may find comfort in this, especially since it comes from an avowed agnostic such as Murray. I am personally not convinced. In particular, it seems to me that Murray seriously underestimates the role of organizing structures in creating conditions for high achievement.

Murray is right to stress the importance of meaning it—of commitment in the arts. He tells of the stonemasons who sculpted gargoyles on Gothic cathedrals. They worked with passionate devotion, even when their handiwork would be invisible from the ground: God would see it. I discovered a similar aesthetic psychology in my own fieldwork in New Guinea, where serious artists view a carving created for a dead ancestor differently from one knocked off for tourists. Much of our own art and entertainment is shallow and flashy, made neither for God nor ancestors, but for a market.

But accepting this does not mean that transcendental values are necessary to explain high achievement in the arts. Consider the history of music. Murray makes it clear that the invention of polyphony led to more complex structures that, along with improved instrumentation, continued through the fifteenth century and into the sixteenth. The Himalayan heights of music were reached 150 years later, from the middle of the eighteenth century to the beginning of the twentieth. If there is progress in this period, it is the progress of artists who responded to the problems and potentialities inherent in musical tonality. New instruments, developing popular audiences, a sense of formal experimentation, and above all the maturing of tonality were the driving forces for the great flowering of music through the eighteenth and nineteenth centuries. It was, in other words, the birth, flourishing, and exhaustion of organizing structures, not transcendental values, that provided the most important motor of musical development.

Murray argues that the greatest artists have “a vision of perfection” stirring them to greater achievements. This is true for some artists. But it slights the role of solving formal artistic problems—an important motive force even in so spiritually committed an artist as Bach. Think of Impressionism in painting; think of the rise of the novel from Cervantes and Richardson to Jane Austen. And there is nothing, by the way, nihilistic in the artistry of Jane Austen.

“Realized capacities are pleasing not only when they are exercised,” Murray observes, “but also when they are seen to be realized.” Right he is. We take pleasure in watching an athlete break a record, hearing a soprano in full flight, or reading a philosopher of depth and insight. Human accomplishment is the ultimate spectator sport. Apply as much historical analysis to it as we wish, and we’ll not unlock all its mysteries. The continuous capacity of genius to surpass understanding remains a human constant.

Denis Dutton teaches philosophy at the University of Canterbury, New Zealand. His website is www.denisdutton.com.




Notes

1. Human Accomplishment: The Pursuit of Excellence in the Arts and Sciences, 800 BC to 1950, by Charles Murray; HarperCollins, 668 pages, $29.95.

© 2004 The New Criterion

Sunday, February 15, 2004

Curb MY Enthusiasm

Larry David—the model for George Costanza on "Seinfeld"—appears on my favorite HBO comedy series, "Curb Your Enthusiasm." Larry David—as himself—has created another show about nothing. Larry David is small-minded, self-absorbed, and capable of mean-spiritedness. My kind of guy. On a recent show, he got into a fight with a Russian immigrant during the eulogy for the immigrant's beloved uncle. The immigrant's wife was serving as an interpreter and was repeating the remarks to her husband in Russian. Larry David—sitting at the next table—asked the woman to "hold it down," and when the Russian immigrant learned what David had said, the fight was on. In this op-ed piece for the NYTimes, Larry David thanks W for restoring David's pride in serving in the Army Reserve during Vietnam. If this is (fair & balanced) mockery, so be it.



[x NYTimes]
My War
By LARRY DAVID

LOS ANGELES

I couldn't be happier that President Bush has stood up for having served in the National Guard, because I can finally put an end to all those who questioned my motives for enlisting in the Army Reserve at the height of the Vietnam War. I can't tell you how many people thought I had signed up just to avoid going to Vietnam. Nothing could be further from the truth. If anything, I was itching to go over there. I was just out of college and, let's face it, you can't buy that kind of adventure. More important, I wanted to do my part in saving that tiny country from the scourge of Communism. We had to draw the line somewhere, and if not me, then who?

But I also knew that our country was being torn asunder by opposition to the war. Who would be here to defend the homeland against civil unrest? Or what if some national emergency should arise? We needed well-trained men at the ready to deal with any situation. It began to dawn on me that perhaps my country needed me more at home than overseas. Sure, being a reservist wasn't as glamorous, but I was the one who had to look at myself in the mirror.

Even though the National Guard and Army Reserve see combat today, it rankles me that people assume it was some kind of waltz in the park back then. If only. Once a month, for an entire weekend — I'm talking eight hours Saturday and Sunday — we would meet in a dank, cold airplane hangar. The temperature in that hangar would sometimes get down to 40 degrees, and very often I had to put on long underwear, which was so restrictive I suffered from an acute vascular disorder for days afterward. Our captain was a strict disciplinarian who wouldn't think twice about not letting us wear sneakers or breaking up a poker game if he was in ill humor. Once, they took us into the woods and dropped us off with nothing but compasses and our wits. One wrong move and I could've wound up on Queens Boulevard. Fortunately, I had the presence of mind to find my way out of there and back to the hangar. Some of my buddies did not fare as well and had to call their parents to come and get them.

Then in the summer we would go away to camp for two weeks. It felt more like three. I wondered if I'd ever see my parakeet again. We slept on cots and ate in the International House of Pancakes. I learned the first night that IHOP's not the place to order fish. When the two weeks were up, I came home a changed man. I would often burst into tears for no apparent reason and suffered recurring nightmares about drowning in blueberry syrup. If I hadn't been so strapped for cash, I would've sought the aid of a psychiatrist.

In those days, reserve duty lasted for six years, which, I might add, was three times as long as service in the regular army, although to be perfectly honest, I was unable to fulfill my entire obligation because I was taking acting classes and they said I could skip my last year. I'll always be eternally grateful to the Pentagon for allowing me to pursue my dreams.

Still, after all this time, whenever I've mentioned my service in the Reserve during Vietnam, it's been met with sneers and derision. But now, thanks to President Bush, I can stand up proudly alongside him and all the other guys who guarded the home front. Finally, we no longer have to be embarrassed about our contribution during those very trying years.

Larry David, who served in the Army Reserve in the 1970's, appears in the HBO series "Curb Your Enthusiasm." David also was a writer for "Saturday Night Live" and the co-creator of "Seinfeld."

Copyright © 2004 The New York Times Company

Why I Am Voting For Dave Barry In 2004

It doesn't get any better. Another campaign piece by my candidate for President of the United States: Dave Barry. And Candidate Barry supplies a lagniappe: lyrics from my favorite song: My Way. Barry also includes Gandhi and Moses in his pantheon of great Americans (Lincoln, Kennedy, and King) and finishes with Perry Como. There are zillions of Protestant Rightists who would not object to the inclusion of Moses, but might choke on Gandhi. Nonetheless, Dave Barry has a great ear for the U.S. street. To wit: [Barry promises to govern according to] old country expressions that express the homespun wisdom acquired by rural people over years of drinkin' contaminated groundwater, such as: "Don't light a match 'til you know which end of the dog is barkin'.'' If this is (fair & balanced) political wisdom, so be it.



[x Miami Herald]
I'll do it yooooour wayyyyyy
by
DAVE BARRY

My fellow and gal Americans:

For the past few months, as I have traveled around this great nation talking about my campaign for president, the one question I have heard most often from the voters, in these troubled times, is: "President of what?''

Ha ha! Such kidders, those voters! But seriously: According to my team of policy advisors, it is now 2004, which means this November the American people will go into the voting booth and cast ballots for the leader of our nation, except in Florida, where they will become confused and attempt to produce urine samples.

But that is the imperfect nature of our political system. As the late Winston Churchill once said: "Democracy is the . . . the . . . (WHAM).'' Winston was on his 17th glass of gin when he said this, and would have broken his nose had he not landed face-first on a member of the British royal family, who, fortunately, was lying on the floor at the time.

Yes, Winston Churchill, like democracy itself, was not perfect. Neither was Abraham Lincoln, John F. Kennedy, Martin Luther King, Gandhi, Moses or the late Perry Como. And like these great Americans, I am not perfect, either. To quote the classic song My Way, which I think we can all agree, as Americans, has some of the worst lyrics ever written: "Regrets, I've had a few. But then again, too few to mention.''

Yes, I have made mistakes. But who has not? Are you perfect? Can you look yourself in the eye and honestly say: ''I have never, while high on crack, driven a bank-robbery getaway car into an elementary school?'' So if my opponents wish to dredge up that unfortunate incident from my past, I say to them: "Fine, go ahead, but I do not believe the American voters are so petty and vindictive as to punish a candidate for something that happened nearly six weeks ago.''

I say this because, unlike my opponents -- with their image consultants, their pollsters, their all-night sex orgies with the cast of Celebrity Mole Yucatan -- I trust you, the American people. I am not some professional politician in a silk suit who has never worked with his hands. I work with my hands! I am typing with my hands right now! I've tried working with my feet, but it comes out Welsh, as follows: ''Wel, dyma i chi ddefaid da!'' ("My goodness, what magnificent sheep!'')

Yes, voters, I trust you, because I am one of you. I even talk like you. For example, when I'm campaignin' in the South, I leave the 'g''s off the ends of words, and I use old country expressions that express the homespun wisdom acquired by rural people over years of drinkin' contaminated groundwater, such as: 'Don't light a match 'til you know which end of the dog is barkin.' '' As your president, I will govern the nation, or at least the South, in accordance with those words, whatever they may mean.

Voters, I have the same values, morals, religious beliefs, ethnic background and number of children as you. We even have the same blood type! If I am elected president, and you ever need blood, or an organ, you just come to the White House, and I will immediately hang up the Hot Line phone, and, bam, I will give you a kidney, lung, pancreas, liver segment, whatever you need, no questions asked. Name me one other candidate, besides Dennis Kucinich, who has made that promise.

Of course this is not enough for the so-called ''news media,'' which as we know is dominated by left wingers; or, if you prefer, right wingers. The point is, they are wingers, and they are always nosing around, asking questions, trying to make me reveal intimate details about my personal life, such as which party do I belong to, and do I have a domestic or foreign policy. Well you can call me a man of deep moral principles if you want, but I happen to believe that even a presidential candidate is entitled to a ''zone of privacy'' covering his political beliefs, criminal record, recreational use of household chemicals and Internet purchases of inflatable sheep.

Because in the end, I am a man, just like you, unless you are a woman, in which case, so am I. And in the words of the great Canadian-American songwriter Mr. Paul Anka: For what is a man, what has he got? If not himself, then he has naught.

© 2004 The Miami Herald and wire service sources. All Rights Reserved.


Friday, February 13, 2004

McLawsuits?

You deserve a break today. Have it your way. Finger lickin' good. Think outside the bun. Eat mor chikin. Head for the border. It's better here. What are you eating today? When was the last time you were interested in how many hamburgers McDonald's has sold? If this is (fair & balanced) nutrition, so be it.



[x Policy Review]
Burgers, Fries, and Lawyers
By Todd G. Buchholz

A scene: The overweight baseball fan jumps to his feet in the bleachers of Wrigley Field, screaming for the Chicago Cubs to hold onto their 3-2 lead in the top of the ninth inning. He squeezes a Cubs pennant in his left hand while shoving a mustard-smeared hot dog into his mouth with the right. The Dodgers have a runner on first, who is sneaking a big lead off the base. The Cubs’ pitcher has thrown three balls and two strikes to the batter, a notorious power hitter. The obese fan holds his breath while the pitcher winds up and fires a blazing fastball. “Crack!” The ball flies over the fan’s head into the bleachers for a game-winning home run. The fan slumps to his bleacher seat and has a heart attack.

Whom should the fan sue? (a) The Cubs for breaking his heart? (b) The hot dog company for making a fatty food? (c) The hot dog vendor for selling him a fatty food? (d) All of the above

A few years ago these questions might have seemed preposterous. But now scenes better suited for the absurd stories of Kafka snake their way into serious courtroom encounters. While no federal court has yet heard a case on behalf of sulking baseball fans, last year the U.S. District Court for the Southern District of New York responded to a complaint filed against McDonald’s by a class of obese customers, alleging among other things that the company acted negligently in selling foods that were high in cholesterol, fat, salt, and sugar. In the past 10 years we have seen an outburst of class action lawsuits alleging harm to buyers. With classes numbering in the thousands, these suits may bring great riches to tort lawyers, even if they provide little relief to the plaintiffs. The sheer size of the claims and the number of claimants often intimidate defending firms, which fear that their reputations will be tarnished in the media and their stock prices will be punished — not because of the merits but from the ensuing publicity. In his opinion in the McDonald’s case, Judge Robert W. Sweet suggested that the McDonald’s suit could “spawn thousands of similar ‘McLawsuits’ against restaurants.” Recent books with titles like Fat Land and Fast Food Nation promote the view that fast food firms are harming our health and turning us into a people who are forced to shop in the “big and tall” section of the clothing stores. The Wall Street Journal recently reported that “big and tall” has become a $6 billion business in menswear, “representing more than a 10 percent share of the total men’s market.”

But before the legal attack on fast food gets too far along, it would be useful to look at the facts behind fast food and fat America and to ask whether the courtroom is really the place to determine what and where people should eat.

Why is fast food under attack?


Fast food restaurants have exploded in popularity since World War II. More cars, more suburbs, and more roads have made roadside eating more convenient. During the 1950s, drive-through and drive-in hamburger, ice cream, and pizza joints catered to a mobile population. McDonald’s, which specialized in roadside restaurants, eclipsed White Castle hamburger stands in the 1960s because the latter had focused more on urban walk-up customers. The McDonald’s road signs in the early 1960s boasted of serving 1 million hamburgers; now McDonald’s claims to have sold over 99 billion. The “zeros” in 100 billion will not fit on the firm’s tote-board signs when the one-hundred-billionth burger is sold.

And yet, despite the popularity of such firms as McDonald’s, Wendy’s, Burger King, Pizza Hut, Taco Bell, and Subway — at which American consumers voluntarily spend over $100 billion annually — it has become fashionable to denounce these restaurants for a variety of sins: “They make people fat.” “They hypnotize the kids.” “They bribe the kids with toys.” “They destroy our taste for more sophisticated foods.” These condemnations often come from highbrow observers who claim that fast food customers are too ignorant or too blinded to understand what they are putting in their own mouths. The onslaught of criticism is not limited to the food. Animal rights activists condemn fast food outlets for animal cruelty. Environmentalists allege that fast food produces too much “McLitter.” Orthodox organic food fans accuse fast food firms of using genetically modified ingredients, which they call “frankenfoods.” In Europe, anti-globalization protestors allege that fast food homogenizes culture and spreads capitalism far and wide.

With the fury directed at fast food firms, it is no surprise that tort lawyers have jumped into the fray. Tort lawyers around the country settled the $246 billion tobacco case in 1998. Those who have not retired on their stake from that settlement are wondering whether fast food could be the “next tobacco,” along with HMOs and lead paint. After all, the surgeon general estimates that obesity creates about $115 billion in annual health care costs. There are differences, of course. No one, so far, has shown that cheeseburgers are chemically addictive. Furthermore, most fast food restaurants freely distribute their nutritional content and offer a variety of meals, some high in fat, some not. Nor is it clear that the average fast food meal is significantly less nutritious than the average restaurant meal, or even the average home meal. The iconic 1943 Norman Rockwell Thanksgiving painting (“Freedom from Want”) highlights a plump turkey, which is high in protein. But surely the proud hostess has also prepared gravy, stuffing, and a rich pie for dessert — which, though undoubtedly tasty, would not win a round of applause from nutritionists.

The key similarity between the tobacco lawsuits and claims against the fast food industry is this: Both industries have deep pockets and millions of customers who could join as potential plaintiffs. Therefore, lawyers have enormous incentives to squeeze food complaints into the nation’s courtrooms. They will not disappoint in their eagerness to pursue this course.

Changing diets, misplaced blame


If you believe the old saying “you are what you eat,” human beings are not what they used to be. Before jumping into today’s fashionable condemnation of calories, let us spend a moment on historical perspective and at least admit that for mankind’s first couple hundred thousand years of existence, the basic human problem was how to get enough calories and micronutrients. Forget the caveman era: As recently as 100 years ago, most people were not receiving adequate nutrition. Malnutrition was rampant, stunting growth, hindering central nervous systems, and making people more susceptible to disease. Often, poor people begged on the streets because they did not have the sheer physical energy to work at a job, even if work was available to them. By modern standards even affluent people a century ago were too small, too thin, and too feeble, as economist Robert W. Fogel has noted. A century ago, an American with some spare time and spare change was more likely to sign up for a weight-gaining class than a weight-loss program.

Just as life expectancy in the United States rose almost steadily from about 47 years in 1900 to 80 years today, so too has the “Body Mass Index,” or BMI, a measure of weight relative to height (weight in kilograms divided by the square of height in meters).1 In the late nineteenth century, most people died too soon and were, simply put, too skinny. The two are related, of course. For most of human history only the wealthy were plump; paintings of patrons by Peter Paul Rubens illustrated that relationship. In ancient times figurines of Venus (carved thousands of years ago) displayed chunky thighs, big bellies, and BMIs far above today’s obesity levels. Likewise, skinny people looked suspicious to the ancients. (Remember that the backstabbing Cassius had a “lean and hungry look.”) The rise in the BMI from the nineteenth century to about 1960 should be counted as one of the great social and medical victories of modern times. In a sense, it created a more equal social status, as well as a more equal physical stature.

So what went wrong more recently? It is not the case that the average BMI has suddenly accelerated. In fact, the BMI has been rising fairly steadily for the past 120 years. Nonetheless, since the 1960s the higher BMI scores have surpassed the optimal zone of about 20 to 25.2 No doubt, a more sedentary lifestyle adds to this concern. (In contrast, the healthy rise in BMIs during the early 1900s might be attributed to gaining more muscle, which weighs more than fat.) The post-1960s rise in BMI scores is similar to a tree that grows 12 inches per year but in its tenth year starts casting an unwanted shadow on your patio. In the case of people, more mass from fat has diminishing returns, cutting down their life spans and raising the risk for diabetes, heart disease, gallbladder disease, and even cancer. Over half of American adults are overweight, and nearly a quarter actually qualify as obese, according to the National Institutes of Health.
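For readers who want the formula behind these numbers: BMI is weight in kilograms divided by the square of height in meters, and the standard NIH cutoffs put overweight at a BMI of 25 and obesity at 30. A minimal sketch in Python (the sample person is hypothetical):

    # BMI = weight (kg) / height (m)^2
    def bmi(weight_kg, height_m):
        return weight_kg / height_m ** 2

    def category(b):
        # Standard NIH cutoffs: 25 and over is overweight, 30 and over is
        # obese; roughly 20 to 25 is the "optimal zone" discussed above.
        if b >= 30:
            return "obese"
        if b >= 25:
            return "overweight"
        if b >= 20:
            return "optimal zone"
        return "underweight"

    b = bmi(96.0, 1.78)  # a hypothetical 96 kg (212 lb) person, about 5'10"
    print(round(b, 1), category(b))  # prints: 30.3 obese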

Should we chiefly blame fast food for BMIs over 25? According to the caricature described by lawyers suing fast food companies, poor, ill-educated people are duped by duplicitous restaurant franchises into biting into greasy hamburgers and french fries. The data, however, tell us that this theory is wrong. If the “blame fast food” hypothesis were correct, we would see a faster pace of BMI growth among poorly educated people, who might not be able to read or understand nutritional labels. In fact, college-educated people — not the poorly educated — accounted for the most rapid growth in BMI scores between the 1970s and the 1990s. (Poorly educated people still have a higher overall incidence of obesity.) The percentage of obese college-educated women nearly tripled between the early 1970s and the early 1990s. In comparison, the proportion of obese women without high school degrees rose by only 58 percent. Among men, the results were similar. Obesity among those without high school degrees climbed by about 53 percent, but obesity among college graduates jumped by 163 percent.3 If the “blame fast food” hypothesis made sense, these data would be flipped upside down.

Of course, we cannot deny that people are eating more and getting bigger, but that does not prove that fast food franchises are the culprit. On average, Americans are eating about 200 calories more each day than they did in the 1970s. An additional 200 calories can be guzzled in a glass of milk or a soda or gobbled in a bowl of cereal, for example. Fast food’s critics eagerly pounce and allege that the additional calories come from super-sized meals of pizza, burgers, or burritos. It is true that between the 1970s and 1990s, daily fast food intake grew from an average of 60 calories to 200 calories. But simply quoting these data misleads. Though Americans have been consuming somewhat more fast food at mealtime, they have reduced their home consumption at mealtime. Americans have cut back their home meals by about 228 calories for men and 177 for women, offsetting the rise in fast food calories.4 In total, mealtime calories have not budged much, and mealtimes are when consumers generally visit fast food restaurants.

So where are the 200 additional calories coming from? The U.S. Department of Agriculture (USDA) has compiled the “Continuing Survey of Food Intakes by Individuals,” which collects information on where a food was purchased, how it was prepared, and where it was eaten, in addition to demographic information such as race, income, age, and sex. The survey shows that the answer is as close as the nearest salty treat. Americans are not eating bigger breakfasts, lunches, or dinners — but they are noshing and nibbling like never before. Between the 1970s and the 1990s, men and women essentially doubled the calories consumed between meals (by between 160 and 240 calories). In 1987-88, Americans typically snacked less than once a day; by 1994 they were snacking 1.6 times per day. But surely, opponents of fast food would argue, those cookies and pre-wrapped apple pies at McDonald’s must account for the added calories. Again the data fail to make their case. Over the past two decades, women ate only about six more snack calories at fast food restaurants, while men ate eight more. That is roughly equal to one Cheez-It cracker or a few raisins. Where do Americans eat their between-meal calories? Mostly at home. Kitchen cabinets can be deadly to diets. And in a fairly recent development, supermarket shoppers are pulling goodies off of store shelves and ripping into them at the stores before they even drive home. Consumers eat two to three times more goodies inside stores than at fast food restaurants.
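Setting the article's round numbers for men side by side makes the pattern plain. A back-of-the-envelope ledger in Python (these are survey estimates, so the pieces need not sum exactly to the 200-calorie figure):

    # Rough daily-calorie ledger for men, 1970s to 1990s, using the round
    # numbers quoted above. Survey estimates, not exact accounting.
    changes = {
        "fast food at meals":    +140,  # from about 60 to 200 calories
        "home cooking at meals": -228,  # the offsetting cutback
        "between-meal snacking": +240,  # roughly doubled
    }
    for source, delta in changes.items():
        print(f"{source:>22}: {delta:+4d} cal/day")
    print(f"{'net':>22}: {sum(changes.values()):+4d} cal/day")  # +152

The net lands in the neighborhood of the 200 extra daily calories, and it is the snacking line, not the fast food line, that does most of the work.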

Why are people eating more and growing larger? For one thing, food is cheaper. From an historical point of view that is a very good thing. A smaller portion of today’s family budget goes to food than at any time during the twentieth century. In 1929, families spent 23.5 percent of their incomes on food. In 1961, they spent 17 percent. By 2001, American families could spend just 10 percent of their incomes on food, according to the USDA’s Economic Research Service. The lower relative cost of food made it easier, of course, for people to consume more.

Since the mid-1980s we have seen an interesting change in restaurant pricing, which has made restaurants more attractive to consumers. Compared to supermarket prices, restaurant prices have actually fallen since 1986. Whereas a restaurant meal was 1.82 times the cost of a store-bought meal in 1986, by 2001 a restaurant meal cost just 1.73 times as much. Higher incomes and lower relative restaurant prices have induced people to eat more, and to eat more away from home.

Despite the attraction of restaurant eating and the proliferation of sit-down chain restaurants such as the Olive Garden, TGI Friday’s, P.F. Chang’s, and others, Americans still consume about two-thirds of their calories at home. Critics of fast food spend little time comparing fast food meals to meals eaten at home, at schools, or at sit-down restaurants.

The nature of the American workplace may also be contributing to higher caloric intake. Whether people dine while sitting down at a table or while standing at a fast food counter, at the workplace they are literally sitting down on the job more than they have during prior eras. More sedentary desk jobs probably contribute to wider bottoms. Consider two middle-income jobs, one in 1953 and one in 2003. In 1953, a dockworker lifts 50 boxes off of a mini-crane and places them on a handtruck, which he pulls to a warehouse. In 2003, a person earning a similar income would be sitting in front of a computer, inputting data and matching orders with deliveries. What’s the key difference? Until recently, employers paid employees to exert energy and burn calories. In contrast, employers now pay workers to stay in their seats. For many, the most vigorous exercise comes from tearing off a sheet of paper from a printer or walking to the refrigerator. Furthermore, the decline in factory work — with its fixed lunch and coffee break schedule — enables people to eat more often. Less factory work means less supervision by foremen. According to Bureau of Labor Statistics data, manufacturing employment fell from about 24.4 percent of civilian employment in 1970 to merely 13 percent in 2000. A woman who spends her career sitting at a desk may “end up with as much as 3.3 units of BMI more than someone with a highly active job,” explain economists Darius Lakdawalla and Tomas Philipson. And a person telecommuting from home may be sitting even closer to the refrigerator or cupboard. In 1970 the term “telecommuting” did not even exist. By 2000, however, with advances in computers and remote access technology, approximately 12 percent of the workforce worked from home at least part of the week. This figure does not include over 25 million home-based businesses in the United States. Casual observation implies that many telecommuters take breaks from their home work at coffee shops and other sellers of baked goods.

Finally, some analysts argue that over the past three decades the national anti-smoking campaign has driven up cigarette prices and led smokers to switch from nicotine to calories.5

Fast food vs. alternatives


Very few defenders of fast food would tell moms and dads to throw out the home-cooked meal and instead eat 21 meals a week at White Castle. But it is a mistake to stereotype fast food as simply a cheeseburger and a large fries. Fast food restaurants have vastly expanded their menus for a variety of reasons including health concerns and demographic shifts. The increasing role of Hispanic Americans in determining national food tastes has inspired many fast food franchises to offer tacos, burritos, and salsa salads. Wendy’s, traditionally known for its square-shaped hamburgers, offers a low-fat chili dish that the Minnesota attorney general’s office recommended as a “healthier choice” in its fast food guide. McDonald’s has continuously revamped its menu in recent years. On March 10, 2003, the company unveiled a new line of “premium salads” that feature Newman’s Own All-Natural Dressings. In its publicity blitz, McDonald’s facetiously asked, “What’s Next? Wine Tasting?” Meanwhile Burger King features broiled chicken teriyaki in addition to its traditional fare. Judge Sweet noted that the Subway sandwich chain, which boasts of healthy choices, hired a spokesman who apparently lost 230 pounds while on the “Subway Diet.” In fact, fast food meals today derive fewer calories from fat than they did in the 1970s. Consumers can customize their fast food meals, too. Simply by asking for “no mayo,” they may cut down fat calories by an enormous proportion. It is worth pointing out that fast food firms introduced these alternative meals in response to changing consumer tastes, not in reply to dubious lawsuits. During the 1990s, McDonald’s and Taco Bell invested millions of dollars trying to develop low-fat, commercially viable selections such as the McLean Deluxe hamburger and Taco Bell’s Border Lights. Burger King adopted its “Have It Your Way” slogan several decades ago.

While plaintiffs’ lawyers vigorously denounce the nutritional content of fast food, they tend to ignore the nutritional content of alternatives. Home cooking, of course, has a nice ring to it, and it is hard to criticize the idea of a traditional meal cooked by mom or dad. But if we put nostalgia aside for a moment, we can see that the typical American meal of 25 years ago might win taste contests but few prizes from today’s nutritionists. Meat loaf, fried chicken, butter-whipped potatoes, and a tall glass of whole milk may have kept us warm on a cold winter evening, but such a diet would surely fail a modern test for healthy living. And let’s not even discuss a crusty apple pie or bread pudding for dessert. Yesterday’s comfort food gives today’s dietitians indigestion. It is no surprise, then, that today’s fast food derives a smaller percentage of calories from fat than a typical home meal from 1977-78. In fact, even in the 1970s, fast food meals had almost the same fat/calorie ratio as home cooking at that time. By this measure of fat/calories, fast food in the 1970s looked healthier than restaurant cooking, according to USDA figures. Therefore, the caricature of fast food restaurants as devilish places for nutrition makes little historical sense.

Now, it is true that home cooking has changed since the 1970s and that it has made even more progress than fast food at reducing fat calories. Very few families these days feast on pork rinds and pecan pie, a development that flatters our current nutritional tables. How do fast food meals compare to school meals? Despite the legions of concerned dietitians and PTA leaders, school meals do not look considerably better on the test of fat. While schools provide slightly fewer fat calories than fast food, they deliver more saturated fat, the more dangerous subset of fats. The comparison to sit-down restaurants is similar, with no clear advantage to either fast food or sit-down restaurants. In fact, the Chou study cited above finds that a proliferation of full-service restaurants would raise obesity levels more than a proliferation of fast food establishments. Of course, fast food firms have made it easier for patrons to learn about nutritional content than fancier kinds of food outlets. Few patrons of the fabled 21 Club in New York would know that its $26 hamburger is made with rendered duck fat. Should superchef Daniel Boulud worry about lawsuits for daring to sell a $29 hamburger at db Bistro Moderne that is crafted from ground sirloin and braised short ribs, stuffed with foie gras, and topped with shaved black truffles?

In sum, the facts show that obese plaintiffs might just as well walk up to a fast food counter as tuck a napkin under their chins and dine at a chic restaurant or at a school.

Fast food’s detractors also like to criticize portion sizes. True, fast-food restaurants have been offering super-sized sandwiches, drinks, and french fries. But have these critics been to a movie theater lately, where popcorn containers look like bushel baskets? Or to fancy restaurants featuring all-you-can-eat Sunday buffets? A study in the Journal of the American Medical Association (January 22, 2003) cited the “most surprising result [as] the large portion-size increases for food consumed at home — a shift that indicates marked changes in eating behavior in general.” People eat bigger portions of hamburgers, fries, and Mexican food at their own kitchen tables than when they are sitting on a fast food restaurant stool. In the study, “Patterns and Trends in Food Portion Sizes, 1977-1998,” researchers Samara J. Nielsen and Barry M. Popkin found that “the average home-cooked hamburger now weighs in at about 8 ounces, versus perhaps 5.5 ounces in full-service restaurants and a little over 7 ounces at fast-food outlets.” When the USDA surveyed portion sizes and compared them to official U.S. government portions, they did find that fast food hamburgers exceeded official estimates by 112 percent. Yet they also found that Americans were eating pasta portions that surpass official measures by 333 percent and muffins that rise to 480 percent of the official sizes.6 If we are turning into a jumbo people, we are a jumbo people everywhere we eat, not just where the tort lawyers target defendants.

The cost of protein


As discussed above, obtaining enough protein and calories to fuel the human body has been a constant struggle throughout history. A time traveler from almost any other era would be befuddled by our current obsession with losing weight, which has spurred America’s $50 billion diet industry, $12 billion in annual health club revenues, and the 100,000 radical gastric bypass surgeries last year. Nowadays in the United States food comes pretty cheap, and fast food has played a role in giving people access to inexpensive provisions.

There are many measures of nutritional value. In an earlier time, we might simply measure calories per dollar. Yet because critics accuse fast food of selling “empty” calories (that is, calories composed of fats and sugars), I have developed a more specific benchmark, namely, cost per gram of protein. Protein is the building block for muscles, and animal protein foods, including meat, poultry, fish, dairy products, and eggs, contain the nine essential amino acids that cannot be synthesized in the body. Using the ratio of dollar/protein gram seems reasonable and, because it does not include fats and sugars, creates a tougher test for fast food than the dollar/calorie measure.

Comparing the cost of protein obtained at fast food restaurants to protein obtained at supermarkets, one finds that fast food restaurants provide reasonable value to the consumer, considering the cost of raw materials and the cost of time in preparing meals. In a survey of fast food chains and supermarkets in five southern California communities (where the fast food chains and the supermarkets were located within the same towns), I compared the cost of purchasing a “marquee” hamburger, a grilled chicken sandwich, a fish sandwich, a sliced turkey sandwich, and a green salad. The results suggest that in some cases consumers can actually purchase a high protein meal at a fast food chain for less than the cost of buying the separate groceries at a supermarket and preparing the sandwich themselves. The comparisons also understate the cost of supermarket purchases, for two principal reasons: First, supermarket prices generally reflect a cost savings for purchasing a larger quantity. You can order one fish fillet from Burger King, but it is nearly impossible to buy a single frozen fish fillet in your supermarket. Second, supermarket prices do not reflect the time and cost to the shopper of preparing the meal at home. Nor have I included the extra ingredients such as pickles, relish, onion, mustard, and so on. There is little doubt that for a worker earning the average hourly rate (which is $15, according to the Bureau of Labor Statistics), preparing a cooked sandwich would cost far more in materials and time than simply purchasing it from a fast food restaurant. Even for a minimum wage worker earning $5.15 per hour, a fast food sandwich is probably much cheaper than spending 30 minutes preparing and grilling a hamburger, fish fillet, or chicken breast.
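The benchmark itself is a single division. Here is a minimal sketch in Python; the dollar prices are my reconstruction implied by the cents-per-gram figures reported below (for example, about 7 cents times 25 grams of protein implies a $1.75 burger), not numbers taken from the survey itself.

    # Cost per gram of protein, the benchmark used in this survey.
    def cents_per_gram(price_dollars, protein_grams):
        return 100 * price_dollars / protein_grams

    # Entries reconstructed from figures quoted in the text.
    items = [
        ("Whopper (25 g protein)",                1.75, 25),
        ("BK grilled chicken (35 g protein)",     3.68, 35),
        ("supermarket beef patty and bun (25 g)", 2.00, 25),
    ]
    for name, price, grams in items:
        print(f"{name}: {cents_per_gram(price, grams):.1f} cents per gram")
    # Whopper: 7.0; BK grilled chicken: 10.5; patty and bun: 8.0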

On average, a gram of hamburger protein found in a Burger King Whopper or McDonald’s Big n’ Tasty costs about 7 cents. Each sandwich provides 25 grams of protein. During a recent national campaign, both of these restaurant chains slashed their prices, bringing the dollar/protein ratio down to just 3.8 cents. The supermarket survey shows that a gram of protein from a ground beef patty and bun costs about 8 cents (leaner beef would cost somewhat more, standard ground beef somewhat less). Again, the cost of supermarket beef does not include the cost of accompaniments such as lettuce and tomato, nor does it include any time or labor costs for preparing a sandwich yourself.

For fish fillets, the results were similar. A Burger King fish fillet provides protein at 7.8 cents per gram. Van de Kamp’s and Gorton’s frozen fish fillets cost 15 cents per gram.

The results for grilled chicken sandwiches display an advantage for supermarket buyers. A Burger King grilled chicken sandwich provides 35 grams of protein at 10.5 cents per gram. McDonald’s grilled chicken costs 13.9 cents per gram. Purchasing chicken breast fillets at a supermarket averages just 4.6 cents per gram of protein. Again, the comparison does not include the extra costs or time involved in creating a grilled chicken sandwich served with lettuce, tomato, and seasoning.

Sliced turkey also shows an advantage for supermarket shoppers. While a Subway turkey sandwich costs almost 24 cents per gram of protein, sliced Sara Lee turkey averages just over 10 cents per gram of protein. Once again, the Subway sandwich also includes lettuce, tomatoes, green peppers, onion, olives, pickles, and a choice of breads, as well as the convenience of someone else putting together the meal.

Salad greens are roughly similar in price at fast food restaurants and supermarkets. Because greens are not notable for their protein content, I have instead calculated the cost per ounce. A Burger King side salad costs just under 20 cents per ounce, compared with over 27 cents for a Fresh Express bag of prewashed “American Salad.”

In sum, in a number of cases fast food provides competitively priced food per gram of protein. For people who lack the time, kitchen space, or ability to purchase from grocery stores and cook at home, fast food can provide significant benefits. Furthermore, if consumers choose with some level of prudence from the fast food menus, they can eat fairly nutritious meals.

Chasing after nutrition


I remember my mother forcing us to eat beef liver every two months because it was iron-rich. I hated it and often sneaked bite-sized pieces under the table to our appreciative sheepdog. Nowadays, few people press cholesterol-laden liver on their families. For liver-hating kids everywhere, that represents a big step forward, almost as important as the Salk vaccine.

What has been more fickle than diet recommendations over the years, which continuously spark new fads? In the 1980s and early 1990s, “carb-loading” was hot, and steaming bowls of pasta shoved roast beef off the dinner table. Today, a plate of pasta scares those on the popular, low-carb Atkins diet, who are instructed to load up their breakfast plates with fried eggs, ham, and bacon while leaving toast off to the side. According to the Atkins approach, it is fine to bite into a greasy hamburger, but don’t dare chew on the bun. Desserts, too, have changed. During the 1960s and 1970s, parents maneuvered to keep chocolate away from children, fearing the high fat and sugar content, as well as a connection to acne. More recently we read that cocoa powder and dark chocolate may help delay the progression of cardiovascular disease.7 Chocolate contains a healthful nutrient known as a flavonoid that may slow the oxidation of “bad cholesterol” (LDL). So maybe we should not worry so much about a few pimples.

Surely, you might say, there are obvious national standards, such as the official U.S. Department of Agriculture food pyramid. Why not force fast food firms to serve meals that fit into the pyramid’s architecture? The pyramid tells us to eat at least six servings of grain (breads, pasta, etc.) each day, two servings of fruit, and only a little bit of fat or sweets. Sounds reasonable, no? Here is what the controversial head of the Harvard School of Public Health says about the pyramid: “Some people are likely to die from following the USDA pyramid because they will be eliminating healthy fats, such as liquid vegetable oils, that actually reduce the risk of heart disease.” Whom should Wendy’s listen to? The U.S. government or Harvard? Is this a fair choice for a restaurant?

During the 1980s, nutrition advocates lobbied McDonald’s to switch its french frying oil from partially beef-derived to vegetable-based oil. Then, after McDonald’s switched, many of the same advocates assailed McDonald’s for using trans-fatty acids — a result of using the vegetable oils! Now McDonald’s is introducing new vegetable oils that reduce the trans-fatty acids.

Here again, fast food presents a very different case from tobacco, even though plaintiffs’ counsels are eager to deploy the same lucrative cookie-cutter approach to litigation. Fast food meals, though tasty to many patrons, are not chemically addictive. One seldom hears of Subway or Wendy’s customers shaking with withdrawal symptoms when they give up a turkey sandwich or a frozen fish fillet. Moreover, no one has yet claimed that he became sick or cancerous, or even choked or coughed, from “second-hand” eating. Swallowing food is very much an individual act.

Cigarette research has been rather consistent for decades in pointing to the physical effects of smoking. In contrast, diet advice and research have been inconsistent and often contradictory. As a result, fast food firms have been reacting to the changing tastes and nutritional expectations of customers. As stated above, in the 1970s there was very little difference between the fat content of home-cooked meals and that of fast food meals. Fast food chains did not start out by conspiring to sell diabolical menus. Over the past 20 years, homes and fast food restaurants have pursued lower-fat menus (though homes have admittedly moved more quickly). This is to be expected since commercial restaurants would tend to follow the tastes of patrons. Today, nearly every fast food restaurant offers non-fried poultry and low-fat salads. Furthermore, within 20 seconds of inquiring, each of the fast food chains mentioned in this paper produced nutritional content charts. Should we expect or demand that fast food restaurants lead the march to better menus? How could they? What would they base it on? The federal government’s nutrition pyramid? The Harvard pyramid? The Atkins diet? Weight Watchers? Oprah’s personal plan? Clearly the best avenue is for fast food firms to provide choices and provide information so that customers can be informed, prudent, and as up-to-date as they like.

In April 2003, the Wall Street Journal carried the following headline: “Wendy’s Sees Green in Salad Offering: More Sophistication, Ethnic Flavors Appeal to Women.” Salads had leapt to more than 10 percent of Wendy’s total sales, from 3 percent a year earlier. In October 2002, Bloomberg News announced that “Wendy’s 3rd-Qtr Net Rises 16% as Salads Boost Sales.” The story explained how Wendy’s new “Garden Sensations” salad strategy was drawing customers from sit-down restaurants, while also posing new challenges to McDonald’s and Burger King, “as consumers seek healthier choices.” The story then described how Wendy’s more healthful strategy spurred on “rival Burger King [which] is trying to gain market share by introducing new items that compete directly with Wendy’s, including a baked potato and chili.” Is this a broken system that desperately cries for judicial action? No, it is a super-competitive market where stores jockey for position, trying to please customers and their changing tastes for a more healthful lunch.

Faced with the conundrum of changing tastes and nutritional recommendations, Judge Sweet shrewdly took up the distinction between an inherently dangerous meal and a meal that may pose some legitimate risk, if only from over-consumption. The Restatement (Second) of Torts explained that “Ordinary sugar is a deadly poison to some diabetics” and that “Good whiskey is not unreasonably dangerous merely because it will make some people drunk, and is especially dangerous to alcoholics; but bad whiskey, containing a dangerous amount of fuel oil, is unreasonably dangerous.” These risks are not good reasons to outlaw good sugar or good whiskey. Fried fish may be oily, but that does not mean it is contaminated. Absent a truly compelling and sweeping health reason, we should not let lawsuits rob consumers of choices.

Judge Sweet recognized “that the dangers of over-consumption of . . . high-in-fat foods such as butter, are well-known. Thus any liability based on over-consumption is doomed if the consequences of such over-consumption are common knowledge. . . . Thus, in order to state a claim, the Complaint must allege either that the attributes of McDonald’s products are so extraordinarily unhealthy that they are outside the reasonable contemplation of the consuming public or that the products are so extraordinarily unhealthy as to be dangerous in their intended use. The Complaint — which merely alleges that the foods contain high levels of cholesterol, fat, salt and sugar, and that the foods are therefore unhealthy — fails to reach this bar.” Judge Sweet also found, as I did in my survey, that McDonald’s willingly provides information on the nutritional content of its products.

What would the plaintiffs’ counsel want McDonald’s to do — other than pay out settlement sums? Should Judge Sweet have stopped McDonald’s from flipping burgers? What about diners at the 21 Club? Should they too be protected, or are the fast food lawsuits a patronizing tool to protect the poor and allegedly ill-educated from their own mouths? If the fear is over-consumption, should McDonald’s discriminate against plump people? Should a cheeseburger require a doctor’s prescription? Should fast food firms be required to punch holes in a meal ticket and refuse to serve those who have already filled their card? Surely some intermeddlers could devise a national BMI card, certified by a government nutritionist, that determines how many fat grams Burger King may sell to you. Of course, that number would have to be revised with each new issue of the Journal of the American Medical Association and after every meeting of the American Society for Clinical Nutrition.

Governing what we eat


The Food and Drug Administration, with its battalion of researchers, aided by thousands of university and private-sector scientists throughout the world, is constantly exploring, testing, and digging for scientific insight. A class action lawsuit, by contrast, would not be digging for scientific insight. Instead, plaintiffs’ lawyers would be digging into the pockets of franchise owners, employees, and shareholders in order to pull out gold. Moreover, the threat of such lawsuits can do no good for the employees, shareholders, or customers of fast food firms. When tort lawyers strut in front of cameras waving weighty complaints that are flimsy in facts, the media quickly follow the story. Nearly every major publication in the country carried stories about the McDonald’s obesity suit. If “McLawsuits” spread, we will see at least one, if not all, of the following three results: 1) lower wages for fast food employees; 2) lower stock prices for shareholders; 3) higher prices for consumers. Fast food restaurants hire and train hundreds of thousands of workers, attract investments from millions of middle-class citizens, and quench the hunger and thirst of tens of millions of satisfied patrons.

Let us be frank here. Depending on what you pile on it, a fast food burger may not enhance your health, and it may even hinder your ability to run a marathon — but it is very easy to find out how fatty that burger is. You do not need a lawyer by your side to pry open a brochure or to check the thousands of websites that will provide nutrition data. While it is unlikely that nutritionists will soon announce that super-sized double cheeseburgers will make you thin, society should not allow the latest fads or the most lucrative lawsuits to govern what we eat for lunch.




Notes

1. The BMI is calculated by dividing weight in kilograms by height in meters squared. A person five feet five inches tall, weighing 150 pounds, would have a BMI of 25. A taller person, say six feet tall, could weigh 184 pounds and have a BMI of 25 too.
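For readers who want to verify the arithmetic, here is a quick worked check of the two examples in this note, using the standard conversions 1 lb ≈ 0.4536 kg and 1 in = 0.0254 m:

\[
\mathrm{BMI} = \frac{\text{weight (kg)}}{\text{height (m)}^2}, \qquad
\frac{150 \times 0.4536}{(65 \times 0.0254)^2} = \frac{68.0}{(1.65)^2} \approx 25, \qquad
\frac{184 \times 0.4536}{(72 \times 0.0254)^2} = \frac{83.5}{(1.83)^2} \approx 25
\]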

2. Even though the BMI was not widely used until the 1990s, it is possible to construct historical BMIs based on known heights and weights. See Dora Costa and Richard Steckel, “Long Term Trends in Health, Welfare, and Economic Growth in the United States,” in Floud and Steckel, eds., Health and Welfare During Industrialization (University of Chicago Press, 1997). These numbers are not definitive for each individual, since a very muscular person may have a high BMI simply because muscle is denser than fat. Arnold Schwarzenegger and Sylvester Stallone, for example, might technically be obese if one looked only at their BMI ratings.

3. David M. Cutler and Edward L. Glaeser, “Why Have Americans Become More Obese?” (NBER Monograph, January 2003). This paper presents an intriguing hypothesis that technology has created obesity by making ready-to-eat foods more available.

4. Cutler and Glaeser present these data, derived from the USDA’s “Continuing Survey of Food Intakes by Individuals.”

5. See Shin-Yi Chou, Michael Grossman, and Henry Saffer, “An Economic Analysis of Adult Obesity: Results from the Behavioral Risk Factor Surveillance System” (National Bureau of Economic Research, July 2001).

6. See Lisa R. Young and Marion Nestle, “The Contribution of Expanding Portion Sizes to the US Obesity Epidemic,” American Journal of Public Health (February 2002).

7. Ying Wan et al., “Effect of Cocoa Powder and Dark Chocolate on LDL,” American Journal of Clinical Nutrition (November 2001).

Todd G. Buchholz, an economic adviser in the George H.W. Bush administration, is the author of Market Shock (HarperBusiness). This essay was funded by a grant from the U.S. Chamber of Commerce.

Copyright © 2004 The Policy Review