Thursday, September 30, 2004

Presidential Debate Primer

That LBJ was so slick. He avoided debating Barry Goldwater in 1964. I would wager that W would like to pull the same trick himself. I cannot imagine the gut-wrenching pressure that W and Kerry are feeling as zero-hour approaches. If this is (fair & balanced) anxiety, so be it.

[x History News Network]
Presidential Debates: HNN Scrapbook
By Alex Bosworth

Frontrunners

After the first televised debate in 1960, when Vice President Nixon saw his solid lead in the polls evaporate after debating the photogenic Kennedy, Press Secretary James Hagerty predicted, "you can bet your bottom dollar that no incumbent president will ever engage in any such debate or joint appearance in the future."

President Lyndon B. Johnson avoided debating Barry Goldwater in 1964 through a technicality. Section 315 of the Communications Act reserved equal air time for all candidates, not just those belonging to the Republican or Democratic parties. Johnson claimed that a debate would give him and Goldwater an unfair advantage over third-party candidates. (In 1960 Congress approved an exemption from the law.) Note: President Kennedy had considered debating Goldwater frequently during the 1964 election.

Having learned his lesson in 1960, Nixon used the Communications Act to avoid debates during the 1968 and 1972 elections. In 1975 the Federal Communications Commission decided that Section 315 did not apply to presidential debates.

In 1984 President Ronald Reagan agreed to debate Walter Mondale despite holding a massive lead in the polls. In the words of historian Alan Schroeder, Reagan's decision "shored up campaign debates as a permanent institution."

By 1992, the American public had come to expect debates. When President George H.W. Bush pulled out of a debate in Michigan, "Chicken George" protestors dressed in chicken suits began to arrive at Bush campaign events. The pressure forced Bush to agree to three debates with Clinton and Ross Perot.

In 1996 President Clinton, enjoying a sizable lead in the polls, scheduled his last debate with Bob Dole to occur during a televised baseball game. Advisor George Stephanopoulos commented: "We didn't want people watching the debates."

Appearing Presidential

In 1960 Kennedy inserted the requirement that both candidates stand in order to exploit the fact that Nixon had sprained his knee. Nixon shifted his weight during the debates, which made him look uncertain and nervous.

During the first debate of 1976, the sound system went out. Gerald Ford and Jimmy Carter stood rigidly at their lecterns for more than twenty minutes while technicians worked to restore audio.

Carter demanded a smaller lectern to stand behind in order to disguise his shorter stature. In return, President Ford was permitted to pick a background color that would mask the fact that the president was balding.

President Ford wanted to attach the presidential seal to his lectern for his debates with Carter. President Carter proposed the same thing during negotiations with Reagan four years later.

Michael Dukakis, six inches shorter than Vice President George H.W. Bush, arranged in 1988 to stand on a ramp that would raise him up to Bush's height.

George H.W. Bush preferred to stand on the right side of the stage to hide his receding hairline.

Format

The first town hall debate (in which undecided voters are allowed to ask the candidates questions) was held at the behest of Clinton in 1992. Clinton's advisors wanted to showcase the future president's people skills.

Debates are usually scheduled to last 90 minutes. The longest, the first Reagan-Mondale debate, ran 100 minutes. Reagan had originally wanted the 1984 debates to last just 60 minutes each.

The Commission on Presidential Debates invited Perot to the first three-way debate in 1992. The Commission decided that Perot did not have enough popular support to merit inclusion in the 1996 debates.

The most debates in any one year was four, in 1960.

The fewest debates occurred in 1980. President Carter boycotted one debate and then engaged in a disastrous debate with Reagan one week before the election.

In 1984, Mondale and Reagan rejected a combined 83 journalists as debate panelists. The League of Women Voters condemned both campaigns for "totally abus[ing]" the process.

Preparation

Bob Dole spent little time preparing for his 1996 debates with President Clinton. President Nixon refused to spend any time preparing for his debates with Kennedy in 1960. Both men lost their respective elections.

Candidates can be equally hurt by preparing for the wrong kind of debate. In 1984 Reagan entered the first debate with Mondale expecting to be attacked. When he was not, a confused Reagan floundered throughout the debate.

Clinton researched debates extensively in both 1992 and 1996. In 1992 advisor Harry Thomason arranged for camera shots of Clinton to include Bush or Perot in the background, reacting to Clinton's speech. Clinton came off looking like the strongest of the candidates.

Ford was the first candidate to act out an entire debate with his advisors to prepare for the 1976 debates.

Reagan rehearsed his 1980 line, "There you go again," in order to make it seem spontaneous.

Lessons

"TV favors the underdog," in the words of Alan Schroeder. Ever since Nixon lost the 1960 election, the frontrunner's campaign has typically tried to reduce the number of debates, while the candidate who is behind has tried to increase the number.

James Baker, who famously suckered Carter into a late debate with Reagan, argues that once debates are scheduled, the polls "freeze." Remaining undecided voters wait until after viewing the debates to make up their mind.

Cartoon characterizations win debates. Mondale made Reagan seem old at their first debate in 1984, Clinton made George H.W. Bush look out of touch in 1992, and Dukakis came across as wooden after the 1988 debates.

Vice presidential candidates can attack their opponents with few negative consequences. Lloyd Bentsen demolished Dan Quayle in 1988 by referring to him as "no Jack Kennedy." Quayle tried to return the favor in 1992 by attacking Clinton's character.

Catchy slogans can convince the American voter better than factual information. Reagan managed to erase all concerns about his position on Medicare with four words to Carter: "There you go again." Similarly, Clinton gained a strong following with American women despite allegations of his infidelity by telling audiences that he felt their pain.

Televised debates favor politicians who are able to appear charming and at ease while on the stage. Kennedy trounced a sweating Nixon in 1960. Clinton played to the audience in 1992 by stepping off the debate stage and into the crowd. Former Hollywood star Reagan famously quipped when asked if he was nervous to stand on the same stage as President Carter: "Not at all. I've been on the same stage with John Wayne."

Alex Bosworth is an HNN intern. He graduated with a degree in history from Whitman College in May 2004.

Copyright © 2004 History News Network

Getting To The Bottom Of Things

In the daily Technology newsletter from The Washington Post, a trivia quiz is supplied. I got the correct answer today. If this is (fair & balanced) smugness, so be it.

According to a recent survey, what percentage of wireless-device users say they have e-mailed from a restroom?
1%
15%
30%
45%
In a recent survey by Harris Interactive commissioned by wireless provider T-Mobile USA Inc., x percent of wireless-device users said they have e-mailed from a restroom, 19 percent while eating in a restaurant, and 21 percent while talking to friends or family.

Copyright © 2004 The Washington Post

The Roots Of War In Iraq

We have met the enemy and he is us. If this is (fair & balanced) petroleum avarice, so be it.

1.
[x Slate]
Oil Terror: Don't blame Osama for high gas prices.
By Daniel Gross

As the price of crude oil spiked in recent months, economists and energy pundits were stumped. The data on global economic growth, supply, and demand didn't seem to warrant the action in the markets. There had to be an alternative explanation.

Thus was born the "terror premium." In May, when crude topped $40 per barrel, economists and oil analysts began to argue that prices were being pushed up by fear of terrorism and political instability in major oil production centers—Nigeria, Venezuela, Russia, and particularly Iraq and Saudi Arabia. What's more, several big events slated for the summer—the Euro 2004 soccer tournament, the Athens Olympics, the U.S. political conventions—would provide targets for attacks. George Worthington, Asia Pacific chief economist for Thomson IFR, quantified the terror premium as $8 a barrel. The same month, the Economist likewise settled on $8, and the Kerry campaign noted that "Economists estimate that Americans are now paying a 'terror premium' of $10 to $15 per barrel for oil." In August, HSBC economist John Butler suggested the premium was somewhere between $10 and $15.

As evidence for the terror premium, analysts tended to cite the difference between the price of a barrel of crude oil to be delivered in the next month and the price of a barrel to be delivered much later. In March 2003, for example, just before the Iraq war started, light sweet crude traded at about $35 per barrel in the spot market while the price for a June 2004 contract was only about $25. In May, when oil hit $40, the price for a barrel of oil to be delivered in July 2005 was about $35. In other words, the market assumed that oil prices would return to historical means in the future once the temporary instability responsible for today's elevated prices dissipated.

But what if the theory of the temporary terrorism premium is wrong? After all, terror and instability are likely to be with us for the foreseeable future. And the absence of terrorist incidents at major events this summer didn't calm the oil market down. Quite the contrary: Oil prices are climbing and topped $50 a barrel this week. Could it be that the rise of oil prices is caused by global economic forces that are more powerful than terrorism?

As Barry Ritholtz points out, there may never have been much of a terror/instability premium in oil prices to begin with. After 9/11, when the world suddenly became aware of the ravages of terrorism, oil prices plummeted briefly. Why? Investors feared the global economy would contract, thus tamping down demand. Further, between April 2002 and January 2004, oil prices were relatively stable. (The price spiked sharply before the outbreak of hostilities in Iraq in early 2003 but quickly settled back down to prewar levels.) It's only in the past year that prices have taken off.

What's changed, then? Global economic growth, continued increases in demand from all corners of the globe, and concerns about capacity. Three years ago, Japan, the United States, and Europe were in a rare period of synchronous recession. Today, all three are growing, and China's appetite for industrial commodities is suddenly omnivorous. Perhaps the price of oil is rising because the long-term equation of supply, demand, and capacity has changed in the past few years.

The first reaction to such an assertion—by investors and by analysts—is denial. Analysts tend to forecast by analogy: Under generally similar macroeconomic circumstances in the past, oil prices behaved a certain way, so that's how they should behave next year. But perhaps we are at the beginning of a new era in which the old rules have been thrown out the window. The International Energy Agency noted that second-quarter oil demand grew by a whopping 5 percent, despite the higher oil prices. Meanwhile, the United States, the world's largest consumer of oil, produces less and less of its own oil and imports more and more each year; car sales are growing in China; and the huge economies of Latin America—Argentina and Brazil—seem to be recovering. If demand continues to grow rapidly, concerns over production capacity will likely increase, not lessen.


Daniel Gross is a journalist, editor, and New York Times best-selling author based in New York and Connecticut who specializes in three broad areas: business history, the links between business and politics, and the culture of Wall Street. A graduate of Cornell University, he holds an A.M. in American history from Harvard University, and has been a fellow at the New America Foundation. He has worked as a reporter at The New Republic and Bloomberg News, and has written articles, book reviews, essays, and commentary for more than 60 publications, including New York Times, Washington Post, Boston Globe, New York, New York Observer, Slate, American Prospect, and Washington Monthly. He currently contributes columns to Slate, Attache, and CEO.

He appears frequently in the media to discuss issues affecting Wall Street and Washington, and in 2000 coined the term “the politics of personal finance.” Gross has appeared on CNBC, CNN, Fox News Channel, The News Hour with Jim Lehrer, C-SPAN, Bloomberg Television, Reuters Television, and on more than 35 radio programs, including Fresh Air with Terry Gross and NPR’s “Marketplace.”

Gross is the author of three books: Forbes Greatest Business Stories of All Time (Wiley, 1996), a New York Times and Business Week best-seller that has been translated into seven languages; Bull Run: Wall Street, the Democrats, and the New Politics of Personal Finance (PublicAffairs, 2000), and Generations of Corning: 150 Years in the Life of a Global Corporation, 1851-2001 (Oxford University Press, 2001), co-authored with Davis Dyer.
Since 1999, Gross has edited STERNbusiness, the semi-annual management journal published by New York University’s Stern School of Business.

In collaboration with design and editorial professionals, Gross has worked with individuals and companies in areas such as executive recruiting, management consulting, media, insurance and utilities, and provided editorial services including speechwriting, ghostwriting, business plans and annual reports, op-ed articles, and book packaging.


Copyright © 2004 Slate Magazine

2.
[x The Boston Globe]
What I hope they'll say: We're a selfish nation
by Derrick Z. Jackson

During a football commercial break, my TV turned into a kaleidoscope for a crazily spinning Hummer. During another timeout, a Cadillac spun around a dance floor, bullying several foreign luxury cars off to the side.

No other metaphors are necessary to understand the United States on the eve of the presidential debates.

About 1,050 U.S. soldiers are dead in Iraq. Up to 15,000 Iraqi civilians are dead. None of that has persuaded us Americans to put down the kaleidoscope and stop spinning in our own orbits. No amount of mass sacrifice abroad has resulted in mass sacrifice at home. No amount of failure in the original mission of finding weapons of mass destruction in Iraq has made us question the fantasy of bullying the world. Our toys really are us. We're big, we're bad, and you Euro girlie-cars, we're cutting in.

It would be sad to conclude someday that our leaders sent our soldiers halfway around the world to die for our cars. In the absence of weapons of mass destruction and in the absence of Saddam Hussein being tied to Sept. 11, there is not much left to conclude. On its current Web site, the Economist Intelligence Unit says "the biggest potential prize" and an "ideal prospect" for international oil companies is Iraq, home to the world's second- or third-largest oil reserves. In 1993, the deaths of a mere 18 Army Rangers in resource-starved Somalia made us flee that country.

Today we accept 58 times more American fatalities to secure Iraq. We accept the death of human beings who just finished being boys and girls, yet we have not accepted the notion that to avoid losing more of them, the rest of us must grow up. In 1991, during the first Gulf War to defend Kuwait from Saddam, Americans were consuming 25.2 percent of the world's oil. Today the figure is 26.1 percent, according to statistics kept by British Petroleum.

A huge part of that consumption is our insistence on huge cars, symbolized by Hummers and Caddys. But almost everything about our lifestyles, from our obesity epidemic to our homes, reeks of not giving one whit about being only 4 percent of the planet's population yet creating a quarter of the greenhouse gases that contribute to global warming. Even though the size of the American family has shrunk over the last half-century, the size of the average American home has more than doubled, with a single home in the suburbs loaded with more technology than whole villages in the developing world.

If any of this comes up during the debates, it will be a miracle. The last thing voters want to hear from a presidential candidate is that a more secure America means a less selfish America. You certainly will not hear that from President Bush, who says he can drill us into energy independence, even if that takes out a few snow geese and polar bears up in the Arctic. Nor will you probably hear much about sacrifice from his challenger, John Kerry. The Massachusetts senator has a voting record that earned him the endorsement of many environmental groups. But in the heat of pandering to voters, Kerry also said: "You want to drive a great big SUV? Terrific. That's America."

It should be a national shame that more than 1,000 soldiers have died in Iraq, and to this day the only sacrifice President Bush has asked of Americans was so trivial as to be utterly American. "One of the great goals of this nation's war is to restore public confidence in the airline industry," Bush said 16 days after 9/11. "It's to tell the traveling public: Get on board. Do your business around the country. Fly and enjoy America's great destination spots. Get down to Disney World in Florida."

Hummers, Caddys and Disney World. That's America. A Martian landing in front of an American TV set on a Sunday afternoon would conclude that that is what our modern wars are for.

The presidential debates start tonight. Bush and Kerry will say they can best finish the job in Iraq and make America more secure. Neither has dared to tell Americans that the job begins at home. The debate will matter when one of them asks us to put down the kaleidoscope and end the fantasy of a chicken in every pot and a gas guzzler in every garage.

Derrick Z. Jackson was a 2001 finalist for the Pulitzer Prize in commentary and a winner of commentary awards from the National Education Writers Association and the Unity Awards in Media from Lincoln University in Missouri.

A Globe columnist since 1988, Jackson is a five-time winner and 10-time finalist for political and sports commentary from the National Association of Black Journalists. He is a three-time winner of the Sword of Hope commentary award from the New England Division of the American Cancer Society.

Prior to joining the Globe, Jackson won several awards at Newsday, including the 1985 Columbia University Meyer Berger Award for coverage of New York City.

Jackson, 45, is a native of Milwaukee, WI, and a 1976 graduate of the University of Wisconsin at Milwaukee. Jackson was a Nieman Fellow in Journalism at Harvard University in 1984. He holds honorary degrees from the Episcopal Divinity School in Cambridge, MA, and Salem State College, and received a human rights award from Curry College in Milton, MA.


Copyright © 2004 The Boston Globe

Wednesday, September 29, 2004

National Energy Policy?

We will answer for our gluttony. If this is (fair & balanced) eschatology, so be it.

[x The Chronicle of Higher Education]
THE NATURAL WORLD: The End of Easy Oil
By MALCOLM G. SCULLY

You don't have to be a conspiracy theorist or a Michael Moore enthusiast to think that Donald Rumsfeld and his colleagues in the Bush administration are being disingenuous when they declare that the war in Iraq is not about oil.

In fact, according to the authors of two new books, most foreign-policy and many domestic decisions made by the current administration -- and by its predecessors going back to that of Franklin D. Roosevelt -- have been shaped, overtly or covertly, by a desire to assure a secure supply of cheap petroleum for America's economic and military needs. And, the authors of the books conclude, maintaining that "energy security" will become more difficult, more dangerous, and more likely to produce violence in the years ahead.

Our petroleum habit will have growing influence on both geopolitical and economic issues, according to Paul Roberts in The End of Oil: On the Edge of a Perilous New World, published by Houghton Mifflin, and Michael T. Klare, in Blood and Oil: The Dangers and Consequences of America's Growing Petroleum Dependency, published by Metropolitan Books.

As Roberts, a writer who focuses on economic and environmental issues, says: "Although we will not run out of oil tomorrow, we are nearing the end of what might be called easy oil. Even in the best of circumstances, the oil that remains will be more costly to find and produce and less dependable than the oil we are using today."

Klare, a professor of peace and world-security studies at Hampshire College and defense correspondent for The Nation, suggests that the United States has never resolved the inherent tension between our need for assured supplies of petroleum to keep the economy cooking and our growing reliance on overseas sources of that oil, especially from areas, like the Persian Gulf, that have a long and continuing history of instability.

Rather than develop a sustained strategy for reducing our reliance on such sources, he says, American leaders "have chosen to securitize oil -- that is, to cast its continued availability as a matter of 'national security,' and thus something that can be safeguarded through the use of military force."

Klare argues that our demands for energy and those of other major powers will require the petroleum-rich Gulf states to "boost their combined oil output by 85 percent between now and 2020. ... Left to themselves, the Gulf countries are unlikely to succeed; it will take continued American intervention and the sacrifice of more and more American blood to come even close. The Bush administration has chosen to preserve America's existing energy posture by tying its fortunes to Persian Gulf oil."

Even more worrisome, Klare says, is the intense and growing competition among countries such as the United States, China, India, and those in the European Community over petroleum supplies. "This competition is already aggravating tensions in several areas, including the Persian Gulf and Caspian Sea basins," he writes. "And although the great powers will no doubt seek to avoid clashing directly, their deepening entanglement in local disputes is bound to fan the flames of regional conflicts and increase the potential for major conflagrations."

That's pretty alarming stuff, and some people may be tempted to dismiss Roberts's and Klare's analyses as anti-Bush, anti-oil rhetoric. But the questions they raise transcend approval or disapproval of any one administration, and go to the core of whether any country can -- purposefully and without vast disruptions -- make the transition from an economy dependent on one finite resource to an economy based on renewable, nonpolluting resources.

The authors argue that such a transition would be difficult in the best of times, and that these are not the best of times.

Roberts notes, for instance, that the development of renewable alternatives to petroleum, such as biofuels, solar power, clean coal, and hydrogen, has not been as rapid or as simple as their promoters had hoped. And even if those alternatives had been developed more fully, he adds, "many of the new fuels and technologies lack high power density and simply will not be able to deliver the same energy punch as the hydrocarbons they replace."

What that means, he says, is that the new technologies must be accompanied by sharp increases in energy efficiency. He is not sanguine about achieving such gains. "In spite of high energy prices and rising concerns about energy security, consumers and policymakers alike have all but stopped talking about the ways we use energy, how much we waste, and what might be changed."

Klare writes that President Bush's choice of Vice President Dick Cheney to conduct a major review of energy policy preordained an antiefficiency outcome. When the National Energy Policy Development Group began its work, in February 2001, he writes, the United States "stood at a crossroads." It could "continue consuming more and more petroleum and sinking deeper and deeper into its dependence on imports," or "it could choose an alternative route, enforcing strict energy conservation, encouraging the use of fuel-efficient vehicles, and promoting the development of renewable energy sources."

While the group's report -- National Energy Policy -- gave lip service to the concepts of conservation and energy self-sufficiency, he says, a close reading "reveals something radically different." The policy "never envisions any reduction in our use of petroleum," Klare writes. "Instead it proposes steps that would increase consumption while making token efforts to slow, but not halt, our dependence on foreign providers."

Given the Bush administration's close ties to the oil-and-gas industry, such an outcome may have been inevitable, Klare says. But even an administration without such links would find it politically risky to move to a radically different energy policy. Like his predecessors, he notes, President Bush "understood that shifting to other sources of energy would entail a change in lifestyle that the American public might not easily accept. ... And so he chose the path of least resistance."

Roberts, who focuses on the question of total energy supply more than on the geopolitical consequences of relying on foreign oil, finds little cause for optimism in our current strategy. The longer we put off the transition to a postpetroleum era, the harder that transition will be, he says, and the more unrest and violence we will encounter.

As oil supplies dwindle, "energy security, always a critical mission for any nation, will steadily acquire greater urgency and priority," he writes. "As it does, international tensions and the risk of conflict will rise, and these growing threats will make it increasingly difficult for governments to focus on longer-term challenges, such as climate or alternative fuels -- challenges that are in themselves critical to energy security, yet which, paradoxically, will be seen as distractions from the campaign to keep energy flowing. ... The more obvious it becomes that an oil-dominated energy economy is inherently insecure, the harder it becomes to move on to something else."

In the meantime, Klare argues, the Bush administration's war on terrorism, the impulse of its neoconservative supporters to spread "democracy" to the Middle East, and our desperate need for stable supplies of oil have merged into a single strategy -- one that will commit us to maintaining military forces in many parts of the world and to using those forces to protect oil fields and supply routes.

"It is getting hard," he writes, "to distinguish U.S. military operations designed to fight terrorism from those designed to protect energy assets."

Many of the authors' arguments and conclusions have been advanced before, and both men fall into the category of "energy pessimists," who do not believe that we will be able to maintain our current levels of oil consumption for as long as agencies like the U.S. Geological Survey and Europe's International Energy Agency predict. Such agencies, Roberts says, "are under intense political pressure to err on the side of wild optimism."

But regardless of whether Klare and Roberts err on the side of pessimism, their message is unsettling: We are headed into uncharted territory, led by a government that seems prepared to use force, when necessary, to preserve the current system. We face growing competition from other countries for a finite resource at a time of growing animosity toward the United States.

It is a message that is moving beyond academic and environmental circles. In a recent "midyear outlook" report, Wachovia Securities, a large investment company, examines the impact of "the end of cheap oil" for investors. "We neither expect, nor wish to dwell on, worst-case scenarios -- but the market knows it is foolhardy to ignore the possibilities," the report says. It warns that with record-high oil prices and many domestic refineries operating at or near capacity, "a disruption somewhere in the production chain could have a greater than normal effect on energy markets."

The war on terror, it adds, "raises the risk that such a disruption would not be an accident."

Malcolm G. Scully is The Chronicle's editor at large.

Copyright © 2004 by The Chronicle of Higher Education



Both W & Kerry Should Answer Who Will Die, And When, And For What!

As I write this, I am listening to Arthur Schlesinger, Jr. on the "Diane Rehm Show" on Texas Public Radio via the Internet. Schlesinger maintains that we have recreated the Imperial Presidency and that W will gain passage of the Patriot Act II if he is reelected. How soon we forget and how little we learn from past mistakes. Every time I hear some Rightist proclaim that we must support our president, I want to vomit. In the meantime, our troops are dying and innocent Iraqi men, women, and children are dying at our hands. I remember Robert S. McNamara confessing in Errol Morris' "The Fog of War" that he and others planning the firebombing of Japanese cities during WWII were guilty of war crimes. As McNamara noted, though, the victors are never convicted of war crimes. When Don Imus—on his radio show—calls Rummy, Wolfie, and the Dickster "war criminals," the I-Man is closer to the truth than he realizes. If this is (fair & balanced) genocide, so be it.

[x History News Network]
The Big Question that Needs to Be Asked at the Presidential Debates
By Tom Palaima

If you have seen the film "Black Hawk Down" or read We Were Soldiers Once and Young or visited the Imperial War Museum in London, you might think, as Douglas MacArthur did, that Plato said, "Only the dead have seen an end to war."

Plato could have, but he didn't. It was George Santayana, looking at World War I veterans celebrating in a British pub, who uttered the sad words: "The poor fellows think they are safe! They think that the war is over! Only the dead have seen the end of war." American soldiers see no end to war in Iraq. Yet both presidential candidates have avoided telling us what they will do with our soldiers and our weapons if they are elected in about five weeks.

Do we even want to know? President Bush says he will stand firm. But this is easier to do in Alabama and Washington than in Fallujah. He also now wants other United Nations leaders, or rather their soldiers, to stand firm with our soldiers in the war zone our senators and representatives gave him the authority to create.

Senator John Kerry has his own vague approach to strengthen our shrinking coalition of the increasingly unwilling. European countries whose families still remember the Somme, Stalingrad and Dien Bien Phu will send their soldiers off to a new locus of sorrow. Our one relevant cultural memory, Vietnam, has disappeared in a political shell game concerning old service records and old combat medals.

Will the presidential debates force the candidates to stop playing politics with American and Iraqi lives? Is there any journalist with enough authority and integrity, and plain guts, to be our elder Cato and stay on message with a handful of questions? What do these two men who would be our commander-in-chief for the next four years think of the recent assessment of the leading Iraq expert at the Army War College's strategic studies institute that the insurgency in Iraq cannot be killed by our overwhelming firepower? Is the professor of strategy at the Air War College in need of new glasses when he sees "no ray of light on the Iraqi horizon"? Things have reached the point where Americans deserve straight answers to the kinds of moral questions soldier poet Siegfried Sassoon posed as he waded resolutely out of the killing trenches of World War I and back to London: What are our set aims? What is our time limit for accomplishing them? What price are we willing to pay in human lives? And whose lives will we pay? And where is peace?

We have in the last seventy years increasingly sought and achieved peace through desolation. Since Oswald Spengler published The Decline of the West after World War I, the United States has been seen, quite rightly, as a Roman civilizing presence in the world. We build things - roads, arenas, luxury villas - and we destroy things with the same energetic efficiency. We use pragmatic Roman methods as we try to shape world affairs to our purposes.

The Roman historian Tacitus put it this way: "Where they make a desolation, they call it peace." Ali Adr, a temporarily dispersed pro-Sadr fighter, has surely never read Roman history. But he is bluntly Tacitean: "The Americans destroy, we build." We destroy. And destroying brought a long period of peace, at least in Europe and the United States, until what H.G. Wells would call man's beast nature crept back in Bosnia, in Kosovo, in lower Manhattan. And our response has been the same as ever: overwhelming force.

In April 1967, Martin Luther King reasoned that his own government was "the greatest purveyor of violence in the world today." And we were once openly glad of it. We dropped 1.36 million tons of bombs on Germany. One hundred and sixty thousand tons of bombs incinerated sixty-six Japanese cities. We then moved into the atomic age, and Hiroshima and Nagasaki joined the list.

Our men came home, my father and father-in-law among them. We had peace. What Bob Dylan called the "big bombs and death planes" secured that peace.

We thought we had to make it even more secure. So we began dropping 7,078,032 tons of bombs on a single small country in southeast Asia, a thousand pounds for every living soul. Peace came there, too, but no victory. And our men came home, give or take 58,226.

Those whose names are etched in mirrored stone a short distance from White House and Congress died by degrees. No more than 300 in any set battle. At most 543 in a week.

Only the dead see the end of war. But we the living decide who will die, and when, and for what.

And we and our next president need to reach a moral decision. Now.

Tom Palaima teaches Classics in the College of Liberal Arts at the University of Texas. This article was first published by the Sacramento Bee and is reprinted with permission of the author.

Copyright © 2004 Tom Palaima


Christopher Buckley Is A Right-Wing Kinkster!

I like it when a Republican can make me laugh out loud. Most of the Rightists provoke my gag reflex. The Kinkster is another writer who makes me laugh out loud. Neither of the presidential candidates makes me laugh out loud. That is why I am supporting Richard (Kinky) Friedman for governor of Texas. I am going to suggest that the Kinkster hire Buckley as his speechwriter. If this is (fair & balanced) levity, so be it.

[x New Yorker]
RULES OF ENGAGEMENT
by CHRISTOPHER BUCKLEY

At no time during these debates shall either candidate move from their designated area behind their respective podiums.
—From the agreement worked out for the Presidential debates.

Paragraph Two: Dress.
Candidates shall wear business attire. At no time during the debates shall either candidate remove any article of clothing, such as tie, belt, socks, suspenders, etc. Candidates shall not wear helmets, padding, girdles, prosthetic devices, or “elevator”-type shoes. Per above, candidates shall not remove shoes or throw same at each other during debate. Once a debate is concluded, candidates shall be permitted to toss articles of clothing, excepting underwear, into the audience for keepsake purposes.

Paragraph Six: Hand gestures.
“Italian,” “French,” “Latino,” “Bulgarian,” or other ethnic-style gestures intended to demean, impugn, or otherwise derogate opponent by casting aspersions on opponent’s manhood, abilities as lover, or cuckold status are prohibited. Standard “American”-style gestures meant to convey honest bewilderment, doubt, etc., shall be permitted. Candidates shall not point rotating index fingers at their own temples to imply that opponent is mentally deranged. Candidates shall at no time insert fingers in their own throats to signify urge to vomit. Candidates shall under no circumstances insert fingers into opponent’s throat.

Paragraph Seventeen A: Bodily fluids-Perspiration.
Debate sponsors shall make every effort to maintain comfortable temperature onstage. Candidates shall make reasonable use of underarm deodorant and other antiperspirant measures, subject to review by Secret Service, before the debates. In the event that perspiration is unavoidable, candidates may deploy one plain white cotton handkerchief measuring eight inches square. Handkerchief may not be used to suggest that opponent wants to surrender in global war on terrorism.

Paragraph Forty-two: Language.
Candidates shall address each other in terms of mutual respect (“Mr. President,” “Senator,” etc.). Use of endearing modifiers (“my distinguished opponent,” “the honorable gentleman,” “Pookie,” “Diddums,” etc.) is permitted. The following terms are specifically forbidden and may not be used until after each debate is formally concluded: “girlie-man,” “draft dodger,” “drunk,” “ignoramus,” “Jesus freak,” “frog,” “bozo,” “wimp,” “toad,” “lickspittle,” “rat bastard,” “polluting bastard,” “lying bastard,” “demon spawn,” “archfiend,” or compound nouns ending in “-hole” or “-ucker.”

Paragraph Fifty-eight: Spousal references.
Each candidate may make one reference to his spouse. All references to consist of boilerplate praise, e.g., “I would not be standing here without [spouse’s first name]” or “[Spouse’s name] would make a magnificent First Lady.” Candidates shall not pose hypothetical scenarios involving violent rape or murder of opponent’s spouse so as to taunt opponent with respect to his views on the death penalty.

Paragraph Ninety-eight: Vietnam.
Neither candidate shall mention the word “Vietnam.” In the event that either candidate utters said word in the course of a debate, the debate shall be concluded immediately and declared forfeit to the third-party candidate.



Christopher Buckley (son of William F. Buckley) was born in New York in 1952. He graduated with honors from Yale University, shipped out with the Merchant Marine, and was managing editor of Esquire magazine at the age of 24. At age 29, he published his first best seller, Steaming To Bamboola: The World of a Tramp Freighter, and became chief speechwriter to the Vice President of the United States, George H.W. Bush. Buckley has traveled and adventured far and wide.

Buckley is the author of eleven books, many of them national bestsellers, including Thank You For Smoking, God Is My Broker, Little Green Men, No Way To Treat A First Lady, Washington Schlepped Here and Florence of Arabia. Several of them are being developed by Hollywood; “Not,” Buckley remarks wryly, “that anything ever happens, but this hasn’t stopped me from saying of them, ‘Soon to be a major motion picture.’” His books have been translated into over a dozen languages, including Russian and Korean.


Copyright © CondéNet 2004. All rights reserved.

Tuesday, September 28, 2004

9/28/2004: Bush (317 Electoral Votes) Kerry (207 Electoral Votes)


[Electoral map: Dark Red=Bush; Pink=Weak Bush; Pink Outline=Barely Bush; Dark Blue=Kerry; Light Blue=Weak Kerry; Blue Outline=Barely Kerry]

O, Great! Uzbeks As Al Qaeda Fighters

My beloved daughter and son-in-law met in Uzbekistan as Peace Corps Volunteers. A visit to them in the mid-1990s, during the Uzbek transition from Soviet republic to sovereignty, gave me an appreciation of Uzbek culture and history. Now Uzbek volunteers of a different sort are being drawn into the war on terror as bin Laden sends his al Qaeda fighters on to Iraq. The future looks gloomy. If this is (fair & balanced) pessimism, so be it.

[x The Christian Science Monitor]
Al Qaeda's Uzbek bodyguards: As Pakistan rounds up more Al Qaeda operatives in its cities, hundreds of Uzbek fighters remain in the tribal hills.
By Owais Tohid

PESHAWAR, PAKISTAN - Pakistani forces have scored a number of recent successes in ferreting out Al Qaeda operatives from cities and towns across the country.

The latest operation took place over the weekend in the southern town of Nawabshah, where Pakistani forces reportedly killed Amjad Hussain Farooqi, a Pakistani Al Qaeda operative allegedly involved in two assassination attempts against President Pervez Musharraf as well as the murder of reporter Daniel Pearl.

But even as mid-level Al Qaeda operatives are rounded up in civilian homes and apartments, Pakistani forces have been struggling to wipe out a significant contingent of 600 to 700 fighters operating in the rugged tribal region along the border with Afghanistan. Within this phalanx may be the elusive big fish - including Osama bin Laden and his deputy, Ayman al-Zawahiri - protected mostly by Uzbek militants.

Speaking to reporters yesterday, the commander of US forces in Afghanistan reiterated that top Al Qaeda leaders could be in Pakistan.

"We see relatively littlely evidence of senior Al Qaeda personality figures being here [in Afghanistan] because they can feel more protected by their foreign fighters in remote areas inside Pakistan," said Lt. Gen. David Barno.

Hundreds of Uzbek militants now form the bulwark of Al Qaeda's defenses in South Waziristan. The Central Asians are filling the ranks left by Arab fighters who left the region for the Middle East on the orders of Mr. bin Laden months ago, say tribal sources.

"The Arab militants hardly participate in the [South Waziristan] fight as they have handed over control of the battlefield to these Uzbeks. This saves their ranks from losses," says tribesman Mohammad Noor. "They are using the Uzbeks cleverly here. Many locals are now unhappy with the Uzbeks" for drawing attacks from Pakistani forces.

With Al Qaeda's leadership focused on broad planning, command of the day-to-day fighting in the tribal region has been delegated to Qari Tahir Yaldashev. Mr. Yaldashev, who is directly linked to Al Qaeda's leadership, was a founding member of the Islamic Movement of Uzbekistan (IMU). He was the deputy of the IMU's founder, Juma Namangani, who was killed by US bombing in Afghanistan following Sept. 11, 2001.

After suffering casualties from US forces in the Shah-e Kot mountains of Afghanistan, Yaldashev and some 250 families of Central Asian militants fled to South Waziristan. They joined hordes of Al Qaeda militants of Arab and African origins who escaped the US and its allies at the battle of Tora Bora.

Most of these militants found South Waziristan a haven; local mujahideen and staunch Islamist tribesmen were both ideological counterparts and fellow veterans of the US-sponsored fight against the Soviets in Afghanistan. Thus emerged a new anti-US triangle made up of core Al Qaeda militants, Central Asian fighters from Uzbekistan and Chechnya, and a local force of tribesmen.

In the past, "Al Qaeda never let militants from other regions enter the inner circle, which is purely of Arab origin. But Al Qaeda leadership is aware of the qualities of Uzbek militants and their women.... Both are known as staunch jihadis," says Peshawar-based analyst, Mohammad Riaz.

The tribesmen narrate a story of an Uzbek family that stunned even the Arabs. Just after the fall of the Taliban, an Uzbek militant was fighting in Afghanistan while his wife and 8-year-old son were in South Waziristan.

"When the jihadis brought the body of the Uzbek militant named Ali, his wife dressed up in all white, and his son swung a gun in the air saying, 'Ali is not dead, now the real Ali is born,'" recalls tribesman Farid Khan.

Pakistan's field commander in South Waziristan, Maj. Gen. Niaz Khattak, says the fighters appear to be "trained militants." They eat sardines and drink canned juice; do a lot of exercise; and carry military maps, explosives, and Thuraya satellite phones.

Along with the foreign fighters, Yaldashev has at his disposal fresh local recruits from among the Mehsud tribe.

In a bid to win over tribesmen, Pakistan this weekend lifted its embargo against South Waziristan. The embargo had been put in place after tribal leaders refused to help officials track foreign militants.

When bin Laden issued his redeployment orders, most Arab militants left the area for the Middle East. But an estimated 25 to 50 Arab militants are still believed to be in hiding in the mountains of South Waziristan. Possible hideouts include the highest peak, Shawwal, as well as the Khamrang and Bush Sar ranges, which are covered with thick forests and have natural caves. Local tribesmen say the Arab militants are guarded by dozens of armed masked men in these inaccessible locales.

Earlier this month, Pakistan destroyed an alleged terrorist training camp in South Waziristan. Sources say the training center was run by Yaldashev, who, along with 150 to 200 mostly Uzbek and local militants, recently shifted to the hilly areas surrounded by the Karvan Manza and Kunnigram mountains after escaping earlier military operations.

The race is on as Pakistani forces pursue the foreign fighters before the mountains fill with snow in November.

For many of the Uzbeks, there is no choice but to fight. Their homeland is a tightly controlled police state, and the only path of return would risk an engagement with US forces in Afghanistan. Nor can they hope to blend in among a friendly population - their round faces, thin beards, and pierced noses set them apart from both local tribesmen and Arabs in the Middle East.

"With the persistent pressure of Pakistani security forces and having no point of return, Uzbeks will prefer to explode themselves rather than accept the defeat," says Sailab Mehsud, a regional expert.

Copyright © 2004 The Christian Science Monitor

The Kinkster & Friends (Kerry, W, & the Slickster)

Only the Kinkster can conjure up imagined telephone conversations with Kerry, W, and the Slickster. Kinky Friedman for Governor of Texas! How Hard Could It Be? If this is (fair & balanced) lunacy, so be it.

[x Texas Monthly]
Bring Him On
by Richard (Kinky) Friedman

I'm pals with Clinton and pals with Bush—so, obviously, if John Kerry wants to be president, he has to make friends with me. Hey, is that my phone ringing?

"START TALKIN'," I SAID as I picked up the blower.

"Kinkster," said a familiar voice, "this is John Kerry. I haven't been very happy with you lately."

"Why the long face, John?"

"Are you aware that I'm running for president of the United States?"

"Are you aware," I said somewhat indignantly, "that my books have been translated into more languages than your wife speaks?"

There was silence, followed by a peculiar choking sound. I puffed patiently on my cigar and waited. One of the drawbacks to the telephone is that there's very little you can do to physically help the party on the other end of the line. Either Kerry would recover by himself or else he was definitely going to lose Ohio.

"I went to Vietnam," he said at last.

"I heard something about that," I said.

Indeed, it was one of the things I really liked about Kerry. America was full of patriotic-seeming people, from John Wayne to most of our top elected officials, who, when the time had come to serve their country, had not answered the call.

"I went to Vietnam myself earlier this year," I said. "Nobody told me the war was over."

I heard what sounded like a practiced, good-natured chuckle from John Kerry. That was the trouble with politicians, I thought. Once they'd been on the circuit for a while, their words, gestures, even laughter—all were suspect, relegated to rote and habit. Something as natural as a smile became a mere rictus of power and greed. They couldn't help themselves; it was the way of their people. As Henry Kissinger once observed, "Ninety percent of politicians give the other ten percent a bad name."

"I'll get to the point," Kerry said. "I know you're pals with George W.—"

"I'm also pals with Bill Clinton," I said. "In fact, I'm proud to say I'm the only man who's slept with two presidents."

"That is something to be proud of. But I don't understand how you can support Bush's policies. I'm told you grew up a Democrat. What happened?"

What did happen, I wondered, to the little boy who cried when Adlai Stevenson lost? What happened to the young man whose heroes were Abraham, Martin, and John? Time changes the river, I suppose, and it changes all of us as well. I was tired of Sudan being on the Human Rights Commission of the United Nations. I was tired of dictators with Swiss bank accounts, like Castro and Arafat and Mugabe, masquerading as men of the people. I was tired of Europeans picking on cowboys, everybody picking on the Jews, and the whole supposedly civilized world of gutless wonders, including the dinosaur graveyard called Berkeley, picking on America and Israel. As I write this, 1.2 million black Christian and Muslim Sudanese are starving to death thanks to the Arab government in Khartoum and the worldwide mafia of France, Germany, China, Russia, and practically every Islamic country on the face of the earth. What happened to the little boy who cried when Adlai Stevenson lost? He died in Darfur.

"I don't know what happened," I said. "But as Joseph Heller once wrote, 'Something happened.'"

"You'll be back," said Kerry. "You'll be back."

He was telling me about his new health plan and how the economy was losing jobs when I heard a beeping sound on the blower and realized I had incoming wounded.

"Hold the weddin', John," I said. Then I pushed the call-waiting button.

"Start talkin'," I said.

"Hey, Kinkster!" said a familiar voice, this time with a big, friendly Texas drawl. "It's George W. How're things goin' at the ranch?"

"Fair to Midland, George," I said. "John Kerry's on the other line telling me about his new health plan. What's your health plan?"

"Don't get sick," said George with his own practiced, good-natured chuckle.

"He also told me the economy is losing jobs."

"What do you care, Kink? You told me you never had a job in your life."

"That's not true," I said. "I used to write a column for Texas Monthly, but it got outsourced to Pakistan."

"Kink, the economy's doin' fine. The country's turnin' the corner. We even have bin Laden in custody."

"I remember you told me that. Where is he now?"

"Time-share condominium in Port Aransas. His time's gonna run out two weeks before the election."

I chatted with George awhile longer, then finished up with John. I had just returned to my chair and unmuted FOX News when the phone rang again. I power walked into the office and picked up the blower.

"Start talkin'," I said.

"Kinky, it's Bill Clinton. How's it hangin', brother?"

"Okay, Bill. I just talked to George Bush and John Kerry on the phone."

"Skull and Bones! Skull and Bones! Tyin' up the telephones!" he chanted. "Hell, I still think about that night in Australia when you and me and Will Smith all went to that Maynard Ferguson concert. Too bad Will didn't bring his wife, wasn't it? Man, that was a party!"

I remembered that night too. Millions of people undoubtedly love Bill Clinton, but I've always believed he has few real friends. That night he and I had talked about the recent death of one of his very closest, Buddy the dog. Like they say, if you want a friend in Washington, get a dog.

"Hey, Kink. There's a big ol' white pigeon sittin' on my windowsill here at my office in Harlem. Do you recall once asking me why there were white pigeons in Hawaii and dark pigeons in New York?"

"Sure. And you answered, 'Because God seeks balance in all things.'"

"That's right. Hell, I always wanted to be a black Baptist preacher when I grew up."

"Be careful what you wish for."

"Imagine, a white pigeon right in the middle of Harlem. If the whole world could see that, what do you reckon they'd say?"

"There goes the neighborhood?"

There followed the raw, real laughter of a lonely man who'd flown a little too close to the sun.

"Just remember, Kink," said Bill. "Two big best-selling authors like us got to stick together. Those other guys? Hell, they're only runnin' for president."

Copyright © 2004 Texas Monthly Magazine

Standing Tall

Height is destiny. I stand at 6-feet plus. I owe my improbable rise to greatness to being a six-footer. If you believe that, I have some great apartment buildings in Fallujah that would be a terrific investment opportunity. If this is (fair & balanced) stature, so be it.

[x The Chronicle of Higher Education]
Political Timber: Glitter, Froth, and Measuring Tape
By EDWARD TENNER

As the presidential debates approach, some anxious Democrats are taking comfort in the five-inch height advantage of their candidate, who stands 6 feet 4 inches to George W. Bush's 5 feet 11 inches. They remember, all too well, the 1988 presidential debates between George H.W. Bush and Michael S. Dukakis.

At the time, the newspaper columnist Charles Krauthammer described the elder Bush as "tall and terrible. He whined. He stumbled. He looked nervous and hyperactive. From the first question about drugs, he was on the defensive." Krauthammer also mentioned the results of a focus group of undecided voters convened by The Washington Post, who ultimately leaned toward Bush. After the candidates shook hands, one member had explicitly mentioned the six-inch gap in height.

The focus-group participants had cited other factors, of course, but the possibly fatal handshake was added to the capital's political lore. "Half to two-thirds of what people take away is visual rather than verbal," a Republican pollster told The New York Times in 1996. "It's huge." To some Democrats, that principle implies the need for a physically imposing candidate. After the initial surge of Gov. Howard Dean of Vermont, some supporters of rival Democrats stooped to open heightism, deriding Dean as an example of "short man's syndrome."

How did it come to this? Why is stature now considered such a political advantage -- or liability?

It's easy to blame the tube for fostering a flight from serious issues into glitter, froth, and measuring tape. But taller was seen as better in the 19th century, too, and long before. The already imposing Lincoln may have chosen his signature stove-pipe hat to further accentuate the strong point of his appearance. Herodotus heard that the Ethiopians made the tallest and strongest men their kings.

Still, height was not considered destiny. James Madison's nickname, "Little Jemmy" -- his height is usually given at 5 feet 4 inches -- was not politically fatal. Lincoln's shorter opponents and their fans accepted and even flaunted their stature. Stephen A. Douglas was famous as the "little giant," and Gen. George B. McClellan, whatever his failings as a Civil War commander, won the 1864 Democratic nomination as "Little Mac," a phrase his troops had always used affectionately. (A brilliant military engineer, he was also compared admiringly with Napoleon earlier in his career.) Friend and foe spent little time talking about height. It was a given, to be used derisively or positively.

That attitude changed toward the end of the century. Timothy A. Judge, a professor of management at the University of Florida, and Daniel M. Cable, an associate professor of management and organizational behavior at the University of North Carolina at Chapel Hill, who study height and success, have observed in a recent analysis of the literature on the topic in the Journal of Applied Psychology that William McKinley, elected in 1896, was the last president shorter than the average man. And there were signs of the end of the good-natured banter of the waning century. McKinley's journalistic critics portrayed him as a "little boy" controlled by his big nursemaid, the Republican boss Mark Hanna, and the growing big-business trusts.

Fear of the big began to mix with mockery of the small. An unpublished University of Iowa dissertation by Michael Tavel Clarke, "These Days of Large Things: The Culture of Size in America, 1865-1930" (2001), suggests that the interest in personal size and strength was partly a response to the emergence of industrial combinations and other corporate giants that threatened to crush individuality. At the same time, the scientific professionals of the late-19th and early-20th centuries regarded small stature in Africa, Asia, and Europe as a throwback to primitivism and feared its importation. Eugenic interpretations of stature abounded.

For example, William Zebina Ripley's The Races of Europe, published in 1899, popularized the division of the Old World into distinctive biological types, with tall Northern European blonds on top physically and mentally as well as geographically, followed by the stockier Alpines and the still-darker Mediterraneans. America's old racial stock (those called "native Americans" around 1900 were mainly Anglo-Saxon Protestants) was threatened by an influx from the shorter nations of Eastern and Southern Europe.

With the closure of the frontier in the 1890s, medical and educational authorities believed a new struggle would occur within the growing cities, where high-density living and immigration seemed to be endangering public health. They established height and weight standards and fitness programs to help assure the stature of a more diverse urban population and meet the threat of degeneration.

For their part, African-American people were starting to stand tall in sports. In 1908 Jack Johnson, more than six feet tall, defeated the world boxing champion, a 5-foot-7-inch white Canadian named Tommy Burns, seeming to confirm the fear that Baron Pierre de Coubertin, founder of the modern Olympics, had voiced four years earlier: that "black men, red men, and yellow men" would eventually "leave the white man behind them" in competition.

Several decades later, the stereotype of the short, simian Japanese marked World War II-era racism in America, and the emergence of better nourished and taller postwar generations of Japanese has not yet ended the acrimony about height among nations and races. In 2001 the Sunday Telegraph reported a campaign by the Chinese government to encourage the nation's children to drink more milk (even though many are lactose intolerant) after the humiliation of learning that Japanese average height had overtaken Chinese stature for the first time in recorded history.

Height is not only a nationalist concern, of course. It can be a revealing index of social change. For economic historians, records of stature, whether from military data or archaeological digs, illuminate health and living standards in a way that production and consumption data alone never can. Contemporary changes, too, can signal the real rise and decline of public welfare.

Consider the public-health catastrophe of North Korea. According to a 2003 report of the World Food Program and Unicef, 42 percent of North Korean children are now classified as stunted, their growth markedly below their age norms, and most may never recover. Thanks to prosperity and a Western diet, 17-year-old boys near the border on the South Korean side average 5 feet 8 inches; most teenagers on the North Korean side stand less than 5 feet, even though before World War II, Koreans in the northern part of the country had been slightly taller. Height is thus a mirror of the isolation and decline of the North Korean economy, with its widespread poverty and resulting malnourishment.

Yet the United States shows that political freedom and apparently abundant food are not necessarily enough. In a paper published earlier this year in the journal Economics and Human Biology, the University of Munich economic historians John Komlos and Marieluise Baur show how "within the course of the 20th century the American population went through a virtual metamorphosis from being the tallest in the world, to being among the most overweight." In the mid-19th century, Americans were from 3 to 9 centimeters taller than Western and Northern Europeans, and underweight. Now the Dutch and Scandinavians (followed by the British and Germans) are from 3 to 7 centimeters taller than Americans, who have one of the highest rates of obesity. (Beginning in the 1970s, Uncle Sam ceased to be drawn mostly as tall and thin and has often been cut down to size, according to the University of Oregon journalism professor and cartoonist Thomas H. Bivins, who has studied the figure's history.) Because their study excludes Asian and Latino people and those born outside America, and because black people show the same pattern as the broader population, Komlos and Baur discount immigration as the reason why Americans have become relatively shorter. Their hypothesis is that European welfare state policies and greater social equality have produced better nutrition and health care.

Two strains of social science collide, then, when stature rears its head in politics. One historicizes height as convention and metaphor, a symbol of dominance or otherness, a relic of imperialism and nativism. The other takes height seriously as a yardstick of overall fitness, as the authorities of the progressive era saw it, a characteristic predicting intelligence and performance. In their survey article, Judge and Cable suggest that tall people may make more money at least partly because they actually are better at their work. For example, being tall can generate admiration, which can promote self-esteem, which can enhance competence. Another study, by Paul M. Sommers, an economist at Middlebury College, in the College Mathematics Journal, compares the heights of American presidents with their ratings in two surveys of historians, and finds that a disproportionate number of the highest-rated chief executives were taller than average -- if only because "historians want someone they can look up to in the highest office." Perhaps the members of the Washington Post focus group were on to something.

Yet ultimately, height is a social as well as an anatomical fact. While physically altering height is one of the most painful of all surgical interventions -- limb lengthening requires cutting through the thigh bones and having the patient turn screws in agony over months and months to deposit new calcium -- elites have relatively painless ways to manage impressions. In the 1840 Paris Sketch Book, the novelist William Makepeace Thackeray depicted a magnificent wig, sumptuous coat, and high-heeled shoes (Rex), a little bald man in his underwear (Ludovicus), and their fusion in the fully clothed Sun King (Ludovicus Rex) -- elevated by his footwear. More recently, shorter-than-average male film stars -- from Alan Ladd and Humphrey Bogart to Tom Cruise -- have been aided by costume and adroit cinematography. But tricks like the "Ladd box" (on which the actor stood) would not have worked if the people who used them hadn't had their own ability to project a charismatic, dashing -- in fact, "larger than life" -- persona. Outside show business, too, we have all known or seen people who have managed to appear taller than they actually were.

There is thus hope for shorter candidates to cast long shadows, given the proper delivery and gestures and a refusal to be seen caring about stature. Howard Dean's real height problem may have lain not in being under 5 feet 9 inches but in insisting he was 5 feet 8 3/4. And whatever merits Bush's and Kerry's debating arguments might have, much more will depend on their rhetorical prowess than on their stature. The correlation between height and success may be significant, but the exceptions have been as striking as the rule. Above all, we should think twice about height as a proxy for greatness on the world stage. At 6 feet 4 to 6 feet 6 inches, according to the FBI "wanted poster," Osama bin Laden would stand above both candidates.

Edward Tenner is a senior research associate at the Jerome and Dorothy Lemelson Center for the Study of Invention and Innovation at the Smithsonian Institution's National Museum of American History. He is the author of several books, most recently Our Own Devices: The Past and Future of Body Technology (Alfred A. Knopf, 2003).

Copyright © 2004 by The Chronicle of Higher Education



Monday, September 27, 2004

Tell It Like It Is, Juan!

Wake up! We are committing war crimes in Iraq! W had better hope that we win. Otherwise, he might have Slobodan Milosevic as a cellmate. If this is (fair & balanced) revisionism, so be it.

[x History News Network]
If America Were Iraq, What Would It Be Like?
By Juan Cole

President Bush said last week that the Iraqis are refuting the pessimists and implied that things are improving in that country.

What would America look like if it were in Iraq's current situation? The population of the US is over 11 times that of Iraq, so a lot of statistics would have to be multiplied by that number.

Thus, violence killed 300 Iraqis last week, the equivalent proportionately of 3,300 Americans. What if 3,300 Americans had died in car bombings, grenade and rocket attacks, machine gun spray, and aerial bombardment in the last week? That is a number greater than the deaths on September 11, and if America were Iraq, it would be an ongoing, weekly or monthly toll.
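To make the multiplier explicit, here is the back-of-the-envelope arithmetic Cole appears to be using, with round 2004 population figures of roughly 293 million Americans and 25 million Iraqis (the exact figures are my assumption, not stated in the piece):

\[
\frac{293\ \text{million}}{25\ \text{million}} \approx 11.7,
\qquad
300\ \text{deaths} \times 11 \approx 3{,}300\ \text{deaths}.
\]

The same factor of roughly eleven generates the other equivalences that follow, such as 25,000 guerrillas scaling to some 275,000.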

And what if those deaths occurred all over the country, including in the capital of Washington, DC, but mainly above the Mason Dixon line, in Boston, Minneapolis, Salt Lake City, and San Francisco?

What if the grounds of the White House and the government buildings near the Mall were constantly taking mortar fire? What if almost nobody in the State Department at Foggy Bottom, the White House, or the Pentagon dared venture out of their buildings, and considered it dangerous to go over to Crystal City or Alexandria?

What if all the reporters for all the major television and print media were trapped in five-star hotels in Washington, DC and New York, unable to move more than a few blocks safely, and dependent on stringers to know what was happening in Oklahoma City and St. Louis? What if the only time they ventured into the Midwest was if they could be embedded in Army or National Guard units?

There are estimated to be some 25,000 guerrillas in Iraq engaged in concerted acts of violence. What if there were private armies totaling 275,000 men, armed with machine guns, assault rifles (legal again!), rocket-propelled grenades, and mortar launchers, hiding out in dangerous urban areas of cities all over the country? What if they completely controlled Seattle, Portland, San Francisco, Salt Lake City, Las Vegas, Denver and Omaha, such that local police and Federal troops could not go into those cities?

What if, during the past year, the secretary of state (Aqilah Hashemi), the president (Izzedine Salim), and the attorney general (Muhammad Baqir al-Hakim) had all been assassinated?

What if all the cities in the U.S. were wracked by a crime wave, with thousands of murders, kidnappings, burglaries, and carjackings in every major city every year?

What if the Air Force routinely (I mean daily or weekly) bombed Billings, Montana, Flint, Michigan, Watts in Los Angeles, Philadelphia, Anacostia in Washington, DC, and other urban areas, attempting to target "safe houses" of "criminal gangs," but inevitably killing a lot of children and little old ladies?

What if, from time to time, the U.S. Army besieged Virginia Beach, killing hundreds of armed members of the Christian Soldiers? What if entire platoons of the Christian Soldiers militia holed up in Arlington National Cemetery, and were bombarded by U.S. Air Force warplanes daily, destroying thousands of graves and even pulverizing the Vietnam Memorial over on the Mall? What if the National Council of Churches had to call for a popular march of thousands of believers to converge on the National Cathedral to stop the U.S. Army from demolishing it to get at a rogue band of the Timothy McVeigh Memorial Brigades?

What if there were virtually no commercial air traffic in the country? What if many roads were highly dangerous, especially Interstate 95 from Richmond to Washington, DC, and I-95 and I-91 up to Boston? If you got on I-95 anywhere along that stretch of more than 500 miles, you would risk being carjacked, kidnapped, or having your car sprayed with machine gun fire.

What if no one had electricity for much more than ten hours a day, and often less? What if it went off at unpredictable times, causing factories to grind to a halt and air conditioning to fail in the middle of the summer in Houston and Miami? What if the Alaska pipeline were bombed and disabled at least monthly? What if unemployment hovered around 40 percent?

What if veterans of militia actions at Ruby Ridge and the Oklahoma City bombing were brought in to run the government on the theory that you need a tough guy in these times of crisis?

What if municipal elections were cancelled and cliques close to the new "president" quietly installed in the statehouses as "governors"? What if several of these governors (especially of Montana and Wyoming) were assassinated soon after taking office or resigned when their children were taken hostage by guerrillas?

What if the leader of the European Union maintained that the citizens of the United States are, under these conditions, refuting pessimism and that freedom and democracy are just around the corner?

What if?

Juan Cole is Professor of Modern Middle Eastern and South Asian History at the University of Michigan. His website is Informed Comment: Thoughts on the Middle East, History, and Religion.

Copyright © 2004 History News Network

Hannah Arendt Got It Right

I encountered Hannah Arendt when I was a senior in high school. My favorite history teacher was Donald Ramstetter. He wore a neatly trimmed beard (a vandyke?) in the Cold War 1950s when conformity was the norm. His U. S. history class was unstructured and unconventional. I remember it as a college prep course. Mr. Ramstetter distributed handouts, something unheard of in all of my other classes. I most vividly remember a handout of an excerpt from Hannah Arendt's The Human Condition. It was the first time that I remember taking an idea seriously. Arendt (and Mr. Ramstetter) taught me the difference between work and labor. If this is (fair & balanced) nostalgia, so be it.

[x BookForum]
F For Effort
Brown v. Board of Education: A Failure At Fifty
by Robert S. Boynton

On the morning of September 4, 1957, Elizabeth Eckford set off for her first day of classes at Central High School in Little Rock, Arkansas. When the black teenager arrived, a white mob, backed by the Arkansas National Guard, prevented her from entering. In the days that followed, photographs of Eckford being cursed at and spat on by the good citizens of Little Rock were reprinted in magazines and newspapers around the world. Reactions to the photos varied: Liberals were shamed; southern racists steeled themselves for the "massive resistance" to integration they had promised after the Brown v. [the Topeka, KS] Board of Education decision three years before; America's cold-war foes used the images as proof that the capitalist system was riddled with racism.

One of the most enigmatic responses came from the philosopher Hannah Arendt. "Reflections on Little Rock" was originally commissioned by the then-liberal Norman Podhoretz at the then-liberal Commentary magazine. While he judged the piece provocative and brilliant, the other editors were hostile to her thesis that educational integration was being mishandled, first delaying publication of the essay and then insisting on accompanying it with a scathing rebuttal by the philosopher Sidney Hook. Arendt eventually tired of Commentary's vacillations and withdrew the article. In the year after the Little Rock confrontation, Arkansas stalled its integration efforts, and in 1958, the governor, Orval Faubus, turned the public schools over to a private corporation, which promised to maintain segregation and close down the black schools. This confirmed Arendt's skepticism about federally enforced integration, and she offered the piece to Irving Howe, who published it in Dissent in the fall of 1959.

Written with Arendt's characteristic "Olympian authority" (as Ralph Ellison later called it), the Dissent version of "Reflections" began on an uncharacteristically personal note. She had, as always, full confidence in her position, but the vicious prepublication gossip in the two years since she wrote "Reflections" intimated the kind of response the piece might get. "Since what I wrote may shock good people and be misused by bad ones," she wrote, "I should like to make it clear that as a Jew I take my sympathy for the cause of the Negroes as for all oppressed or underprivileged peoples for granted and should appreciate it if the reader did likewise."

They didn't, of course, and Arendt was probably naive to hope that an apologia would assuage her critics. While clearly writing out of sympathy for, and identification with, the black children, her philosophically informed analysis was out of sync with the left-liberal, post-Brown consensus. Where civil-rights lawyers were redoubling their legal efforts in the wake of the Supreme Court's disappointing 1955 decision (known as "Brown II," the decision decreed that integration proceed with "all deliberate speed"—which the South took as license to delay the process indefinitely), Arendt believed the basic terms of the conflict still needed clarification. "It is not the social custom of segregation that is unconstitutional, but its legal enforcement," she wrote in one of the essay's less inflammatory passages.

But while many of Arendt's observations were off-base (as even she later admitted), the questions raised by her essay anticipated some of the most trenchant criticisms of educational integration made on the occasion of Brown's fiftieth anniversary this past May. Given the country's dismal failure to integrate public schools, not to mention public life, Arendt's skepticism today seems more prescient than insensitive. Among her insights was that America's racial problems, as well as the remedies to those problems, were inscribed within larger political questions. "The point at stake, therefore, is not the well-being of the Negro population alone," she wrote, "but, at least in the long run, the survival of the Republic."

Arendt's imperious tone ("oppressed minorities were never the best judges on the order of priorities in such matters"), as well as some of the ideas in "Reflections on Little Rock," make for uncomfortable reading. Arendt argued that the choice to integrate schools first—rather than, say, the workplace or housing—was a mistake for the burgeoning civil-rights movement. Not only did it put children on the front lines of an ugly battle (she accused black parents of using them as proxies), it politicized the educational system, which she believed should be immune to such forces. Not only would forced integration of schools undercut the larger cause, it would also embitter potential allies, scar black children, and eventually fail, she predicted.

If this wasn't contentious enough, Arendt couched her analysis in the rhetoric of the rights of states (a favorite Dixiecrat formulation) to thwart federal intrusion. Finally she argued that—given the laws forbidding mixed-race marriages, which existed in twenty-nine states in 1957—the integrationists' efforts were misdirected. "The Civil Rights bill did not go far enough, for it left untouched the most outrageous law of Southern states," she wrote, "the law which makes mixed marriage a criminal offense." According to Arendt, southern blacks ought to make the repeal of miscegenation laws, not the integration of classrooms, their first political priority.

As in all her work, Arendt's principal concern in "Reflections" was over the autonomy of what she called "the political"—the central feature of the tripartite framework ("the political," "the social," and "the private") that she articulated in The Human Condition in 1958. According to Arendt's schema, schools sat precisely at the juncture of the three realms: the private right of parents to raise children as they want; the social right of all to keep the company they wish; and the government's political right to prepare children for future duty as citizens. So situated, schools were the last place the movement for a just, racially integrated society (something she supported) should start. The goal of a just society, Arendt believed, was to make sure these three spheres were respected accordingly. Allowing discrimination where it didn't belong—and, conversely, prohibiting it from where it did—was for Arendt the true outrage.

Much to her readers' surprise, she followed her pro-forma denunciation of segregation with a detailed defense of the principle of "discrimination," in which she explained its appropriate meaning in each sphere. While discrimination has no place in the political sphere (where, for example, all are free to vote), it is appropriate in the private (where parents have the right to raise children as they prefer) and the social (where we all have the right to keep the company we wish). "What equality is to the body politic—its innermost principle—discrimination is to society," she wrote.

A German Jew and the author of The Origins of Totalitarianism, Arendt feared above all that America (a country she believed prone to conformism) might become a "mass society" in which social equality was legally enforced. More than her liberal, legally minded American colleagues, she worried that forcing educational integration might hasten the rise of an antiblack, racist ideology of the sort that had been used to rationalize violence against Jews in Hitler's Germany. She had seen how ideology mobilized opinion and understood the "deep structure" of society—parts of which were more susceptible to legal action than others. While in the short run classrooms would become integrated, Arendt believed that America would do itself irreparable future harm by failing to make African-American political equality its first priority.

Despite the article's generally hostile reception, it received the 1959 Longview Foundation award for the year's outstanding little-magazine article—an appropriate honor for a philosopher who always took the "long view" on any question. One of Arendt's two most infamous works (Eichmann in Jerusalem being the other), "Reflections on Little Rock" has found a second life in the gay-marriage movement, which has adopted her argument that a citizen has a right to marry whomever he or she wants. But the fact that she did not allow it to be reprinted during her lifetime indicates the ambiguity of its legacy.

* * *


How does one "celebrate" a failure? This was the question facing the authors of the dozen or so books published to commemorate the fiftieth anniversary of the Brown decision. Some consider Brown broadly and judge it more for the alleged consequences of its principles (such as the civil-rights movement) than the efficacy of its rulings. In an updated version of Richard Kluger's 1975 Simple Justice, still the most comprehensive history of Brown, the author counts a variety of black achievements—Martin Luther King, Jr.'s birthday is a legal holiday, Confederate flags no longer fly over southern state capitols, Denzel Washington and Halle Berry receive Oscars—as part of Brown's legacy. Danielle S. Allen, a classicist and political scientist at the University of Chicago, recasts the philosophical significance of Brown in Talking to Strangers, arguing that between 1954 and 1964, America experienced nothing less than the founding of a "new constitution," which delineated the possibility of new forms of democracy and citizenship.

But most of the Brown commentators take a less sanguine view of racial progress over the past fifty years. The title of the Harvard Civil Rights Project's 2004 study, "Brown at 50: King's Dream or Plessy's Nightmare?" lays the choice out nicely. By 1996, black students were the majority in the public schools in most large metropolitan areas. Over 90 percent of the students in public schools in Atlanta, New Orleans, Chicago, and Washington, DC, were minorities.

Polls show that support for race-oriented plans like affirmative action has never been lower. "We are but one generation into an integrated society, and the signs are that the majority of the population is tired with the process," writes Harvard law professor Charles J. Ogletree, Jr. in his memoir All Deliberate Speed. Many black people have become "integration weary," writes Georgetown law professor Sheryll Cashin in The Failures of Integration. "Americans seem to have come to a tacit, unspoken understanding: State-ordered segregation has rightly been eliminated, but voluntary separation is acceptable, natural, sometimes even preferable." The pessimism of Berkeley historian Waldo E. Martin, Jr.'s introduction to Brown v. Board: The Landmark Oral Argument Before the Supreme Court, is also typical: "The post-Brown history of integration exposes the assumption of a national commitment to integration to be idealistic, perhaps overstated, and maybe even illusory, if not downright delusory."

The poobah of Brown skeptics is veteran civil-rights activist Derrick Bell. "How could a decision that promised so much and, by its terms, accomplished so little, have gained so hallowed a place," he wonders in Silent Covenants. The author of several best sellers on what he calls the "permanence" of racism in America and something of a celebrity for his "tell it like it is" brand of racial realism, Bell in some respects agrees with the Arendtian position that integration, wrongly pursued, has encouraged the development of a full-fledged racist ideology. Outright racism, he argues, has simply gone underground, where it is less visible but more pervasive than ever. The man who once worked tirelessly to desegregate schools throughout the South now believes integration is little more than a cruel joke played on black people.

Bell stands out as one of the only commentators on race to acknowledge the totalizing impact of America's racial tragedy by noting the harm that segregation has caused whites as well as blacks. "Segregation perpetuates the sense of white children that their privileged status as whites is deserved rather than bestowed by law and tradition," a delusion that "afflicts white children with a lifelong mental and emotional handicap that is as destructive to whites as the required strictures of segregation are to Negroes," he writes, adopting Brown-era terminology. Oppression harms the oppressor as well as the oppressed.

In Silent Covenants, Bell calls the Brown decision a "long-running racial melodrama," and it is easy to see why. From his beginnings in Pittsburgh (where he was the only black student in his law class) to Harvard Law School (where, for a time, he was its only tenured black professor), Bell has played a significant role in the movement for racial justice. After finishing law school in 1957, he met with William H. Hastie, the first black federal judge and a longtime civil-rights activist. Bell told the judge he wanted to become a civil-rights lawyer and was crestfallen when Hastie delivered the bad news: Brown redefined the constitutional rights to which blacks are entitled, so while there might be some "mopping up to do," the field of civil-rights law had essentially shut down, he said. "Son, I am afraid that you were born fifteen years too late to have a career in civil rights."

Undiscouraged, Bell worked for Thurgood Marshall at the NAACP Legal Defense Fund, handling most of its southern school litigation from 1960 to 1965. It was dangerous work; that Bell thought of himself as "the briefcase-carrying counterpart of the Lone Ranger" only slightly overstates the peril he faced. At one point, he spent so much time arguing cases in Mississippi that the closely watched lawyer was made to file state income tax there. Bell taught for sixteen years at Harvard Law School before leaving to protest the school's failure to tenure even one black woman. He is now a perpetually reappointed visiting professor of law at NYU.

Bell first rehearsed the ideas that appear in Silent Covenants in a 1976 Yale Law Journal article titled "Serving Two Masters: Integration Ideals and Client Interests in School Desegregation Litigation." In the essay he argued that civil-rights lawyers had become more committed to their belief in integration than they were to the educational interests of their clients. "Educational equity rather than integrated idealism was the appropriate goal. . . . While the rhetoric of integration promised much, court orders to ensure that black youngsters actually received the education they needed to progress would have achieved more," he writes in Silent Covenants.

The problem with Brown, according to Bell, is that it created the fiction that outlawing segregation automatically cleared the path of progress for blacks. "By doing nothing more than rewiring the rhetoric of equality, the Brown Court foreclosed the possibility of recognizing racism as a broadly shared cultural condition. In short, the equality model offered reassurance and short-term gains, but contained within its structure the seeds of its destruction," he writes. The "been there, done that" vein of color blindness advocated by critics of race-linked programs is the result of this fiction.

As he's done to good effect in earlier books, Bell performs a thought experiment in Silent Covenants, envisaging an alternative history of the past fifty years. "Could the Court have written a decision that disappointed the hopes of most civil rights lawyers and those they represented while opening up opportunities for effective schooling capable of turning constitutional defeat into a major educational victory?" he asks. "I think the answer is yes." Bell advocates a return to the NAACP's initial legal strategy, which was to equalize expenditures on education in the hope that doing so would force the states either to offer truly equal facilities or to recognize that integration was the more economically feasible option. The key to Bell's plan is that the Court would have to actually enforce the decision in Plessy v. Ferguson, which held that separate education could be equal. Bell's proposal comes down to "desegregating the money," with education rather than integration as a goal.

* * *


It is sometimes forgotten that very little integration took place in the decade after Brown, and then only in the wake of the 1964 Civil Rights Act, which contained a provision requiring compliance with desegregation orders as a condition of receiving federal education funds. The representation of black students in southern schools with white majorities didn't even break 0.1 percent until 1960, moving from 2.3 percent in 1964 to 13.9 percent in 1967. It is currently at around 30 percent (down from a 1988 high of 43.5 percent)—which means that we have resegregated our schools back to 1968/1970 levels.

Ogletree is part of the cadre of black students born in the early '50s who were among the first to benefit from Brown's effects in the early '70s. In All Deliberate Speed, the self-described "Brown Baby" blends memoir and history in a way that gives his reflections on Brown a closely observed, narrative authenticity not found in the hagiographies of Thurgood Marshall and his NAACP colleagues. Unlike Arendt, Kluger, and even Bell, Ogletree truly lived Brown and is hence well-positioned to judge its results.

Born in 1952 to a farming family in Merced, California, Ogletree was one of sixty-eight black students (out of 1,500 freshmen) who arrived at Stanford University in the fall of 1971. The sociologist St. Clair Drake had recently been lured from Chicago to run the newly created African and Afro-American Studies programs and became a mentor to Ogletree. He quickly got involved in university activism, campaigning to free Angela Davis from prison (she in turn encouraged him to work on behalf of lesser-known inmates), protesting the racist pseudoscience of semiconductor inventor and erstwhile eugenicist William Shockley, and walking out on graduation speaker Daniel Patrick Moynihan to protest his view of the black family as dysfunctional. Ogletree arrived at Harvard Law School at the height of the Boston busing crisis, during which he witnessed the emotional toll of forced integration. He then worked for the Washington, DC, public defender's office before taking a faculty position at Harvard in 1989.

Having so benefited from the movement that Brown began, Ogletree is careful about the conclusions he draws from his success. One of Anita Hill's principal advisers during the Clarence Thomas hearings, he is critical of the false optimism of the Supreme Court justice's worldview. ("Thomas spoke of an America that did not exist . . . [where] the problems of racism had been solved, and we black people only needed to pull ourselves up by our bootstraps and move forward.") He perceives the gains and losses of integration clearly and comes to conclusions similar to, though less radical than, Bell's. "As I reflect on these early efforts to promote the Brown mandate of integrated education, I'm struck by our failure ever to ask the hard and obvious questions about what we were doing. Why were black children being forced to go to white schools, without anyone's raising the question of more resources for black schools?" he asks. Ogletree, whose two children attend public schools, knows firsthand that integration on its own is no panacea. "Ironically, Cambridge had voluntarily desegregated its schools after Brown," he writes. "It had a complex system in place to balance students racially at every school. Yet, even in their integrated classroom, black, Latino, and poor students lagged behind other students."

Like Bell, Ogletree gestures to an alternative post-Brown history, one in which integration might have been a less formulaic process. Instead, he writes, Brown left African-Americans with the worst of both worlds: "When schools were integrated, whites did not attend black schools staffed by black teachers and black principals. Instead, blacks went to the better-funded white schools. In this way, integration ended one vital aspect of the 'equalization' strategy pursued by the NAACP in the cases leading up to Brown I, while at the same time perpetuating the segregation of public education."

Ogletree's conclusion is stark. "The important goal of full equality in education following slavery and Jim Crow segregation was compromised from the beginning. . . . Fifty years after Brown there is little left to celebrate." Beyond the inadequacies of the Brown decisions themselves, Ogletree blames the "false promise of integration," which perceives the policy as an end in itself, rather than a means to an end.

As much as it pains my liberal soul to admit it, I don't believe Bell and Ogletree are wrong to give up on educational integration (at least in the short term). Although most data indicate that black children who attend racially mixed schools perform better than those who remain in single-race, overwhelmingly black schools, even integration's most vociferous proponents admit that it isn't clear why this is so. Many scholars, like Harvard sociologist Orlando Patterson, believe that the achievement level of children in mixed schools has more to do with socioeconomic status (both theirs and their fellow students') than racial mixing. The only consistent correlation in such studies is between student achievement and the level of a parent's education, leading to the conclusion that well-educated kids have well-educated parents, regardless of the schools they attend.

In some respects, we have come full circle in the fifty years since Brown. Segregation in cities now approaches Brown-era levels, although largely as a "function of economic and class factors rather than of racist prejudices against Afro-American and Euro-American children going to school together," writes Patterson in The Ordeal of Integration. To counter these trends, he advocates some very Arendtian positions, arguing that it "makes more sense in many cases to concentrate on those measures that will first integrate neighborhoods and occupations and let the integration of schools follow from them." Perhaps living and working together, in addition to intermarriage (which Patterson advocates), may be the means to integrated schools rather than the other way around. If Brown took us down "the wrong road," as Bell suggests, it didn't take us in the wrong direction.

Robert S. Boynton, director of NYU’s graduate magazine journalism program, has written for the New Yorker and the Atlantic Monthly. Boynton is writing a book about American literary nonfiction.

Copyright © 2004 BookForum




Sunday, September 26, 2004

Try, Try Again?

If this is (fair & balanced) discourse, so be it.

[x Paul Graham Blog]
The Age of The Essay
by Paul Graham

Remember the essays you had to write in high school? Topic sentence, introductory paragraph, supporting paragraphs, conclusion. The conclusion being, say, that Ahab in Moby Dick was a Christ-like figure.

Oy. So I'm going to try to give the other side of the story: what an essay really is, and how you write one. Or at least, how I write one.

Mods

The most obvious difference between real essays and the things one has to write in school is that real essays are not exclusively about English literature. Certainly schools should teach students how to write. But due to a series of historical accidents the teaching of writing has gotten mixed together with the study of literature. And so all over the country students are writing not about how a baseball team with a small budget might compete with the Yankees, or the role of color in fashion, or what constitutes a good dessert, but about symbolism in Dickens.

With the result that writing is made to seem boring and pointless. Who cares about symbolism in Dickens? Dickens himself would be more interested in an essay about color or baseball.

How did things get this way? To answer that we have to go back almost a thousand years. Around 1100, Europe at last began to catch its breath after centuries of chaos, and once they had the luxury of curiosity they rediscovered what we call "the classics." The effect was rather as if we were visited by beings from another solar system. These earlier civilizations were so much more sophisticated that for the next several centuries the main work of European scholars, in almost every field, was to assimilate what they knew.

During this period the study of ancient texts acquired great prestige. It seemed the essence of what scholars did. As European scholarship gained momentum it became less and less important; by 1350 someone who wanted to learn about science could find better teachers than Aristotle in his own era.[1] But schools change slower than scholarship. In the 19th century the study of ancient texts was still the backbone of the curriculum.

The time was then ripe for the question: if the study of ancient texts is a valid field for scholarship, why not modern texts? The answer, of course, is that the original raison d'etre of classical scholarship was a kind of intellectual archaeology that does not need to be done in the case of contemporary authors. But for obvious reasons no one wanted to give that answer. The archaeological work being mostly done, it implied that those studying the classics were, if not wasting their time, at least working on problems of minor importance.

And so began the study of modern literature. There was a good deal of resistance at first. The first courses in English literature seem to have been offered by the newer colleges, particularly American ones. Dartmouth, the University of Vermont, Amherst, and University College, London taught English literature in the 1820s. But Harvard didn't have a professor of English literature until 1876, and Oxford not till 1885. (Oxford had a chair of Chinese before it had one of English.)[2]

What tipped the scales, at least in the US, seems to have been the idea that professors should do research as well as teach. This idea (along with the PhD, the department, and indeed the whole concept of the modern university) was imported from Germany in the late 19th century. Beginning at Johns Hopkins in 1876, the new model spread rapidly.

Writing was one of the casualties. Colleges had long taught English composition. But how do you do research on composition? The professors who taught math could be required to do original math, the professors who taught history could be required to write scholarly articles about history, but what about the professors who taught rhetoric or composition? What should they do research on? The closest thing seemed to be English literature.[3]

And so in the late 19th century the teaching of writing was inherited by English professors. This had two drawbacks: (a) an expert on literature need not himself be a good writer, any more than an art historian has to be a good painter, and (b) the subject of writing now tends to be literature, since that's what the professor is interested in.

High schools imitate universities. The seeds of our miserable high school experiences were sown in 1892, when the National Education Association "formally recommended that literature and composition be unified in the high school course."[4] The 'riting component of the 3 Rs then morphed into English, with the bizarre consequence that high school students now had to write about English literature-- to write, without even realizing it, imitations of whatever English professors had been publishing in their journals a few decades before.

It's no wonder if this seems to the student a pointless exercise, because we're now three steps removed from real work: the students are imitating English professors, who are imitating classical scholars, who are merely the inheritors of a tradition growing out of what was, 700 years ago, fascinating and urgently needed work.

No Defense

The other big difference between a real essay and the things they make you write in school is that a real essay doesn't take a position and then defend it. That principle, like the idea that we ought to be writing about literature, turns out to be another intellectual hangover of long forgotten origins.

It's often mistakenly believed that medieval universities were mostly seminaries. In fact they were more like law schools. And at least in our tradition lawyers are advocates, trained to take either side of an argument and make as good a case for it as they can. Whether cause or effect, this spirit pervaded early universities. The study of rhetoric, the art of arguing persuasively, was a third of the undergraduate curriculum.[5] And after the lecture the most common form of discussion was the disputation. This is at least nominally preserved in our present-day thesis defense: most people treat the words thesis and dissertation as interchangeable, but originally, at least, a thesis was a position one took and the dissertation was the argument by which one defended it.

Defending a position may be a necessary evil in a legal dispute, but it's not the best way to get at the truth, as I think lawyers would be the first to admit. It's not just that you miss subtleties this way. The real problem is that you can't change the question.

And yet this principle is built into the very structure of the things they teach you to write in high school. The topic sentence is your thesis, chosen in advance, the supporting paragraphs the blows you strike in the conflict, and the conclusion-- uh, what is the conclusion? I was never sure about that in high school. It seemed as if we were just supposed to restate what we said in the first paragraph, but in different enough words that no one could tell. Why bother? But when you understand the origins of this sort of "essay," you can see where the conclusion comes from. It's the concluding remarks to the jury.

Good writing should be convincing, certainly, but it should be convincing because you got the right answers, not because you did a good job of arguing. When I give a draft of an essay to friends, there are two things I want to know: which parts bore them, and which seem unconvincing. The boring bits can usually be fixed by cutting. But I don't try to fix the unconvincing bits by arguing more cleverly. I need to talk the matter over.

At the very least I must have explained something badly. In that case, in the course of the conversation I'll be forced to come up with a clearer explanation, which I can just incorporate in the essay. More often than not I have to change what I was saying as well. But the aim is never to be convincing per se. As the reader gets smarter, convincing and true become identical, so if I can convince smart readers I must be near the truth.

The sort of writing that attempts to persuade may be a valid (or at least inevitable) form, but it's historically inaccurate to call it an essay. An essay is something else.

Trying

To understand what a real essay is, we have to reach back into history again, though this time not so far. To Michel de Montaigne, who in 1580 published a book of what he called "essais." He was doing something quite different from what lawyers do, and the difference is embodied in the name. Essayer is the French verb meaning "to try" and an essai is an attempt. An essay is something you write to try to figure something out.

Figure out what? You don't know yet. And so you can't begin with a thesis, because you don't have one, and may never have one. An essay doesn't begin with a statement, but with a question. In a real essay, you don't take a position and defend it. You notice a door that's ajar, and you open it and walk in to see what's inside.

If all you want to do is figure things out, why do you need to write anything, though? Why not just sit and think? Well, there precisely is Montaigne's great discovery. Expressing ideas helps to form them. Indeed, helps is far too weak a word. Most of what ends up in my essays I only thought of when I sat down to write them. That's why I write them.

In the things you write in school you are, in theory, merely explaining yourself to the reader. In a real essay you're writing for yourself. You're thinking out loud.

But not quite. Just as inviting people over forces you to clean up your apartment, writing something that other people will read forces you to think well. So it does matter to have an audience. The things I've written just for myself are no good. They tend to peter out. When I run into difficulties, I find I conclude with a few vague questions and then drift off to get a cup of tea.

Many published essays peter out in the same way. Particularly the sort written by the staff writers of newsmagazines. Outside writers tend to supply editorials of the defend-a-position variety, which make a beeline toward a rousing (and foreordained) conclusion. But the staff writers feel obliged to write something "balanced." Since they're writing for a popular magazine, they start with the most radioactively controversial questions, from which-- because they're writing for a popular magazine-- they then proceed to recoil in terror. Abortion, for or against? This group says one thing. That group says another. One thing is certain: the question is a complex one. (But don't get mad at us. We didn't draw any conclusions.)

The River

Questions aren't enough. An essay has to come up with answers. They don't always, of course. Sometimes you start with a promising question and get nowhere. But those you don't publish. Those are like experiments that get inconclusive results. An essay you publish ought to tell the reader something he didn't already know.

But what you tell him doesn't matter, so long as it's interesting. I'm sometimes accused of meandering. In defend-a-position writing that would be a flaw. There you're not concerned with truth. You already know where you're going, and you want to go straight there, blustering through obstacles, and hand-waving your way across swampy ground. But that's not what you're trying to do in an essay. An essay is supposed to be a search for truth. It would be suspicious if it didn't meander.

The Meander (aka Menderes) is a river in Turkey. As you might expect, it winds all over the place. But it doesn't do this out of frivolity. The path it has discovered is the most economical route to the sea.[6]

The river's algorithm is simple. At each step, flow down. For the essayist this translates to: flow interesting. Of all the places to go next, choose the most interesting. One can't have quite as little foresight as a river. I always know generally what I want to write about. But not the specific conclusions I want to reach; from paragraph to paragraph I let the ideas take their course.

This doesn't always work. Sometimes, like a river, one runs up against a wall. Then I do the same thing the river does: backtrack. At one point in this essay I found that after following a certain thread I ran out of ideas. I had to go back seven paragraphs and start over in another direction.

Fundamentally an essay is a train of thought-- but a cleaned-up train of thought, as dialogue is cleaned-up conversation. Real thought, like real conversation, is full of false starts. It would be exhausting to read. You need to cut and fill to emphasize the central thread, like an illustrator inking over a pencil drawing. But don't change so much that you lose the spontaneity of the original.

Err on the side of the river. An essay is not a reference work. It's not something you read looking for a specific answer, and feel cheated if you don't find it. I'd much rather read an essay that went off in an unexpected but interesting direction than one that plodded dutifully along a prescribed course.

Surprise

So what's interesting? For me, interesting means surprise. Interfaces, as Geoffrey James has said, should follow the principle of least astonishment. A button that looks like it will make a machine stop should make it stop, not speed up. Essays should do the opposite. Essays should aim for maximum surprise.

I was afraid of flying for a long time and could only travel vicariously. When friends came back from faraway places, it wasn't just out of politeness that I asked what they saw. I really wanted to know. And I found the best way to get information out of them was to ask what surprised them. How was the place different from what they expected? This is an extremely useful question. You can ask it of the most unobservant people, and it will extract information they didn't even know they were recording.

Surprises are things that you not only didn't know, but that contradict things you thought you knew. And so they're the most valuable sort of fact you can get. They're like a food that's not merely healthy, but counteracts the unhealthy effects of things you've already eaten.

How do you find surprises? Well, therein lies half the work of essay writing. (The other half is expressing yourself well.) The trick is to use yourself as a proxy for the reader. You should only write about things you've thought about a lot. And anything you come across that surprises you, who've thought about the topic a lot, will probably surprise most readers.

For example, in a recent essay I pointed out that because you can only judge computer programmers by working with them, no one knows who the best programmers are overall. I didn't realize this when I began that essay, and even now I find it kind of weird. That's what you're looking for.

So if you want to write essays, you need two ingredients: a few topics you've thought about a lot, and some ability to ferret out the unexpected.

What should you think about? My guess is that it doesn't matter-- that anything can be interesting if you get deeply enough into it. One possible exception might be things that have deliberately had all the variation sucked out of them, like working in fast food. In retrospect, was there anything interesting about working at Baskin-Robbins? Well, it was interesting how important color was to the customers. Kids a certain age would point into the case and say that they wanted yellow. Did they want French Vanilla or Lemon? They would just look at you blankly. They wanted yellow. And then there was the mystery of why the perennial favorite Pralines 'n' Cream was so appealing. (I think now it was the salt.) And the difference in the way fathers and mothers bought ice cream for their kids: the fathers like benevolent kings bestowing largesse, the mothers harried, giving in to pressure. So, yes, there does seem to be some material even in fast food.

I didn't notice those things at the time, though. At sixteen I was about as observant as a lump of rock. I can see more now in the fragments of memory I preserve of that age than I could see at the time from having it all happening live, right in front of me.

Observation

So the ability to ferret out the unexpected must not merely be an inborn one. It must be something you can learn. How do you learn it?

To some extent it's like learning history. When you first read history, it's just a whirl of names and dates. Nothing seems to stick. But the more you learn, the more hooks you have for new facts to stick onto-- which means you accumulate knowledge at what's colloquially called an exponential rate. Once you remember that Normans conquered England in 1066, it will catch your attention when you hear that other Normans conquered southern Italy at about the same time. Which will make you wonder about Normandy, and take note when a third book mentions that Normans were not, like most of what is now called France, tribes that flowed in as the Roman empire collapsed, but Vikings (norman = north man) who arrived four centuries later in 911. Which makes it easier to remember that Dublin was also established by Vikings in the 840s. Etc, etc squared.

Collecting surprises is a similar process. The more anomalies you've seen, the more easily you'll notice new ones. Which means, oddly enough, that as you grow older, life should become more and more surprising. When I was a kid, I used to think adults had it all figured out. I had it backwards. Kids are the ones who have it all figured out. They're just mistaken.

When it comes to surprises, the rich get richer. But (as with wealth) there may be habits of mind that will help the process along. It's good to have a habit of asking questions, especially questions beginning with Why. But not in the random way that three year olds ask why. There are an infinite number of questions. How do you find the fruitful ones?

I find it especially useful to ask why about things that seem wrong. For example, why should there be a connection between humor and misfortune? Why do we find it funny when a character, even one we like, slips on a banana peel? There's a whole essay's worth of surprises there for sure.

If you want to notice things that seem wrong, you'll find a degree of skepticism helpful. I take it as an axiom that we're only achieving 1% of what we could. This helps counteract the rule that gets beaten into our heads as children: that things are the way they are because that is how things have to be. For example, everyone I've talked to while writing this essay felt the same about English classes-- that the whole process seemed pointless. But none of us had the balls at the time to hypothesize that it was, in fact, all a mistake. We all thought there was just something we weren't getting.

I have a hunch you want to pay attention not just to things that seem wrong, but things that seem wrong in a humorous way. I'm always pleased when I see someone laugh as they read a draft of an essay. But why should I be? I'm aiming for good ideas. Why should good ideas be funny? The connection may be surprise. Surprises make us laugh, and surprises are what one wants to deliver.

I write down things that surprise me in notebooks. I never actually get around to reading them and using what I've written, but I do tend to reproduce the same thoughts later. So the main value of notebooks may be what writing things down leaves in your head.

People trying to be cool will find themselves at a disadvantage when collecting surprises. To be surprised is to be mistaken. And the essence of cool, as any fourteen year old could tell you, is nil admirari. When you're mistaken, don't dwell on it; just act like nothing's wrong and maybe no one will notice.

One of the keys to coolness is to avoid situations where inexperience may make you look foolish. If you want to find surprises you should do the opposite. Study lots of different things, because some of the most interesting surprises are unexpected connections between different fields. For example, jam, bacon, pickles, and cheese, which are among the most pleasing of foods, were all originally intended as methods of preservation. And so were books and paintings.

Whatever you study, include history-- but social and economic history, not political history. History seems to me so important that it's misleading to treat it as a mere field of study. Another way to describe it is all the data we have so far.

Among other things, studying history gives one confidence that there are good ideas waiting to be discovered right under our noses. Swords evolved during the Bronze Age out of daggers, which (like their flint predecessors) had a hilt separate from the blade. Because swords are longer, the hilts kept breaking off. But it took five hundred years before someone thought of casting hilt and blade as one piece.

Disobedience

Above all, make a habit of paying attention to things you're not supposed to, either because they're "inappropriate," or not important, or not what you're supposed to be working on. If you're curious about something, trust your instincts. Follow the threads that attract your attention. If there's something you're really interested in, you'll find they have an uncanny way of leading back to it anyway, just as the conversation of people who are especially proud of something always tends to lead back to it.

For example, I've always been fascinated by comb-overs, especially the extreme sort that make a man look as if he's wearing a beret made of his own hair. Surely this is a lowly sort of thing to be interested in-- the sort of superficial quizzing best left to teenage girls. And yet there is something underneath. The key question, I realized, is how does the comber-over not see how odd he looks? And the answer is that he got to look that way incrementally. What began as combing his hair a little carefully over a thin patch has gradually, over 20 years, grown into a monstrosity. Gradualness is very powerful. And that power can be used for constructive purposes too: just as you can trick yourself into looking like a freak, you can trick yourself into creating something so grand that you would never have dared to plan such a thing. Indeed, this is just how most good software gets created. You start by writing a stripped-down kernel (how hard can it be?) and gradually it grows into a complete operating system. Hence the next leap: could you do the same thing in painting, or in a novel?

See what you can extract from a frivolous question? If there's one piece of advice I would give about writing essays, it would be: don't do as you're told. Don't believe what you're supposed to. Don't write the essay readers expect; one learns nothing from what one expects. And don't write the way they taught you to in school.

The most important sort of disobedience is to write essays at all. Fortunately, this sort of disobedience shows signs of becoming rampant. It used to be that only a tiny number of officially approved writers were allowed to write essays. Magazines published few of them, and judged them less by what they said than who wrote them; a magazine might publish a story by an unknown writer if it was good enough, but if they published an essay on x it had to be by someone who was at least forty and whose job title had x in it. Which is a problem, because there are a lot of things insiders can't say precisely because they're insiders.

The Internet is changing that. Anyone can publish an essay on the Web, and it gets judged, as any writing should, by what it says, not who wrote it. Who are you to write about x? You are whatever you wrote.

Popular magazines made the period between the spread of literacy and the arrival of TV the golden age of the short story. The Web may well make this the golden age of the essay. And that's certainly not something I realized when I started writing this.

Notes

[1] I'm thinking of Oresme (c. 1323-82). But it's hard to pick a date, because there was a sudden drop-off in scholarship just as Europeans finished assimilating classical science. The cause may have been the plague of 1347; the trend in scientific progress matches the population curve.

[2] Parker, William R. "Where Do College English Departments Come From?" College English 28 (1966-67), pp. 339-351. Reprinted in Gray, Donald J. (ed). The Department of English at Indiana University Bloomington 1868-1970. Indiana University Publications.

Daniels, Robert V. The University of Vermont: The First Two Hundred Years. University of Vermont, 1991.

Mueller, Friedrich M. "Letter to the Pall Mall Gazette, 1886/87." Reprinted in Bacon, Alan (ed). The Nineteenth-Century History of English Studies. Ashgate, 1998.

[3] I'm compressing the story a bit. At first literature took a back seat to philology, which (a) seemed more serious and (b) was popular in Germany, where many of the leading scholars of that generation had been trained.

In some cases the writing teachers were transformed in situ into English professors. Francis James Child, who had been Boylston Professor of Rhetoric at Harvard since 1851, became in 1876 the university's first professor of English.

[4] Parker, op. cit., p. 25.

[5] The undergraduate curriculum or trivium (whence "trivial") consisted of Latin grammar, rhetoric, and logic. Candidates for masters' degrees went on to study the quadrivium of arithmetic, geometry, music, and astronomy. Together these were the seven liberal arts.

The study of rhetoric was inherited directly from Rome, where it was considered the most important subject. It would not be far from the truth to say that education in the classical world meant training landowners' sons to speak well enough to defend their interests in political and legal disputes.

[6] Trevor Blackwell points out that this isn't strictly true, because the outside edges of curves erode faster.

Paul Graham is currently working on a new programming language called Arc. In 1995 he developed with Robert Morris the first web-based application, Viaweb, which was acquired by Yahoo in 1998. In 2002 he described a simple but effective Bayesian spam filter that inspired most current filters.

Paul is the author of On Lisp (Prentice Hall, 1993), ANSI Common Lisp (Prentice Hall, 1995), and Hackers & Painters (O'Reilly, 2004). He has an AB from Cornell and a PhD in Computer Science from Harvard, and studied painting at RISD and the Accademia di Belle Arti in Florence.

Thanks to Ken Anderson, Trevor Blackwell, Sarah Harlin, Jessica Livingston, Jackie McDonough, and Robert Morris for reading drafts of this.


© mmiv pg