Monday, May 31, 2004

You Are What You Eat

I caught Dr. Phil on the Larry King Show on CNN the other night. No wonder Dr. Phil is Oprah's favorite! He is a weight-loss expert. Just as Dr. Laura has no training in psychology, Dr. Phil is not a nutritionist! If this is (fair & balanced) debunking, so be it.



[x Harvard Magazine]
The Way We Eat Now
Ancient bodies collide with modern technology to produce a flabby, disease-ridden populace.
by Craig Lambert

Last year, Morgan Spurlock decided to eat all his meals at McDonald's for a month. For 30 straight days, everything he took in—breakfast, lunch, dinner, even his bottled water—came from McDonald's. Spurlock recorded the results on camera for his film Super Size Me, which won the Best Director prize for documentaries at this year's Sundance Film Festival. Super Size Me is also a kind of shock/horror movie, as viewers see the 33-year-old Spurlock's physical condition collapse, day by day. "My body just basically falls apart over the course of this diet," Spurlock told Newsweek. "I start to get tired, I start to get headaches; my liver basically starts to fill up with fat because there's so much fat and sugar in this food. My blood sugar skyrockets, my cholesterol goes up off the charts, my blood pressure becomes completely unmanageable. The doctors were like, 'You have to stop.'" In one month on the fast-food regime, he gained 25 pounds.

Spurlock's total immersion in fast food was a one-subject research study, and his body's response a warning about the way we eat now. "Super Size Me" could be a credo for the United States, where people, like their automobiles, have become gargantuan. "SUVs, big homes, penis enlargement, breast enlargement, bulking up with steroids—it's a context of everything getting bigger," says K. Dun Gifford '60, LL.B. '66, president of the Oldways Preservation and Exchange Trust, a nonprofit organization specializing in food, diet, and nutrition education.

Everywhere in the world, the richest people build the biggest homes, but as the world's wealthiest nation, the United States is also building the biggest bodies. It's hardly cause for patriotic pride. "We're leading a race we shouldn't want to win," says associate professor of pediatrics David Ludwig. Many foreigners already view Americans as rich, greedy over-consumers, stuffing themselves with far more than their share of the planet's resources, and obese American travelers waddling through international airports and hotel lobbies only reinforce that image. Yet our fat problem is becoming a global one as food corporations export our sugary, salty, fatty diet: Beijing has more than a hundred McDonald's franchises, which advertise and price the same food in the same way, and with the same level of success.

Two-thirds of American adults are overweight, and half of these are obese. (Overweight means having a body mass index, or BMI, of 25 or greater; obese means 30 or greater. To calculate BMI, a widely used measure, square your height in inches, divide your weight in pounds by that number, and multiply the result by 703. Or calculate it on-line at www.cdc.gov/nccdphp/dnpa/bmi/calc-bmi.htm.) Even adults in the upper end of the "normal" range, who have BMIs of 22 to 24, would generally live longer if they lost some fat; add in these people and it appears that "up to 80 percent of American adults should weigh less than they do," says Walter C. Willett, M.D., D.P.H. '80, Stare professor of epidemiology and nutrition at the School of Public Health.
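
For readers who want to check the arithmetic themselves, here is a minimal sketch of the BMI formula described above, written in Python; the function name and the sample height and weight are illustrative choices of mine, not figures from the article.

```python
def bmi(weight_lb, height_in):
    """Body mass index from pounds and inches: weight divided by height squared, times 703."""
    return 703.0 * weight_lb / (height_in ** 2)

# Example: a 5'10" (70-inch) adult weighing 180 pounds
print(round(bmi(180, 70), 1))  # ~25.8, just over the "overweight" threshold of 25
```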

The epidemic of obesity is a vast and growing public health problem. "Weight sits like a spider at the center of an intricate, tangled web of health and disease," writes Willett in Eat, Drink, and Be Healthy: The Harvard Medical School Guide to Healthy Eating, arguably the best and most scientifically sound book on nutrition for the general public. He notes that three aspects of weight—BMI, waist size, and weight gained after one's early twenties—are linked to chances of having or dying from heart disease, strokes and other cardiovascular diseases, diabetes, and several types of cancer, plus suffering from arthritis, infertility, gallstones, asthma, and even snoring. "Weight is much more important than serum cholesterol," Willett asserts; as a cause of premature, preventable deaths, he adds, excess weight and obesity rank a very close second to smoking, partly because there are twice as many fat people as smokers. In fact, since smokers tend to be leaner, the decrease in smoking prevalence has actually swelled the ranks of the fat.

The obesity epidemic arrived with astonishing speed. After tens of thousands of generations of human evolution, flab has become widespread only in the past 50 years, and waistlines have ballooned exponentially in the last two decades. In 1980, 46 percent of U.S. adults were overweight; by 2000, the figure was 64.5 percent: nearly a one-percentage-point annual increase in the ranks of the fat. At this rate, by 2040, 100 percent of American adults will be overweight and "it may happen more quickly," says John Foreyt of Baylor College of Medicine, who spoke at a conference organized by Gifford's Oldways group in 2003. Foreyt noted that, 20 years ago, he rarely saw 300-pound patients; now they are common. Childhood obesity, also once rare, has mushroomed: 15 percent of children between ages six and 19 are now overweight, and even 10 percent of those between two and five. "This may be the first generation of children who will die before their parents," Foreyt says.

Lifestyles of the Rich and Gluttonous

Weight gain, loss, and regulation are marvelously complex, but certain simple principles stand out. Like CICO: calories in, calories out. When the human body takes in more energy than it expends, it stores the excess as fat. Today, Americans eat 200 calories more food energy per day than they did 10 years ago; that alone would add 20 pounds annually to one's bulk. All demographic segments are fattening up, but the growth in adipose tissue isn't random. "The highly educated have only half the level of obesity of those with lower education," Willett says. A recent paper in the American Journal of Clinical Nutrition argued that the poor tend toward greater obesity because eating energy-dense, highly palatable, refined foods is cheaper per calorie consumed than buying fish and fresh fruits and vegetables. At the Oldways conference, Foreyt noted that 80 percent of African-American females are overweight, and that Hispanic women were the second-heaviest group. "The last to fatten will be rich white women," he observed.
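
A rough sketch of that calories-in, calories-out arithmetic appears below. It assumes the common rule of thumb that a pound of body fat stores roughly 3,500 calories; that conversion factor, and the idea that a fixed daily surplus accumulates linearly, are simplifying assumptions of mine rather than claims made in the article.

```python
KCAL_PER_LB_FAT = 3500  # common rule of thumb; an assumption, not a figure from the article

def annual_gain_lb(daily_surplus_kcal):
    """Naive yearly weight gain if a fixed daily calorie surplus were stored entirely as fat."""
    return daily_surplus_kcal * 365 / KCAL_PER_LB_FAT

print(round(annual_gain_lb(200), 1))  # ~20.9 lb, in line with the "20 pounds annually" cited above
print(round(annual_gain_lb(126), 1))  # ~13.1 lb, matching the fast-food figure cited later in the article
```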

One explanation for our slide into overconsumption is that "the character of modern Americans is somehow inherently weak and we are incapable of discipline," says Ludwig. "The food industry would love to explain obesity as a problem of personal responsibility, since it takes the onus off them for marketing fast food, soft drinks, and other high-calorie, low-quality products."

Personal responsibility surely does play a role, but we also live in a "toxic environment" that in many ways discourages healthy eating, says Ludwig. "There's the incessant advertising and marketing of the poorest quality foods imaginable. To address this epidemic, you'd want to make healthful foods widely available, inexpensive, and convenient, and unhealthful foods relatively less so. Instead, we've done the opposite."

Never in human experience has food been available in the staggering profusion seen in North America today. We are awash in edibles shipped in from around the planet; seasonality has largely disappeared. Food obtrudes itself constantly, seductively, into our lives—on sidewalks, in airplanes, at gas stations and movie theaters. "Caloric intake is directly related to gross national product per capita," says Moore professor of biological anthropology Richard Wrangham. "It's very difficult to resist the temptation to take in more calories if they are available. People keep regarding it as an American problem, but it's a global problem as countries get richer." Still, the lavish banquet's first seating is right here in the United States of America.




Surrounded by bits of primate anatomy, Richard Wrangham holds the skull of a chimpanzee. Note the size of the chimp's jaws and teeth. Portrait by Jim Harrison


"The French explanation for why Americans are so big is simple," said Jody Adams, chef/partner of Rialto, a restaurant in Harvard Square, speaking at the Oldways conference. "We eat lots of sugar, and we eat between meals. In France, no one gets so fat as to sue the restaurant!" Indeed, the national response to our glut of comestibles is apparently to eat only one meal a day—all day long. We eat everywhere and at all times: at work, at play, and in transit. "Japanese cars—the ones sold in Japan—don't have drink holders," New York Times health columnist Jane Brody said at the Oldways conference. "The Japanese don't eat and drink in their cars."

Steven Gortmaker, professor of society, human development, and health at the School of Public Health, observes that the convenience-food culture is so ubiquitous that even conscientious parents have trouble steering their children away from junk food. "You let your kids go on a 'play date,'" says the father of two, "and they come home and say, 'We went to Burger King for lunch.'" (He notes that on any given day, 30 percent of American children aged four to 19 eat fast food, and older and wealthier ones eat even more. Overall, 7 percent of the U.S. population visits McDonald's each day, and 20 to 25 percent eat in some kind of fast-food restaurant.) But taking the family to McDonald's for, say, Chicken McNuggets, French fries, and a sugar-sweetened beverage—a meal loaded with calories, salt, trans fats (the most unhealthy, artery-clogging fats of all, typified in "partially hydrogenated" oils), fried foods, starch, and sugar—makes Gortmaker shake his head. "I can't imagine a worse meal for kids," he says. "They call this a 'Happy Meal'?"

Humans can eat convenient, refined, highly processed food with great speed, enabling them to consume an astonishing caloric load—literally thousands of calories—in minutes. Gortmaker, Ludwig, and colleagues did research comparing caloric intake on days when children ate in a fast-food restaurant to days when they did not; the children took in 126 more calories on fast-food days, which could translate into a weight gain of 13 pounds per year on fast food alone.

Pumping up portion size makes good business sense, because the cost of ingredients like sugar and water for a carbonated soda is trivial, and customers perceive the larger amount as delivering greater value. "When you have calories that are incredibly cheap, in a culture where 'bigger is better,' that's a dangerous combination," says Walter Willett. "The French aren't so interested in the amount of food; they are more concerned with its quality. But feeling stuffed and loosening your belt has high value in American culture. We eat as if every meal is a festival." Willett recalls seeing neighboring French and German restaurants on a trip to Basel, Switzerland, several years ago. "The German restaurant was piling big mountains of sausages and potatoes on the plates," he says. "The French place had a delicately broiled trout and three beautifully presented spears of asparagus. In the United States we have adopted the mainstream Anglo-German eating culture: lots of meat and potatoes."




Walter Willett with a vegetable salad at Sebastian's Café, at the School of Public Health. He advised the cafeteria on healthy choices for its menu. Portrait by Jim Harrison


Furthermore, "Portion sizes have increased dramatically since the 1950s," says Beatrice Lorge Rogers '68, professor of economics and food policy at Tufts University's Friedman School of Nutrition Science and Policy. For proof, consider a 1950s advertising jingle: "Pepsi-Cola hits the spot/12 full ounces, that's a lot." Well, it's not a lot any more. For decades, 12 ounces (itself a move up from earlier 6.5- and 10-ounce bottles) was the standard serving size for soft drinks. But since the 1970s, soft drink bottles have grown to 20 and 24 ounces; today, even one-liter (33.8 ounce) bottles are marketed as "single servings." It doesn't stop there. The 7-11 convenience store chain offers a Double Gulp cup filled with 64 ounces of ice and soda: a half-gallon "serving." Surely, the 128-ounce Gallon Guzzle is on the horizon.

The Technology of Appetite

Soft drinks are becoming America's favorite breakfast beverage, and specialty sandwiches and burritos for breakfast are fast-growing items, part of the trend toward eating out for all meals. The restaurant industry—which employs 12 million workers (second only to government) and has projected sales of $440.1 billion this year, according to its national association—ranks among the nation's largest businesses. Today, Americans spend 49 cents of every food dollar on food eaten outside the home, where, according to Rogers, they consume 30 percent of their calories. That includes take-out food (which some parts of the restaurant industry now style as "home meal replacement").

This represents a drastic change from the 1950s, when people ate far more of their meals at home, with their families, and at a leisurely pace. "A hundred years ago there was no such thing as a snack food—nothing you could pop open and overeat," says Mollie Katzen, author of The Moosewood Cookbook and many others, and a consultant to Harvard Dining Services. "There were stew pots. Things took a long time to cook, and a meal was the result of someone's labor."

The 1950s were also an era in which the kitchen—not the television room—was the heart of the home. "In some ways, you can see obesity as the tip of the iceberg, sitting on top of huge societal issues," says Willett. "There are enormous pressures on homes with both the husband and wife in the work force. One reason things need to be fast is that Mom is not at home preparing meals and waiting for the kids to come home from school any more. She is out there in the office all day, commuting home, and maybe working extra hours at night. This means heating something in the microwave or hitting the drive-through at McDonald's. There really is a time issue—people do have less time. Yet, look at the number of hours spent watching television. Somehow we've lost an element of creativity and control over our lives. All too many people have become passive."

Technology may have entrenched that passivity, while making food preparation easier and faster. Three Harvard economists, professors of economics Edward Glaeser and David Cutler, and graduate student Jesse Shapiro, argued in a recent paper that improved technology has cut the time needed to prepare food, allowing us to eat more conveniently. For example, in 1978, they note, only 8 percent of homes had microwave ovens, but 83 percent do today. Food that once took hours to prepare is now "nuked" in minutes.

Technology can also change what we eat. Potatoes used to be baked, boiled, or mashed; the labor involved in peeling, cutting, and cooking French fries meant that few home cooks served them, the economists point out. But now factories prepare potatoes for frying and ship them to fast-food outlets or freeze them for microwave cooking at home. Americans ate 30 percent more potatoes between 1977 and 1995, most of that increase coming in the form of French fries and potato chips. In general, technology has enabled the food industry to do more of the work of preparing and cooking what we eat, increasing the proportion of processed victuals in the nation's diet. Frequently, processing also folds in more ingredients; russet potatoes, for example, contain no added salt or oil, though most potato chips do.

But the most powerful technology driving the obesity epidemic is television. "The best single behavioral predictor of obesity in children and adults is the amount of television viewing," says the School of Public Health's Gortmaker. "The relationship is nearly as strong as what you see between smoking and lung cancer. Everybody thinks it's because TV watching is sedentary, you're just sitting there for hours—but that's only about one-third of the effect. Our guesstimate is that two-thirds is the effect of advertising in changing what you eat." Willett asserts, "You can't expect three- and four-year-olds to make decisions about the long-term consequences of their food choices. But every year they are subjected to intensive and increasingly polished messages promoting foods that are almost entirely junk." (Furthermore, in some future year when the Internet merges with broadband cable TV, advertisers will be able to target their messages far more precisely. "It won't be just to kids," Gortmaker says. "It'll be to your kid.")

Within our laissez-faire system of food supply, the food vendors' actions aren't illegal, or even inherently immoral. "The food industry's major objective is to get us to intake more food," says Gortmaker. "And the TV industry's objective is to get us to watch more television, to be sedentary. Advertising is the action that keeps them both successful. So you've got two huge industries being successful at what they are supposed to do: creating more intake and less activity. And since larger people require more food energy just to sustain themselves, the food industry is growing a larger market for itself."

That industry spends billions of dollars on research, says Willett. "They have carefully researched the exact levels of sweetness and saltiness that will make every food as attractive as possible," he explains. "Each company is putting out its bait, trying to make it more attractive than its competitors. Food industry science is getting better, more refined, and more powerful as we go along. They do good science—they don't throw their money down the drain. What we spend on nutrition education is only in the tens of millions of dollars annually. There's a huge imbalance, and it tips more and more in favor of the food industry every year. Food executives like to say, 'Just educate the consumer—when they create the demand for healthier food, we'll supply it!' That's a bit disingenuous when you consider that they are already spending billions to 'educate' consumers."

Motionless America

The Old Order Amish of Ontario, Canada, have escaped much of that advertising, and the TV viewing as well. They have an obesity rate of 4 percent, less than one-seventh the U.S. norm. Yet the Amish eat heartily, and not all health food: pancakes, ham, cake, and milk—but also ample amounts of fresh fruits and vegetables. It seems that the secret to the "Amish paradox" is their low-technology lifestyle, which entails vastly more physical activity than its modern correlate. David R. Bassett, a professor of exercise science at the University of Tennessee, gave pedometers to 98 of these Amish adults and found that the men averaged 18,000 steps per day, the women 14,000—about nine miles and seven miles, respectively. The Amish men averaged 10 hours a week of vigorous activities like shoveling or tossing bales of hay (women, 3.5 hours) and 43 hours of moderate exertion like gardening or doing laundry (women, 39 hours).

"The Amish are not freaks," says professor of anthropology Daniel Lieberman, a skeletal biologist. "They are just anachronisms. Human beings are adapted for endurance exercise. We evolved to be long-distance runners—running a marathon is not a freak activity. We can outrun just about any other creature."

Though only a few pockets of hunter-gatherers remain on Earth, for the first couple of million years of our species' evolution—99.5 percent of the human experience—all people sustained themselves by hunting animals and gathering food from wild plants. Agriculture arose only 10,000 to 12,000 years ago, permitting more stable settlements and food supplies. Hunter-gatherers spend much of every day traveling: "Who ever heard of a sedentary hunter-gatherer?" asks Lieberman, laughing. (There were a few sedentary hunter-gatherers, he notes—in the Pacific Northwest where salmon ran plentifully.) But although humans are designed to be highly active, the chronic ailments of sedentary life and obesity, like diabetes and heart disease, typically turn fatal only when people are past reproductive age. Thus, natural selection doesn't weed out couch potatoes.

Since the Industrial Revolution, and particularly in the last half-century, technology has enabled us to conduct an increasingly immobile daily life. In Benjamin Franklin's time, virtually all Americans were farmers. Even a century later, before the invention of the automobile, many farmed or at least used their bodies vigorously every day. Walter Willett's family has been involved in dairy farming in Michigan for many generations, and he himself was a 4-H member who grew award-winning vegetables as a young man. "At higher levels of activity, people seem to balance their caloric intake and expenditure extremely well," he says. "If our grandparents were farmers, they were moving all day long—not jogging for an hour, but staying active eight to 12 hours a day. Physically, I'm very active myself, probably in the upper 5 percent, but I'm still very inactive compared with my grandfather.

"The way we do our work has changed, and so has the way we spend our leisure time," he continues. "The average number of television hours watched per week is close to a full-time job! People used to go for walks and visit their neighbors. Much of that is gone as well." Not only do many adults spend their work lives in front of computer screens, but the design of public spaces outside their offices eliminates physical activity. In skyscrapers, it's often hard to find the stairs; electronic sensors in public restrooms are eliminating even the most minimal actions of flushing toilets or turning faucets on and off.

Cities are designed for automobiles, not for healthier ways of getting about like walking or bicycling. "In fact, we've made it dangerous and unattractive to do so," says Willett, recalling a symposium on urban environments that the School of Public Health held with the Graduate School of Design: "For the architects, designing spaces to encourage physical activity wasn't even on the table." (Even so, cities tend to have lower rates of obesity than suburbs or rural areas. Few residents of Manhattan, for example, own cars. The density of the urban landscape allows one to walk to the drug store, subway, or dry cleaner.)

Furthermore, modern children "don't have to forage or walk long distances," says Lieberman. "Kids today sit in front of a TV or computer. They ride to school on a school bus. We even have them rolling their school backpacks on wheels because we are afraid of them overloading their backbones."

In sum, we no longer live like hunter-gatherers, but we still have hunter-gatherer genes. Humans evolved in a state of ceaseless physical activity; they ate seasonally, since there was no other choice; and frequently there was nothing to eat at all. To get through hard winters and famines, the human body evolved a brilliant mechanism of storing energy in fat cells. The problem, for most of humanity's time on Earth, has been a scarcity of calories, not a surfeit. Our fat-storage mechanism worked beautifully until 50 to 100 years ago. But since then, "The speed of environmental change has far surpassed our ability to adapt," says Dun Gifford of Oldways. Our bodies were not designed to handle so much caloric input and so little energy outflow. "There are many forces," Willett says, "and all are pushing in the wrong direction simultaneously."

Darwinian Dietetics

Different scholars and popular writers have argued that human beings have "evolved" to be carnivores, herbivores, frugivores, or omnivores, but anthropologist Richard Wrangham says we are "cookivores," grinning at the neologism. "We evolved to eat cooked foods," he declares. "Raw food eating is never practiced systematically anywhere in the world."

Wrangham spent four years trying to disprove that last statement in a global investigation of current and historical cultures. He looked for the most extreme examples of people eating a pure raw-food diet, but failed to find any, "except for people in urban settings who were philosophically committed to raw food," he says. One researcher studied several hundred German raw-foodists, who had access to food of "astonishingly high quality" relative to wild raw foods, says Wrangham. Nonetheless, 25 percent of this group was chronically underweight, and 50 percent of the females "were so low in energy that they stopped having menstrual periods," he says. So even under exceptionally good conditions of superb year-round food availability, people had low energy and were "biologically incapable of appropriate reproduction," says Wrangham. From an evolutionary point of view, sterility gets you bounced from the gene pool.

The genus Homo appeared about two million years ago, and even "the most skeptical archaeologist" will agree that fire was being controlled in southern Europe between 300,000 and 400,000 years ago, says Wrangham. Sound evidence of fireplaces dating from 380,000 years ago exists, for example, at Terra Amata in France, near Nice; other sites have earth ovens dug into cave floors. "Many regard this as the first evidence of cooking," he says, "but to me, this is rather sophisticated stuff, and is probably the earliest evidence we have of something that very likely was going on before."

Wrangham takes an extreme position: he postulates that cooking food over fires began by about 1.6 million years ago, and was an innovation so important that it allowed the evolution of Homo erectus, the earliest hominid to resemble modern humans (see "Primal Kitchens," November-December 2000, page 13). "Cooking enabled these animals—the very earliest erectus—to acquire their food more efficiently and to get more of it," he says. "A principal reason was that it made food softer."

Softer food has many implications. Imagine what a nonhuman, raw-food-eating primate like a chimpanzee consumes in one day. "It's a great big pile of leaves, seeds, and roots," Wrangham explains, gesturing with his hands to suggest a mound the size of a small shrub. Humans, with generally larger bodies, nonetheless fuel themselves with a far smaller volume of food. "Compared with other primates, we are evolved to eat foods of high caloric density—meats, roots, seeds," he says. Cooking makes this possible by changing the brittleness of collagen fiber, softening it and making meat far easier to chew. "People who think that meat dominated the diet of early Homo may well be right," he says, "but they would have to have spent five hours a day just chewing. Raw meat is very hard to chew, and presumably raw wild meat is even harder."

Consider again the chimpanzees, who spend as much time eating as one would expect for primates of their size and weight (100 to 120 pounds). "In primates, there's a nice relation between body weight and the amount of time spent eating," Wrangham explains. Chimps spend about six hours a day chewing. Humans, who typically weigh more than chimpanzees, should theoretically eat more and spend even more time at it. Instead, data from 15 cross-cultural studies indicate that on average, human beings spend about one hour a day chewing food.

Chimps' jaws and teeth are bigger than ours, and they like to eat meat—they will work hard to get it—"but they can't chew meat at all fast," Wrangham says. "The rate at which they chew and swallow meat is equivalent to the way they eat fruits: 300 to 400 calories per hour." In contrast, humans eating cooked, softened food of high caloric density can take in 2,000 calories during their daily hour of chewing and swallowing.

Cooking might be considered the first food-processing technology, and like its successors, it has had profound effects on the human body, as in the growth of bones. Various signals influence human growth; some come from genes, and others come from the environment, particularly for the musculo-skeletal system, whose job is engaging with the environment. Less chewing of cooked food, for example, has altered the anatomy of our skulls, jaws, faces, and teeth. "Chewing is a major activity that involves muscular forces," says skeletal biologist Daniel Lieberman. "It has incredible effects on how the skull grows." Chewing can transform anatomy rather quickly; in one study, in which Lieberman fed pigs a diet of softened food, in a matter of months their skulls developed shorter and narrower dimensions and their snouts developed thinner bones than those of pigs eating a hard-food diet.

The same thing happens with human beings. "Since the beginning of the fossil record, humans have become much more gracile," Lieberman says. "Our bones have become thinner, our faces smaller, and our teeth smaller—especially permanent teeth—although we have the same number of teeth. More recently, with the Industrial Revolution, people have become more sedentary; they interact with their environment in a less forceful way. We load our bones less and the bones become thinner. Osteoporosis is a disease of industrialism."

In today's world, where we not only cook but eat a great deal of processed food that has been ground up before it reaches our mouths, we don't generate as much force when chewing. In fact, for millennia human food has been growing less tough, fibrous, and hard. "The size of the human face has gotten about 12 percent smaller since the Paleolithic," Lieberman says, "particularly around the oral cavity, due to the effects of mechanical loading on the size of the face. Fourteen thousand years ago, a much larger proportion of the face was between the bottom of the jaw and the nostrils." The size of teeth has not decreased as fast (genetic factors control more of their variation); hence, modern teeth are actually too big for our mouths—wisdom teeth become impacted and require extraction.

The health hazards of sedentary life seem like an adult problem, but actually, the skeletal system is most responsive to loading when it is immature. There is only one window for accumulating bone mass—during the first two decades of life. "Peak bone mass occurs at the end of adolescence," Lieberman explains, "and we lose bone steadily thereafter. Kids who are active grow more robust bones. If you're sedentary as a juvenile, you don't grow as much bone mass—so as you get older and lose bone mass, you drop below the threshold for osteoporosis." Furthermore, females get osteoporosis more readily than men because they start with less adult bone mass; as life spans lengthen, says research fellow in cell biology Jennifer Sacheck, of Harvard Medical School, older men will also begin showing symptoms of osteoporosis.

Weight-bearing exercise only slows the rate of bone loss for adults; pre-adolescent bone growth is far more important to long-term skeletal strength. Hence, the sedentary lifestyles of today's youngsters—and the cutbacks on school physical-education programs—may be sowing the seeds of widespread skeletal breakdown as their cohort matures.

Sweet Tooth Bites the Hand That Feeds It

The dramatic upsurge in consumption of carbonated soft drinks, paired with the simultaneous decline in milk drinking, may also weaken future bones. Both milk (lactose) and soda (sucrose, fructose) are sweet, but soda is sweeter, and today's consumers are hooked on sugar. "We probably evolved our sense of sweetness to detect subtle amounts of carbohydrates in foods, because they provide energy," says Walter Willett. "But now the expectations of sweetness have been ratcheted up. A product is not deemed attractive if it is not as sweet as its competitor." Sugars added to foods made up 11 percent of the calories in American diets in the late 1970s; today they are 16 percent.

Humans did not always have such a sweet tooth. Our hormones and metabolism have remained essentially unchanged for the past 100,000 years, 90,000 of which were spent as hunter-gatherers. Grains, the source of products such as bread, baked goods, and corn syrup, did not become plentiful in the human diet until the establishment of agriculture.

With agriculture, human health declined, says Lieberman, partly because farming is such hard work, and partly because it allows higher population densities, in which infection spreads more easily. "There was more disease, a decrease in body size, higher mortality rates among juveniles, and more stress lines in bones and teeth," Lieberman says. Cultivating grain also allowed farmers to space their children more closely. Hunter-gatherers have long intervals between births, because they do not wean children until age four or five, when teeth are ready to chew hard foods. ("You can't feed babies beef jerky," jokes Lieberman.) Farmers, however, can make gruel—a high-calorie mush of roots or grains like millet, taro, or oats that doesn't require chewing—and wean children much sooner.

So grain farming allowed bigger families and has changed the human situation in endless ways. But while people have eaten grains for a hundred centuries, until the last half-century, most grains consumed were not heavily processed. "In the last 50 years, the extent of processing has increased so much that prepared breakfast cereals—even without added sugar—act exactly like sugar itself," says pediatrics specialist David Ludwig. "As far as our hormones and metabolism are concerned, there's no difference between a bowl of unsweetened corn flakes and a bowl of table sugar. Starch is 100 percent glucose [table sugar is half glucose, half fructose] and our bodies can digest it into sugar instantly.

"We are not adapted to handle fast-acting carbohydrates," Ludwig continues. "Glucose is the gold standard of energy metabolism. The brain is exquisitely dependent on having a continuous supply of glucose: too low a glucose level poses an immediate threat to survival. [But] too high a level causes damage to tissues, as with diabetes. The body is designed to keep blood glucose within a tight range, and it does this beautifully, even with extreme nutrient ratios: we can survive indefinitely on a diet of 60 percent carbohydrates and 20 percent fat, or 20 percent carbohydrates and 60 percent fat. But we never [before] had to assimilate a heavy dose of high-glycemic carbohydrates."

In 1981, David Jenkins, a professor of nutrition at the University of Toronto, led a team that tested various foods to determine which were best for diabetics. They developed a "glycemic index" that ranked foods from 0 to 100, depending on how rapidly the body turned them into glucose. This work overturned some established bromides, such as the distinction between "simple" and "complex" carbohydrates: a baked russet potato, for example, traditionally defined as a complex carbohydrate, has a glycemic rating of 85 (±12; studies vary) whereas a 12-ounce can of Coca-Cola appears on some glycemic indices at 63.

Eating high-glycemic foods dumps large amounts of glucose suddenly into the bloodstream, triggering the pancreas to secrete insulin, the hormone that allows glucose to enter the body's cells for metabolism or storage. The pancreas over-responds to the spike in glucose—a more rapid rise than a hunter-gatherer's bloodstream was likely to encounter—and secretes lots of insulin. But while high-glycemic foods raise blood sugar quickly, "they also leave the gastrointestinal tract quickly," Ludwig explains. "The plug gets pulled." With so much insulin circulating, blood sugar plummets. This triggers a second wave of hormones, including stress hormones like epinephrine. "The body puts on the emergency brakes," says Ludwig. "It releases any stored fuels—the liver starts releasing glucose. This raises blood sugar back into the normal range, but at a cost to the body."

One cost, documented by studies at the School of Public Health, is that going through this kind of physiologic stress three to five times per day doubles the risk of heart attacks. Another cost is excess hunger. The precipitous drop in blood sugar triggers primal mechanisms in the brain: "The brain thinks the body is starving," Ludwig explains. "It doesn't care about the 30 pounds of fat socked away, so it sends you to the refrigerator to get a quick fix, like a can of soda."

Glycemic spikes may underlie Ludwig and Gortmaker's finding, published in the Lancet two years ago, that each additional daily serving of a sugar-sweetened beverage multiplies the risk of obesity by 1.6. Some argue that people compensate for such sugary intake by eating less later on, to balance it out, but Ludwig asserts, "We don't compensate well when calories come in liquid form. The meal has to go through your gut, where the brain gets satiety signals that slow you down. On the other hand, you could drink a 64-ounce soft drink before you knew what hit you."

Since humans can take in large amounts of food in a short time, "we are adapted to receiving much higher glycemic loads than other primates," says Richard Wrangham, speculating that nonhuman primates may be poor models for research on human diabetes because they have a different insulin system. The only component of the hunter-gatherer diet likely to cause extreme insulin spikes is honey, which Wrangham feels "is likely to have been very important, at least seasonally, for our ancestors. Chimpanzees love honey and modern hunter-gatherers take in tremendous amounts of it. People have been seen eating as much as four pounds at a sitting."

We don't know how often such honey binges occurred in the distant past; Ludwig opines that finding a beehive was "a very infrequent event" for early humans. What is certain is that hunter-gatherers never experienced anything like the routine daily glucose-insulin cycles that characterize a modern diet loaded with refined sugars and starches. Constantly buffeted by these insulin surges, over time the body's cells develop insulin resistance, a decreased response to insulin's signal to take in glucose. When the cells slam their doors shut, high levels of glucose keep circulating in the bloodstream, prompting the pancreas to secrete even more insulin. This syndrome can turn into an endocrine disorder called hyperinsulinemia that sets the stage for Type II, or adult-onset, diabetes, which has become epidemic in recent years.

A Chicken in Every Potbelly

Ironically, U.S. government agencies' attempts to deal with obesity during the last three decades—encouraging people to eat less fat and more carbohydrates, for example—actually may have exacerbated the problem. Take the Department of Agriculture's (USDA) Food Guide Pyramid, first promulgated in 1992. The pyramid's diagram of dietary recommendations is a familiar sight on cereal boxes—hardly a coincidence, since the guidelines suggest six to 11 servings daily from the "bread, cereal, rice, and pasta" group. The USDA recommends eating more of these starches than any other category of food. Unfortunately, such starches are nearly all high-glycemic carbohydrates, which drive obesity, hyperinsulinemia, and Type II diabetes. "At best, the USDA pyramid offers wishy-washy, scientifically unfounded advice on an absolutely vital topic—what to eat," writes Willett in Eat, Drink, and Be Healthy. "At worst, the misinformation contributes to overweight, poor health, and unnecessary early deaths."

Note that the pyramid comes from the Department of Agriculture, not from an agency charged with promoting health, like the National Institutes of Health or the Department of Health and Human Services (DHHS). The USDA essentially promotes and regulates commerce, and its pyramid (currently under revision; expect a new version in 2005) was the focus of intensive lobbying and political struggle by agribusinesses in the meat, sugar, dairy, and cereal industries, among others.

Food is the most essential of all economic goods. Fifty percent of the world's assets, employment, and consumer expenditures belong to the food system, according to Harvard Business School's Ray Goldberg, Moffett professor of agriculture and business emeritus. (In the United States, 17 percent of employment is in what Goldberg calls the "value-added food chain.") He adds that "7 percent of the farmers produce 80 percent of the food—and do it on one-third of the land in cultivation. In the United States, half the net income of farmers comes from the government, in forms like price supports and land set-asides." The food industry is huge and exerts enormous influence on government policy.

Consider the flap that arose after the United Nations' World Health Organization (WHO) and Food and Agriculture Organization issued a report in 2003 recommending guidelines for eating to improve world nutrition and prevent chronic diseases. Instead of applauding the report, the DHHS issued a 28-page, line-by-line critique and tried to get WHO to quash it. WHO recommended that people limit their intake of added sugars to no more than 10 percent of calories eaten, a guideline poorly received by the Sugar Association, a trade group that has threatened to pressure Congress to challenge the United States' $406 million contribution to WHO.

Clearly, some food industries have for many years successfully influenced the government in ways that keep the prices of certain foods artificially low. David Ludwig questions farm subsidies of "billions to the lowest-quality foods"—for example, grains like corn ("for corn sweeteners and animal feed to make Big Macs") and wheat ("refined carbohydrates.") Meanwhile, the government does not subsidize far healthier items like fruits, vegetables, beans, and nuts. "It's a perverse situation," he says. "The foods that are the worst for us have an artificially low price, and the best foods cost more. This is worse than a free market: we are creating a mirror-world here."

Governmental policies like cutting school budgets by dropping physical education programs may also prove to be a false economy. "Supposedly, in the richest, most powerful nation on earth, we can't afford physical-education programs for our kids," says Willett. "That's really obscene. Instead, we'll be spending $100 billion on the consequences. We simply have to make these investments." Ludwig concurs. "There's fast food sold in school cafeterias, soft drinks and candies in school vending machines, and advertising in classrooms on Channel One. Meanwhile there are cutbacks in physical education, as if it were a luxury. What was once daily and mandatory is now infrequent and optional."

Curing the Edible Complex

The food industry itself has begun to make certain investments in the direction of healthier eating. "In the future, I see a convergence between food and health," says Goldberg. "The food industry has been warned of the backlash that could hit them, like it did tobacco." He suggests that the food industry will become more responsive to consumers' health concerns regarding issues like bioengineered ingredients in foodstuffs. People "want a diversity of sources for their food, and traceability of sources," he says. "The bar code will become a vehicle not just for pricing, but for describing and listing ingredients."

Even fast-food chains are changing; in the past year, they reported a 16 percent growth in servings of main-dish salads. Willett sees no reason why healthy eating should not be as delicious and attractive as junk food, and the franchisers may be headed that way as well. McDonald's is currently testing an adult meal that includes a pedometer and a "Step With It" booklet along with any entrée salad. In its kids' meals, Wendy's is trying out fruit cups with melon slices instead of French fries. Yogurt manufacturer Stonyfield Farm has launched a chain of healthful fast-food restaurants called O'Naturals. And Dun Gifford has an answer for parents who say, "My kids won't eat anything but Doritos." A mother he knows puts out an after-school snack platter of sliced apples, grapes, raisins, nuts, and tangerine sections. "The kids don't complain at all," he says. "Or even notice."




Dun Gifford tosses a tomato amid Mediterranean staples like pasta and olive oil—which his Oldways Foundation recommends for healthy eating—at Formaggio Kitchen, a specialty food store in Cambridge. Portrait by Jim Harrison


Doritos themselves are getting healthier. Fitness expert Kenneth Cooper, M.P.H. '62, founder of the Cooper Aerobics Center in Dallas, has been working with PepsiCo's CEO, Steven S. Reinemund, to develop new products and modify existing items in a healthier direction. The company's Frito-Lay unit last year eliminated trans fats from its salty offerings and introduced organic, healthier versions of Doritos and Cheetos under its Natural sub-brand. "As a result, 55 million pounds of trans fats will be removed from the American diet over the next 12 months," Cooper says. "It cost $37 million to retool—and it was done without a price increase. PepsiCo is in 150 countries, and many of their healthier products will soon be promoted throughout the world. Physical fitness is good business for the individual and for the corporation."

PepsiCo sells plenty of food and beverages from vending machines, many of them in schools. "You don't resolve the obesity problem in children by taking the vending machines out of schools," Cooper declares. "Kids will still get what they want. Put better products in the machines and get physical education back in the schools." Accordingly, PepsiCo is stocking some school machines with fruit juices from its Tropicana and Dole brands, Gatorade, and Aquafina bottled water; others offer Frito-Lay products that meet Cooper's "Class I" standard: no trans fats and restricted amounts of calories, fat, saturated fat, and sodium.

Parents need to create and enforce some Class I standards of their own. "We have got to stop being afraid of our children, and tell them what to eat," said Washington Post writer Judith Weinraub at the 2003 Oldways conference. Steven Gortmaker, too, has some simple counsel for parents. First, limit children's television viewing; the American Academy of Pediatrics recommends no more than two hours daily. Second, no TV in the room where the kids sleep. "Sixty percent of American children—including 25 percent of those between birth and age two—have televisions in their bedrooms, and they average an extra daily hour of viewing there," says Gortmaker. "Parents don't control that viewing."

Ironically, or perhaps fittingly, the television and advertising industries, so much a part of the obesity problem, may also be part of its solution. "The business of advertising junk food is seduction," says Gifford. "In beer and corn chip ads, you see beautiful, thin people playing volleyball on the beach. Even people who are grossly unfit, sitting on the couch eating those chips and drinking that beer, see this as a positive thing. They're having a good time on the beach, and that gets associated with chips and beer.

"There was once a very successful U.S. government program aimed at changing eating habits," he continues. "It happened during World War II, and it was called 'food rationing.' They made it a patriotic thing to change the way you ate. The government hired the best people on Madison Avenue to come to Washington and work for the War Department. It worked splendidly. To convince people to eat wisely, a determined, clever program could make a difference." Ludwig compares the obesity crisis to global warming. "Is it 100 percent proven that we are in for an environmental calamity? Do we want to wait until Washington, D.C., is submerged by rising ocean levels to take action?" he asks. "The risks of inaction are much greater than the risks of action."


Inner Wisdom


"People tend to eat the same amount of bulk, no matter what the calories," says research fellow in cell biology Jennifer Sacheck of Harvard Medical School. "They'll fill their plate with the same amount of food. So if the foods are energy-dense, they take in more calories, but things that have a lot of water, air, and fiber in them, like fruits and fresh vegetables, fill you up more without the caloric load." Because fat, at nine calories per gram, is the densest form of food energy we consume, it's much easier to overeat on fat. Doing so tends to add body weight more readily, Sacheck says, "because fat is more efficiently stored." (Storing 100 calories of protein, for example, takes nearly twice as much energy as storing 100 calories of fat.)

Not only food bulk, but hormonal response, affects appetite. The hypothalamus seems to control body weight, triggering several homeostatic mechanisms to maintain weight at a fixed "set point." "A lack of blood sugar stimulates secretion of hormones such as ghrelin [an appetite stimulant] and leptin [an appetite suppressant] that cascade to trigger a desire to eat," Sacheck explains. "If you lose fat, leptin decreases and ghrelin increases, causing you to eat more—and you gain weight back. The body equilibrates. Hormones like leptin regulate the set point."

The set point is linked to one's basal metabolic rate (BMR)—the number of calories needed to maintain life in a resting individual. The brain's continuous demand for glucose accounts for 20 to 21 percent of our BMR, Sacheck explains; the liver takes up another 21 percent; the heart and kidneys each absorb nearly 10 percent; and digestion accounts for 7 to 10 percent of the BMR. Physical activity can account for 10 to 30 percent of calories burned daily, while BMR takes up 70 percent or more. Since BMR increases with lean body mass, activities that build and tone muscle will burn more calories and perhaps lower one's set point as well.
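
To make those percentages concrete, here is a small sketch that apportions a basal metabolic rate across the systems Sacheck lists; the shares are midpoints of the ranges she cites, and the 1,500-calorie BMR in the example is an arbitrary figure of mine.

```python
# Approximate shares of basal metabolic rate, taken from the ranges cited above
BMR_SHARES = {
    "brain": 0.205,      # 20 to 21 percent
    "liver": 0.21,       # about 21 percent
    "heart": 0.10,       # nearly 10 percent
    "kidneys": 0.10,     # nearly 10 percent
    "digestion": 0.085,  # 7 to 10 percent
}

def bmr_breakdown(bmr_kcal):
    """Calories per day each system consumes, for a given basal metabolic rate."""
    return {system: round(share * bmr_kcal) for system, share in BMR_SHARES.items()}

# Example: a hypothetical adult whose BMR is 1,500 kcal/day
print(bmr_breakdown(1500))  # prints each system's approximate daily calorie budget
```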

[not pictured] Walter Willett's Healthy Eating Pyramid, described in his book Eat, Drink, and Be Healthy, differs from the better known USDA pyramid in several crucial respects. Willett identifies "daily exercise and weight control," which the USDA pyramid does not mention, as the very foundation of sound nutrition. The USDA draws no distinction, as Willett does, between whole-grain foods and refined (i.e., white) bread, cereal, rice, and pasta (the USDA recommends a whopping six to 11 servings per day from this group, in which Willett includes potatoes and sweets). Willett also separates healthy fats (mono- and polyunsaturated fats) from unhealthy (saturated and trans fats) ones, whereas the USDA lumps all fats, oils, and even sweets into a single category. In addition, the Healthy Eating Pyramid commends nuts and legumes, giving them their own tier. It also suggests multiple vitamins and moderate alcohol intake, two other topics omitted by the USDA.


Craig A. Lambert '69, Ph.D. '78, is deputy editor of Harvard Magazine.

Copyright © 2004 Harvard Magazine

Sunday, May 30, 2004

The Cobra Gets It Right

I moved to Sun City in time for the Memorial Day observance. In honor of the veterans of WWII, a plaza with engraved pavers will be dedicated on May 31, 2004. I bought a brick, and due to the space limitation of three lines of 16 characters each, I was handcuffed. So, the engraving will read


Robert Sapper

L. A. Chistopher

1942-1945—Thanks

in honor of my father and stepfather, who served respectively in the USN/USMC and the USAAF. My father went through boot training with the USN in San Diego and then was sent to Quantico to go through boot training all over again. Later, the War Department streamlined the training for the Seabees (USN Construction Battalions), but Robert Sapper could wear either uniform and—when he was behaving—was either a Carpenter's Mate Second Class or a Technical Sergeant. In any event, Maureen Dowd gets it right, and I wanted to get it right, too. If this is (fair & balanced) gratitude, so be it.



[x NYTimes]
An Ode to Clarity
By MAUREEN DOWD

WASHINGTON

I was one of the snobs who hated the design of the World War II memorial. As a native Washingtonian, I felt sad to see L'Enfant's empty, perfect stretch of mall, elegantly anchored by the Lincoln Memorial and the Washington Monument, broken up.

And while heaven knows we could use a triumphalist moment about now, the architecture seemed so ugly for such a beautiful victory, and so 19th-century German for such a 20th-century American ode to heroism.

But when I went Friday and saw all the adorable World War II veterans rolling in wheelchairs, walking slowly with canes or on their own, sitting on the benches that encircle the fountains, taking pictures with children and grandchildren, meeting up with their old buddies, the memorial was suddenly a lovely place to be.

It may not be perfect as a piece of architecture, but it's perfect as a showcase for the ordinary guys who achieved the extraordinary.

Thrilled with their moment in the sun in their usual humble way, inspecting the memorial they earned 60 years after D-Day, they looked in that setting as shining and valuable as jewels in a Tiffany's window.

"We won because we were the smoking and drinking generation," grinned 83-year-old Joseph Patrick Walsh, who was part of the "miserable, cold" Normandy invasion. He spent 32 years in the Navy and fought in Vietnam, and lived for years on Staten Island and the Upper West Side. He showed off the tattoo of a leering Japanese soldier on his arm, and another tattoo with his wife's name and a bar of ink where his wife made him take out "Margie," an earlier girlfriend's name.

World War II had such stark moral clarity in history that it's almost irrelevant in providing lessons about conflict in a grayer time. The Japanese bombed us; they didn't have putatively threatening "weapons of mass destruction-related program activities," as President Bush said of Iraq.

Although conservatives compared Saddam to Hitler, America did not have to be persuaded with "actionable" intelligence before confronting Hitler. That dictator was an individual weapon of mass destruction.

I asked Mr. Walsh how he felt about the Iraq war.

"You gotta back the kids," he said. "And you gotta back the president. But I hate to see it looking like Vietnam night after night on TV, not getting nowhere, taking a town and then having to take it back again. They called us `baby killers' when we got home. This is a politicians' war, not a people's war. You can't win a guerrilla war against religious fanatics. Personally, I don't think we should have went in without U.N. backing. We had nobody — a few Spanish."

Over by the Pacific Arch, George Jonic, 89, was approached by a teenage girl in a T-shirt and toe ring, who was looking at him as if he were Orlando Bloom. "Can I have my picture taken with you?" she asked. Mr. Jonic was an Army combat engineer who landed on Omaha Beach on D-Day. "My commanding officer and a lot of other people got killed," he said, wiping away a tear. "I was lucky."

Asked about Iraq, Mr. Jonic, who lives in Sandwich, Mass., said he supported the president and his attempt to bring democracy to the Middle East. "We were in the `civilized' war," he said. "But I have to tell you, there's just as much confusion in all wars." He gave me a little salute.

From the standpoint of the soldiers, all wars are hellish, killing and trying not to be killed.

The speeches Saturday stressed how everyone in America had a role in World War II, men and women, young and old, all pulling together. In the Iraq war, there's not much sharing the pain. Most Americans take tax cuts while forcibly re-upped reservists fight without the right armor and face a shortage of bullets.

"Nobody is sacrificing now except the poor guys — men and women — over there," said Bob Dole, who got the memorial built for "the disappearing generation."

"We were all on the same page then, supporting the war. Today, it's more 50-50 about Iraq."

But this weekend, at this memorial, there was a flashback to moral clarity, and a chance to honor our heroes who fought fascism. "Oh, man, it's great," said Don Smith, 78, who enlisted in the Navy when he was 18 and who had arranged to meet two shipmates he hadn't seen since 1945. "I never expected it to be so big. It's too bad that they made it so late. At least it's here and I'm alive to see it."

Copyright © 2004 The New York Times, Inc.



The Trickster: Civil Rights President?

Turn, Turn, Turn sang Pete Seeger (and the Byrds?): To everything there is a season. On the 50th anniversary of Brown v. Board of Education of Topeka, there has been a deluge of articles on the historical significance of Brown. I have been on the lookout for the ultimate revisionism of the historical legacy of Brown, and I think I found it. The Trickster (Richard M. Nixon) is celebrated for his innovative Southern Strategy in 1968 and thereafter. The Southern Strategy has been so successful that in 2004, seventy-seven elections for state office in Texas will see Republicans elected without Democrat opposition. Prior to the Southern Strategy, Texas was a blue state. Today, blue has been overshadowed by red as Republicans hold sway in a direct reversal of the Solid (Democrat) South. Virtually all interpretations of the Trickster's civil rights record find him to be cynical and devious. For another take, here is a tribute to the Trickster as an advocate of peaceful desegregation in the South. However, white flight to private schools throughout the South preserved separate (and unequal) education. If this is (fair & balanced) rejection of revisionist nonsense, so be it.



[x History News Network]
Why Richard Nixon Deserves to Be Remembered Along with Brown
By Joseph J. Sabia

In recent weeks Americans gathered to celebrate the 50th anniversary of the Brown decision, which overturned the “separate, but equal” doctrine and ordered the desegregation of public schools “with all deliberate speed.” However, many Southern schools dragged their feet on integration, with districts steadfastly refusing to obey the court order. When federal bureaucrats tried to intervene to force desegregation, tensions grew. Summing up the situation, Senator Richard Russell, D-GA, stated in 1970, “The people of (the South) are more worked up over this problem than anything I’ve seen in all my years in politics.” Enter Richard Nixon: racial healer.

In the fall of 1968, 68 percent of black children in the South were attending all-black schools. By 1974, that number had fallen to 8 percent. This extraordinary accomplishment was achieved through the shrewd political skills and raw courage of President Nixon, Secretary of Labor George Shultz, and Attorney General John Mitchell.

In his book With Nixon, speechwriter Ray Price outlined Nixon’s school desegregation goals:



Nixon’s aim was to use the minimum coercion necessary to achieve the essential national goal, to encourage local initiative, to respect diversity, and, to the extent possible, to treat the entire nation equally – blacks equally with whites, the South equally with the North.


Vice President Spiro Agnew was chosen to chair a special Cabinet Committee on Education, the purpose of which was to find the best course of action to peacefully desegregate Southern schools in accordance with a 1969 court order. This Cabinet committee voted to create several state advisory panels, which were staffed with a diverse cross-section of leaders from each Southern state. These committees included white segregationists, black leaders, and other government officials.

Initially, there was little reason to believe that these state advisory committees would accomplish much. But Nixon pressed on. On June 24, 1970, the president met with the 15-member Mississippi State Advisory Committee in the White House. As Nixon reported in his memoirs, one of the black committee members expressed his optimism:



The day before yesterday I was in jail for going to the wrong beach. Today, Mr. President, I am meeting you. If that’s possible anything can happen.


And it did. In an incredible gesture of good faith, Mississippi Manufacturers Association president Warren Wood and Biloxi NAACP president Dr. Gilbert Mason agreed to serve as co-chairmen of the Mississippi committee. According to Price, Mason christened his new relationship with Wood by saying:



If you and I can’t do this, nobody else in the state of Mississippi can. We’re probably the only black and white men in the state who can get together on something like this.


Nixon met personally with seven state advisory committees, expressing his belief that they could work together to peacefully solve one of the great crises of our time:



With each of (the committees) I stressed the same points. First, I condemned the hypocrisy in much of the North about the segregation problem. I affirmed my belief that the South should be treated with understanding and patience, but I also stressed the need to solve the problem through peaceful compliance. Second, I emphasized my commitment to the principle of local leadership to solve local problems.


During the 1960s, many liberals self-righteously screamed about racism, demanding that the federal government coerce Southerners into racial integration. The result of their heavy-handed tactics was more racial antagonism.

The president tried a different approach – cooperation. Thanks to Nixon’s strong leadership, Shultz’s masterful negotiating skills and Mitchell’s ability to keep overzealous Justice Department officials in check, the state advisory committees were an overwhelming success.

In a 1970 memo, presidential counselor Daniel Patrick Moynihan wrote, “There has been more change in the structure of American public school education in the last month than in the past 100 years.” And, like going to China, only Nixon could have done it.

While much is made over his “Southern strategy” in 1968, few understand that the Southern strategy brought the South back into the nation’s body politic by appealing to sentiments that united all Americans: patriotism, duty, and cooperation. Nixon refused to condescend to Southerners. He treated them as Americans, equal in every way to Northerners. And because Nixon took that course, he was able to achieve one of the greatest civil rights triumphs of the twentieth century: the peaceful desegregation of Southern schools.

Nixon gets almost no credit for his civil rights efforts. Thanks to the liberal press, most Americans think that Nixon’s civil rights record consists of him making a few racist statements in the Oval Office. Given the historical record, this is a tragedy.

In the Brown celebrations, virtually no mentions of the former president were made. Nixon’s civil rights triumphs have been flushed down the memory hole. Moynihan summed it up in a December 1970 speech, transcribed by Price:



Since [Nixon assumed office]...the great symbol of racial subjugation, the dual school system of the South, virtually intact two years ago, has quietly and finally been dismantled. All in all, a record of good fortune and much genuine achievement. And yet how little the administration seems to be credited with what it has achieved.


If we are to honor the Supreme Court for its decision in Brown, we should also honor Richard Nixon for peacefully carrying out its historic judgment.
_______________________
Joseph Sabia is a Ph.D. candidate in economics at Cornell University.

Copyright © 2004 frontpagemag.com; reprinted with permission




Saturday, May 29, 2004

Michael Moore Ain't What He Seems

There have been rumblings about Michael Moore's work as a documentary filmmaker. The rumblings erupt in Andrew Anthony's reportage from the 2004 Cannes Film Festival. Richard Schickel's description of Moore as "the very definition of the unreliable narrator" did it for me. Richard Schickel is no hack. If this is (fair & balanced) disillusionment, so be it.



[x Guardian Unlimited]
Michael and me
by
Andrew Anthony

The film-maker who could help to bring down Bush has been larging it at Cannes. He has made millions asking awkward questions of corporate America. But there are a few awkward questions we'd like to ask him...

It would be wrong to suggest that all of human life passes through the lobby of the Majestic hotel in Cannes. Better to say that beneath its exotic arrangement of palm trees, hanging rugs, Roman statues and permanently illuminated chandeliers goes all of human life with a movie to sell. And therefore followed, of course, by a few other forms of life.

In this baroque setting, on any given day during festival fortnight, the movers, the shakers, the wheelers, the dealers, the chancers, the prancers, the stars and the starlets perform a complex social dance that, with its anxious overlapping non-conversations, might have been choreographed by Robert Altman. Never stopping for more than a moment, the parties embrace, scan the room for someone more powerful, more famous or more beautiful, promise to fix something up, and then move swiftly on. Later, these fleeting encounters will be described as meetings.

Not much stops the palm-squeezing and back-slapping. No one, for example, is too distracted when David Carradine, the star of the Seventies TV series Kung Fu, blows kisses from the top of the stairs, even though he is wearing a Mao jacket and sunglasses and a pair of pumps with 'Kill Bill' lettering to promote his role in Quentin Tarantino's film. Nor are there more than a few jerked necks when Harvey Weinstein, the dark prince of the deal, walks through brandishing a terrifying grin. But everything freezes as a large man with a fast-food gut and a laboured waddle, wispy beard and glasses, makes his way to the door.

Extended hands are left unshaken, air-kisses go unaired, the hubbub softens and two strikingly elegant women teeter on their kitten heels to get a better view, their faces a portrait of rapt admiration. Here comes Michael Moore, film-maker, author, political activist, global phenomenon.

Last week on the baking Côte d'Azur, there was no one hotter than the big fellow from Michigan. Among the stylish hordes of the Croisette, there was no greater attraction than this ursine figure in his ill-fitting suit. Everyone wanted a piece of him, and there is a lot of him to go around, but after months of requests, I had the only one-on-one interview. Michael and Me, we had a real meeting arranged.

Moore arrived in Cannes by his traditional mode of transport - on a wave of controversy. Disney had announced that it would not distribute his new film, Fahrenheit 9/11, in America, which left the film's producers, Miramax, a division of Disney, looking for a new partner. Moore accused Disney of censoring his film to protect the tax breaks its Disneyworld complex enjoys in Florida, the state controlled by Jeb Bush, brother of the President (Fahrenheit 9/11 details the cronyism and corruption of the Bush regime, as well as its failings in the 'war against terror').

Disney countered that Moore had known for more than a year that it would not handle the film and was only complaining now to publicise his film. Nevertheless, the director once again successfully positioned himself on the moral high ground in a battle against a multinational corporation. He finessed the same manoeuvre with Stupid White Men, his bestselling critique of American capitalism, by claiming that Harper Collins had tried to suppress the book, and that it only agreed to publish him following a protest by librarians.

Moore, the king-sized millionaire, walking testament to American consumption, is a master of making himself appear the little guy. He told reporters that before Disney, Mel Gibson's company, Icon, had also dropped the film, following a phone call from a man in Washington who told Icon that if they continued with the film Gibson would no longer be welcome at the White House. Icon denied the story, but how could they prove that the mysterious Washington caller did not exist?

The net effect of all these claims and counter-claims was that Fahrenheit 9/11 was the film that everyone on the Croisette wanted to see. But as not everyone had tickets, the old-fashioned capitalist marketing ploy of making demand outstrip supply ensured maximum frenzy and thus still greater demand. In Cannes, nobody wants to hear the word can't. Naturally, the bidding on buying the distribution rights just went up and up.

The film, as it turned out, is Moore's strongest since Roger and Me, his debut documentary 15 years ago which examined the damage wrought by General Motors on his home town of Flint. Whereas the Oscar-winning Bowling for Columbine was hit-and-miss, self-contradictory, and more than a little sanctimonious, Fahrenheit 9/11 seldom loses sight of its target - the Bush administration - or its sense of humour.

It is also, with a couple of exceptions, a triumph of editing. Indeed, Moore is arguably the most ideological and emotive editor since Sergei Eisenstein, the Soviet propagandist who developed a kind of didactic montage. Juxtaposing heroes and villains, he cuts between political comedy and tragic reality with intoxicating glee. There is no information that is vitally new, nor are there any images that are more shocking than those from Abu Ghraib prison, but such is the cumulative force of the film, with its kinetic humour and insistent sentiment, that it is hard to come away from it without concluding a) that George W Bush is not fit to be president of a golf club let alone the world's most powerful nation and b) the war in Iraq was woefully misconceived. In the year of an election that could well prove close, it's the kind of film that could make a historic difference.

In the past, Moore has been accused of twisting chronology and events to suit his agenda. While neither Bowling for Columbine nor Roger and Me can be accused of major factual errors, both trade on a series of misleading implications. For example, in Bowling for Columbine the audience is led to believe that the two teenage killers at Columbine high school may have been inured to violence by the proximity of a local weapons factory. Yet it later emerged that the factory produced nothing more lethal than rockets to launch TV satellites. The film critic Richard Schickel labelled Moore 'the very definition of the unreliable narrator'.

If there is a question mark over the trustworthiness of Moore's work, few can doubt its power, still less its influence. Bowling for Columbine was by far the biggest-grossing documentary in history. Stupid White Men, an easy-read satire, was the bestselling non-fiction book in the US in 2002, with 4 million copies in print worldwide, and 600,000 of those in the UK. At one point the book, and its follow-up, Dude, Where's My Country?, stood at numbers one and two in the German bestseller list. The sales of his films and books have made him known across the planet, as well as very rich, but the image he has sold of himself - fat, bumbling, nerdy, but indefatigable - has made him something else: an international man of the people.

As the limousine carrying Moore to his Cannes press conference pulls out of the Majestic, bound for the Palais less than 200 yards up the road, an Argentinian TV crew rushes out into the road to interview the director. The automatic tinted windows slide down and a few brief words are exchanged before a security guard steps in. The man with the microphone tries to give Moore an Argentinian flag but the security guard won't allow him. 'Put that down,' he warns, as if it were a semi-automatic weapon. The window goes up and the car moves off.

More than Moore's wealth, the question of security is perhaps the issue that most threatens his down-to-earth ordinary Joe persona. In Bowling for Columbine, he posits the theory that America's gun violence problem stems from a culture of fear created by a racist media. Last year, during a residency at the Roundhouse in London, he suggested that if the passengers on 11 September had been black, they would have fought back against the hijackers, and that spoilt whites were too used to having other people look after them.

But during the same series of dates in London, he complained about the lack of security so vehemently that the Roundhouse staff threatened to boycott the show. I got a taste of the air of paranoia surrounding Moore when, because I was without a suitable pass, a friendly PR snuck me into the main press conference alongside his entourage. Suddenly, one of his assistants turned to me and demanded to know who I was. The PR explained that I was with her.

'And who are you with?' asked the assistant.

'You,' replied the perplexed PR. 'I'm working with you.'

'I've never seen you before in my life,' announced the assistant and a security guard duly intervened to bar both of us. It was only when the PR persuaded the assistant that in fact they had been working together all day that the guard relented. On stage, Moore was asked why it was that he was flanked by three security men, who stood with their feet apart, hands clasped at their crotches, in an intimidating military stance. The director did as he always does when asked this question, and claimed that they were his fitness trainer, pilates teacher and masseur, then turned the idea that he needed protection into an elaborate joke. 'I'm not afraid of anything,' he mugged. 'Should I be?' The room broke into laughter.

Moore knows how to field difficult questions before a crowd. When one reporter told him that she had spoken to Icon and they knew nothing of the supposed caller from Washington, Moore told her to speak to his agent - 'He knows all about it.' She told him she had spoken to his agent, that he had professed ignorance of the matter, and had told her that she should speak to Moore. The director simply referred her back to his agent.

After the conference, Moore went to the official screening of his film, which is in competition for the main jury prize. The end of the film brought a standing ovation that, observers estimated, lasted somewhere between 12 and 15 minutes, a Cannes record, and possibly unmatched since Stalin's audiences used to continue clapping for mortal fear of being the first person to stop.

The applause here, though, was genuine. For the Americans who made up a large section of the audience, this was their first opportunity to stand up straight after the shaming horrors of Abu Ghraib, and for the French, well, there is nothing the French love more than an American criticising America. The following evening on French TV, I watched Moore thank the French people for being 'friends who can tell you the truth to your face'. He might have returned the favour and told the French about their government's appalling role in Rwanda a decade before - but there are limits to truth-telling, even among friends.

The charge that Moore, who turned 50 last month, has only ever established a partial relationship with the truth is one that stretches way back into his career. Although he has lived in the rarefied neighbourhood of Manhattan's Upper West Side for the past 14 years, Moore very rarely lets an interview go by without referring to himself as 'working-class'. In fact, he grew up in a middle-class suburb of Flint, in a two-car family. His father was an auto-plant worker who played golf, retired in his fifties, and was well-off enough to send his three children to college.

Moore dropped out of university and, after stints as a hippie DJ and a period running a crisis centre for teenagers, he set up an alternative newspaper, the Flint Voice. He edited it with such verve, exposing corrupt officials and racist businesses, that in 1986 the San Francisco-based magazine Mother Jones asked him to become its editor. But just a few months after taking up the position, he was fired. According to the owner of the magazine, the staff said that he was impossible to work with. As far as Moore was concerned, he lost his job because he was set against a piece that was critical of the Sandinistas' record on human rights.

Either way, he won $58,000 damages in a suit for wrongful dismissal, sold his house and put all the money into making Roger and Me. The documentary was a notable critical, if not spectacular commercial, success. Thereafter Moore moved to New York and television, making zany political series such as TV Nation and The Awful Truth, which were full of Moore's trademark stunts designed to mock greed and ignorance and humbug.

Behind the scenes, however, a different picture was forming. Moore's employers were confronted with ever more regal demands. He insisted that Channel 4 house him at the Ritz when he worked in England on The Awful Truth, a fact he now portrays as the revenge of the working class against corporate might. Meanwhile employees grumbled. 'He's a jerk and a hypocrite and didn't treat us right and he was false in all of his dealings,' said one former worker. His former manager, Douglas Urbanski, has said that Moore 'was the most difficult man I've ever met... he's money-obsessed'.

To such complaints, Moore has a stock Nietzschean-cum-Obi-Wan Kenobi answer, which is that whatever attacks his critics launch at him only make him stronger. 'The readership only expands, the viewership for the movies only expands, and they just look ridiculous.'

And, statistically, he's right. Currently, there is no more powerful anti-war protester in America, and therefore arguably in the world, than Moore. In this country, the Mirror named him 'the greatest living American'. Recently, when he called Bush a 'deserter' it caused a scandal in the States, but it also put Bush's dubious record as a National Guardsman during the Vietnam war at the top of the agenda for the first time. He plays sell-out stadiums wherever he travels, and while he has become something of a bogeyman to the American right, and an embarrassment to a small section of the liberal left, he is to many millions the world over the underdogs' most heroic spokesperson. It's a reputation that was cemented by his celebrated Oscar speech at last year's awards ceremony, in which he lambasted Bush and told the assembled actors that they lived in 'fictitious times'.

He would tell interviewers afterwards that he had not planned the speech, assuming that he would not win, but elsewhere he has said that he warned his fellow competitors that he was going to make an anti-war statement. That's the problem with Moore: you can't be certain of the veracity of what he says. Is he the radical who has claimed to give a third of his income to worthy causes or a ruthless self-aggrandising hypocrite, or both?

Now, with my exclusive one-to-one interview, I was, I hoped, about to see the real Michael Moore. But a small cloud had appeared in the brilliant blue Mediterranean sky. The publicity company dealing with Moore in Cannes had resigned, as a fractious working relationship had become intolerable, with the director and Weinstein apparently reducing one of the publicists to tears. The new publicists, drawn and anxious-looking, were at pains to let me know that the interview would still go ahead. They just couldn't be sure what time it would be. And, oh, one other thing, it had been cut to 15 minutes. Fifteen minutes! That was barely enough time to ask a question, let alone hear it answered.

I waited in the lobby of the Majestic, and was finally allotted a time. The hour came and passed. There was no sign of Moore. Was he pulling a Naomi, the no-show interview technique perfected by supermodel Naomi Campbell? He once told an interviewer that he didn't like interviews because he had 'no control over what you're going to write'. One form of control, of course, is not to arrive.

The publicist told me that Moore's lunch meeting had run over but that she was sure everything would be OK. It was clear from her stricken expression that she had no idea where Moore was. She went away and what seemed like a week later returned with a definite slot and disappointing news. Owing to Moore's other engagements, the interviews now had to be compressed, and I would be sharing my limited time with one journalist from Australia and another from Japan.

Inside a well-manned salon, Moore was sporting a baseball cap with the legend 'Made in Canada', a blue hooded tracksuit top, khaki shorts and sandals. Crouched over a circular conference table, he looked like a lumpen tourist at a Vegas blackjack game, uncertain, ill at ease.

'You cool with them being here?' he asked me conspiratorially, though quite brazenly, in front of the Australian and Japanese journalists.

When I told him that it wasn't what was advertised on the brochure, he said: 'Yeah, I don't know what to do here. They've got me so jammed. No offence to you, the Japanese,' he gestured to the Japanese woman, 'but you both deserve your own time,' now gesturing to the Australian woman and myself. Either he doesn't sell too well in Japan or there was a hint of racism in that distinction, but Moore was too caught up in his own drama to notice. 'This is bullshit, you know. Don't they understand the difference between the Observer and a Portuguese magazine, no offence to the Portuguese, but don't they know? I'm just asking, man.'

From being the architect of this farrago, Moore turned himself into the victim, betrayed by the nameless, omnipotent 'they'. He continued in the same vein, currying my favour with his appreciation of the Observer until, to her great credit, the Japanese woman asked if we could begin the interview. At which point Moore burst out laughing, to his credit at himself.

My strategy, given the rushed circumstances, was to dispense with formal inquiries, let the other two ask about the film and general matters, and restrict myself to awkward questions. I wondered if he has any regrets about supporting Ralph Nader, the independent candidate in the previous American presidential election. Most observers think that the votes Nader took from Al Gore were vital in gaining Bush's disputed victory.

'None whatsoever,' he says without hesitation, although he's called on Nader not to stand this time round. What's the difference?

'Wrong year. Even the Green party in the US have said they're not going to campaign in the swing states... I've been very disappointed and very saddened by Ralph, who's a great American who's done many great things. But in his later years he has become, you know, somewhat bitter and vindictive. And I don't want to speak ill of him because he's done so much good, but he has not a single... except I think I heard maybe Patti Smith is supporting him.' His silent ellipses could mean nothing but 'celebrity endorsement'.

I ask him why his old friend and longtime collaborator Ben Hamper, a former Flint auto-worker whom he helped become a writer, told the New Yorker magazine, in among a number of otherwise flattering comments, that Moore 'didn't treat people well'.

'Right,' says Moore, rising to the charge, 'and then he sent me a letter saying that he said that while he was drunk. He has a horrible alcohol problem and I don't really want to talk about it,' he says, going straight on to talk about it, 'because I feel bad because he's a friend. He sent me this painful, painful letter. He hasn't been able to write a book in over 12 years. He's literally had this writer's block that has not been helped by the prescription drugs and the alcohol problem. I care deeply about him. And it's hard for someone like that because here we were putting out this paper in Flint and I've gone on and written my books, made my films, I have this life and, you know, he's struggling. My wife and I have tried to help him [but] at some point in this situation you've got to stop being the enabler and he's got to get it together himself.'

He then tells me how well he pays his employees - the best independent film rates around - and even calls in a young assistant and asks him to tell me how much he earns. 'Eight hundred dollars a week?' he says gingerly. 'What else?' asks Moore. 'You pay for my cell phone.' 'So,' says Moore, 'roughly a thousand a week.' Sounds like roughly $800 to me, but who's quibbling?

The point is, he insists, he's not fallen out with any employees since 1994. I ask if he worked out how to be a better employer.

'I just think I'm a better person,' he says, his head bowed in theatrically solemn contemplation, 'because I'm always struggling to be a better person. I'm a highly flawed individual, as we all are, and because I was raised by Jesuits, I'm constantly, "What is it about me and what I can do to be better?"'

It is doubtless to this mission that he refers in Stupid White Men, when he writes: 'If you're white, and you really want to help change things, why not start with yourself?'

With this thought in mind, I ask him why he decided to send his daughter to a private school in Manhattan.

'Oh,' he says brightly, 'I went to private school. Just a genetic decision. My wife and I, we both went to Catholic schools, we're not public-school [which in the US means state school] people.'

So it's not important.

'No, I think it's important and the first five years she went to public school, then we moved to New York and we went to see the local public school and we walked through a metal detector and we said, "We're not putting our child through a metal detector." We'll continue our fight to see to it that our society is such that you don't have to have a metal detector at the entrance to schools. But our daughter is not the one to be sacrificed to make things better. And so she went to a school two blocks away. She just went to the nearest other school.'

He makes it sound as if the other school was just a random choice, but private schools on the Upper West Side are all restrictively expensive, and mostly white, just as the state schools are disproportionately black.

'Is that a bad thing?' he asks rhetorically of his decision. 'I don't know. Every parent wants to do what's best for their child. Whatever I can afford, I'm going to get my kid the best education I can get.'

I suggest that, while that may be a natural instinct, it's hard to see why it's any different from the Republican philosophy of each man for himself and his family.

'I'm not a liberal. When you come from the working class and you do well enough whereby you can provide a little bit better for your family, get a decent roof over their head and send them to a good school, that's considered a good thing. If,' he emphasises, 'you're from the working class. What's bad about it is if you get to do that and then shut the door behind you so nobody else can do that.'

Of course, it's nobody's business but Moore's where he sends his child, except he makes it his business to detail the hereditary privilege of his subjects and tends to make his political arguments personal. In Fahrenheit 9/11 one of his stunts is to attempt to get Congressmen to sign their children up for the war in Iraq.

I ask him finally - the interview has now stretched either side of another with Italian TV - which other documentary film-makers he admires. He names Errol Morris, and a few others, but does not mention Nick Broomfield, whose signature style of putting himself in the frame Moore has to some extent borrowed. I ask how he rates Broomfield.

He pauses. 'I consider him a friend.'

I wait for his answer, as he tucks into a bowl of pickles.

'Do you think he wants to be on camera?' he puts the question back to me. 'Do you think he looks like he's enjoying it?'

What I think, after my short time in his company, is that Moore is a man you would not want as an opponent, but also one you'd think twice about calling a friend. Though a talented film-maker and a clever showman, a populist who knows how to play the maverick, he is too often both big-headed and small-minded. In his desire to be seen as the decent man telling truth to power, he is too ready to blame those less powerful than himself for his shortcomings. He was justly revered in the Palais, but out on the street no one had a kind word to say about him. At Cannes, Moore may have been the star but he was not, it seems, the man of the people.

Copyright © 2004 Guardian Unlimited

Pickin' & Grinnin' or Cuttin' & Runnin'

Let's see. Slick Willy pulled the troops out of Somalia after Black Hawk Down. Dutch pulled the Marines out of Lebanon after the truck bomb debacle. Gerald R. Ford pulled the troops out of Saigon after Vietnamization failed. Now, we are presented with another quagmire in Iraq. Thank goodness for the good sense of Professor John Mueller. I also heard Congressman Dennis Kucinich on "Meet The Press" last weekend. He makes a forceful case for Out Now, If Not Yesterday. The Question of the Week: Has the military occupation of Iraq made the United States safer from terrorism? See the Attorney General and the Director of the FBI. They don't have a clue. If this is (fair & balanced) egress, so be it.



[x HNN]
The Politics of Cutting and Running
By John Mueller

Recently, Secretary of State Colin Powell forcefully declared that the United States would leave Iraq after June 30 if requested to do so by a new interim Iraqi government. This suggests that the United States is now seeking a face-saving method for cutting its losses there--for withdrawing or substantially reducing its presence.

Increasingly the American effort seems to be devolving into a costly, enervating, lonely, and deeply-divisive occupation that the United States would eventually lose. As in Vietnam, the main military problem is to conquer an insurgent force that is tenacious and willing to accept casualties, and the key lies not in American military prowess, but in the willingness of the insurgents to continue their resistance. If this fails to break, it may prove essentially impossible to root them out by military means except by inflicting massive destruction--the Russian approach in Chechnya.

And the only way to keep American casualties from accruing would be to secure the troops in the preventive seclusion of well-protected bases, hardly the best approach for bringing peace, order, and democracy to Iraq. Moreover, if the U.S. can't provide order, ordinary Iraqis will become ever more dismayed at the occupation.

If American forces therefore effectively become more nearly the cause of conflict than its cure, it is entirely sensible to withdraw, passing the burden off to a patched-together domestic government (as in Vietnam), perhaps with some sort of international overseer. The hope would be that this government might be successful in quelling the insurgency not because it would be more militarily effective than the Americans, but because the insurgents would regard it as legitimate and thus stop or reduce their violence.

Withdrawal can be painful, but the process need not be permanently damaging politically. Policing forces that had suffered unacceptable losses were withdrawn from Lebanon in 1984 under Reagan and from Somalia in 1994 under Clinton, and in both cases the issue scarcely came up in ensuing elections.

More to the point may be the resolution of Vietnam. The U.S. plugged on in that war in part because it feared the domestic political consequences of defeat. But failure was substantially accepted at least in electoral politics when a face-saving agreement was crafted and a bit of time passed. Indeed, in 1976, a year after South Vietnam collapsed to Communism, Gerald Ford essentially took credit for it: when he became president, "we were still deeply involved in the problems of Vietnam," he pointed out, but now "we are at peace: not a single young American is fighting or dying on any foreign soil." His electoral challenger, Jimmy Carter, seems to have concluded that it was politically disadvantageous to point out the essential absurdity of Ford's ingenious argument.

With only a few months left until the election, there may not be enough time for Americans to so conveniently wave off the venture. But, while he will presumably continue to lambast the administration for the war, John Kerry is unlikely to advocate sending the troops back no matter how matters develop in the aftermath of withdrawal. Most likely, the public's attention will move on to other things, particularly the economy.

Withdrawal, it is often claimed, means that American prestige and influence will decline. However, it is certainly not clear that the American defeat in Vietnam had a long-term detrimental impact on such vaporous qualities.

A more important consequence might be that Osama bin Laden's theory that the Americans can be defeated, or at least productively inconvenienced, by inflicting comparatively small, but continuously draining, casualties on them will achieve encouraging confirmation. A venture designed and sold in part as a blow against international terrorists would thus end up emboldening and energizing them. A comparison might be made with Israel's orderly, even overdue, withdrawal from Lebanon in 2000 that insurgents there took to be a great triumph for their terrorist tactics--and, most importantly, so did like-minded Palestinians who have since escalated their efforts to use terrorism to destroy Israel itself.

However, people like bin Laden are likely to envision victory in Iraq no matter how the venture comes out. They believe that America invaded Iraq as part of its plan to control the oil in the Persian Gulf area. But the United States does not intend to do that (at least not in the direct sense bin Laden and others doubtless consider to be its goal), nor does it seek to destroy Islam as many others also bitterly assert. Thus just about any kind of American withdrawal will be seen by such people as a victory for the harassing terrorist insurgents, who, they will believe, are due primary credit for forcing the United States to leave without accomplishing what they take to be its key objectives.

Another consequence of withdrawal is that all that self-infatuated talk about a brave new superpowered American "empire" will fade away. So there may be a bit of a bright side to the exercise as well.

John Mueller is professor of political science at Ohio State University. Among his books are WAR, PRESIDENTS AND PUBLIC OPINION, POLICY AND OPINION IN THE GULF WAR, and the forthcoming THE REMNANTS OF WAR.

Copyright © 2004 History News Network



Friday, May 28, 2004

The Kinkster & The Racehorse

The Kinkster maintains that he prefers his nickname to Richard as does his favorite Texas lawyer—Richard (Racehorse) Haynes—because neither wants to be identified with Richard Nixon (or Richard III). The Kinkster manages a plug for his gubernatorial campaign by noting that the Racehorse managed to endorse the Friedman candidacy without endorsing the Friedman candidacy. Even so, a half-endorsement is better than none. The bottom line? If you're in trouble in Texas, call the Racehorse. And bring plenty of money. If this is (fair & balanced) pettifoggery, so be it.



[x Texas Monthly]
Ardor in the Court
by
Richard (Kinky) Friedman

I love Racehorse Haynes for the same reason his clients do: he's one of the most colorful silver-tongued devils to grace Texas since God made trial lawyers.

AT A RECENT BOOK SIGNING OF MINE at Murder by the Book, in Houston, I was pleased to see the legendary defense lawyer Racehorse Haynes making his way through the crowd. Little Jewford, the last surviving member of the Texas Jewboys, had just introduced me as "the next governor of the great state of Texas," and I had assured him that I would keep him on the short list for first lady. It was at that point that Racehorse came up to the microphone and, in true lawyerly fashion, managed to endorse my candidacy without actually endorsing my candidacy. He said his real views on the Kinkster were "privileged and must remain privileged." Then he introduced his wife, Naomi, as "the widow Haynes." I was commenting on what an honor it would be to have Racehorse in a Friedman administration, working pro bono to fix the broken criminal justice system, when David Berg, a protégé of Racehorse's and a brilliant lawyer in his own right, suddenly leaped from his seat. "The words 'Racehorse' and 'pro bono,'" he shouted, "are never used in the same sentence!"

Why are so many legal eagles—or buzzards, as the case may be—big fans of my books? Do they see me as the thinking man's John Grisham? How the hell should I know? All I'm sure of is that quite often at my book signings, a long line of lawyers will surreptitiously snake its way past the little old ladies with their aluminum Jerry Jeff Walkers. Whenever this happens, I find the lawyers in contempt of bookstore and send them to the back of the room. Still, so many of them turn up at these events that I've almost had to standardize my book inscriptions to them. While they are perverse enough to like "From one left-handed Jewish homosexual to another," they appear to more deeply appreciate something that acknowledges their oft-maligned profession. Thusly, to borrow a bit of legalese, two favored inscriptions for lawyers have evolved. The first is "Where there's a will, there's a lawyer." The second is "May all your juries be well hung."

There is, however, a small but growing pantheon of lawyers I have come to know and, yes, admire—from Jim Schropp, a corporate attorney in Washington, D.C., who has spent the past ten years campaigning vigorously to get Max Soffar off death row (The Last Roundup: "Case Open," March 2004), to David Epstein, a Southern Methodist University law school professor who routinely includes a random question about the Kinkster each year in his national bar review courses. But one name on this list has always shone brightly: the aforementioned Racehorse Haynes. If he hadn't owned a yacht, Racehorse would be my candidate to be the Atticus Finch of Texas. In past years, in fact, I sailed with him on said yacht, the Integrity. "For those," insists Racehorse, "who say I haven't any."

Racehorse, indeed, is one of the most successful and most colorful silver-tongued devils to grace Texas since God made trial lawyers. When Racehorse was growing up, his Houston family was so poor that, in Moses-like fashion, they had to leave him in the bulrushes near San Antonio, where his granny—who was just a hair over four feet tall—taught him everything he needed to know while drinking a pint of gin every day. (Just to be clear: She was the one drinking.) When it came time to attend elementary school, Racehorse filled out all the forms himself and bypassed the first and second grades. In case you're wondering, his real name is Richard, which also happens to be the Kinkster's real name. (Possibly because of Richard Nixon, we grasped at any nicknames we could get.) He got his nickname from a disgruntled junior high school football coach after failing to break the line of scrimmage on two consecutive plays and galloping rapidly for the sidelines. "Goddam," said the coach sarcastically. "What do you think you are—a racehorse?"

It would not be possible in this column to do justice to Racehorse's many subsequent victorious battles in the courtroom. "I don't get people off," he once told me. "The jury acquits them." One of the people the jury acquitted was T. Cullen Davis, the richest man—by 1976 standards—ever brought to trial on a murder charge. Davis allegedly shot and wounded his wife, Priscilla, and croaked his stepdaughter and Priscilla's lover with a .38 in his $6 million mansion on 181 acres near little old downtown Fort Worth. At the time, Davis claimed to have been Sirhan Sirhan, party of one, by himself in a movie theater watching The Bad News Bears.

Another famous client of Racehorse's was Dr. John Hill, who allegedly fed his wife an éclair laced with E. coli bacteria. Toxic shock syndrome was undiagnosed in those days, but experts now agree that Racehorse was once again on the right side of the scales of justice. Then there was the infamous Kerrville "slave ranch" trial, involving drifters who were kidnapped, tortured, and in one instance, allegedly killed with a cattle prod. Racehorse put on quite a show for the local Kerrverts in front of the courthouse one afternoon. Ever the dedicated defender, he shocked himself repeatedly with an electric cattle prod. "It hurt," he says, "but it wasn't lethal."

In dealing with his famous and not so famous clients, there is one rule Racehorse holds inviolable: He almost never allows the defendant to say anything in court. He learned this lesson from a personal experience as a young lawyer. "I believed my guy was innocent, and apparently the jury agreed," says Racehorse. "So when the bailiff handed the verdict to the judge, and the judge declared, 'Not guilty,' I shook hands with my guy and told him he could thank the jury if he wished. So he stands up and he says to the jury, 'Thank you. I'll never do it again.'"

Racehorse has become every year's model for what a successful trial lawyer should be. He is part Clarence Darrow, part Perry Mason, and part, well, Racehorse Haynes. Yet despite the fact that he has a mansion in River Oaks, sailed until recently on the yacht, and owned a large former slave ranch in the Hill Country (payment for rebuffing those drifters), his lucrative and high-profile triumphs are not what motivate him. That distinction belongs to the case he can't forget, one in the early fifties that earned him no publicity and no fee. In what Racehorse felt was a frame-up, a black man was charged with stealing construction materials. When the jury rendered a verdict of not guilty, the defendant, accompanied by his six little children and his 250-pound wife, began screaming with joy and rushed to hug the young lawyer.

Racehorse went to a party that night at a company shack on the poor side of town. The hors d'oeuvres consisted of leftover barbecue and Coca-Cola. The man was there with his wife, all the kids, and the old grandma. The place was appropriately decorated. The kids had taken their crayons and written on the walls, "God Bless You, Mr. Racehorse."

Copyright © 2004 Texas Monthly, Inc. All rights reserved.