Friday, December 31, 2004

The Kinkster Is Lookin' Good!

The Kinkster is running well in the Texas Monthly straw poll on the magazine's Web site. However, the chief political analyst for the magazine—Paul Burka—omits the Kinkster from his speculation about the 2006 gubernatorial race. If W can win a second term while we are going to Hell in a handbasket, anything is possible. Get ready for a Kinky governor. If this is a (fair & balanced) chimera, so be it.


[x Texas Monthly Poll for January 2005]

If Kinky Friedman gets on the ballot in 2006, will you support him for governor?
Yes: 57.85% (140 votes)
No: 42.15% (102 votes)
Total votes: 242

We only count one vote per visitor.
These polls are not conducted scientifically, and represent the opinions of site visitors.

[x Texas Monthly]
The Games Begin
by Paul Burka

In October, a few days before the presidential election, I drove to Lubbock—through the Hill Country, into the empty West Texas ranch lands, and then onto the Plains, where late cotton waited in the fields to be picked. In all that time, from the moment I left Austin to the moment I returned, I did not see a single John Kerry yard sign or bumper sticker. The election returns verified my informal survey: George W. Bush won 61 percent of the vote in Texas, a two-point gain over his showing in 2000, and down-ballot Republican judicial candidates raised their previous floor by the same amount. The state, it’s widely acknowledged, is more Republican than it has ever been.

But the question that always must be asked in politics is, To what end? Elections change the balance of power, but outside the Capitol, our problems remain the same. Texas today faces two intractable crises: an outmoded—and, according to a recent court ruling, unconstitutional—method of financing its public schools and a system of raising the revenue to pay for state government that bears little relation to the modern Texas economy. What is the state’s majority party going to do about them? It is still struggling to create an identity for itself, to find the ideal balance between ideology and governing. If the Republicans’ numerical superiority is assured for the foreseeable future, their aptitude for leading the state is less certain.

This is the dark cloud hovering over the legislative session that begins on January 11. The same question that faced the first GOP legislature two years ago is still awaiting an answer from the second: Can the Republicans govern? The best description of the feud-filled 2003 session was that the Republicans didn’t know how to act like a majority and the Democrats didn’t know how to act like a minority. The GOP was successful in passing its ideological agenda—restrictions on abortion, sweeping tort reforms, deep budget cuts, congressional redistricting—but when the fun and games were over, the eight-hundred-pound-gorilla issues of state government were still on the loose.

In fairness to the Republicans, the Democrats tinkered with school finance and the tax structure for decades without fixing them, but at least their patches on leaky tires did an adequate job of keeping enough air inside to enable the contraption to move forward. So far, in one regular session and one special session on school finance, Republicans have done little more than check the pressure. Guess what: It’s high—and it’s going to get higher. Looming ahead are the 2006 elections, when the GOP faces the prospect of no-holds-barred battles for the major statewide offices in its own primary. Everything the Legislature does or does not do in the upcoming session is a potential hot button for incumbents.

Especially school finance. Even without the adverse court ruling, the calls for changing the system have been loud and long. The politics of the issue should be familiar by now. The impetus for change comes from property-rich school districts, which, under the so-called Robin Hood law, in force since 1993, have had to share their property-tax revenue with poorer school districts. The solution is to reduce the disparity between rich and poor districts by finding other sources of revenue to replace property taxes. Easier said than done. Even a tax shift that is revenue-neutral—a dollar-for-dollar shift from property taxes to something else—might be hard for Republicans to vote for.

Then there is the matter of what the “something else” is going to be, a question that goes to the heart of what’s wrong with the state’s current tax structure. In agricultural Texas and in oil-boom Texas, land was the major source of wealth. But today it is professional services (doctors, dentists, lawyers, architects, accountants, consultants), which depend on brains, not land. A state income tax makes the most sense, but that’s not going to happen on the Republicans’ watch. Raise the sales tax? Getting rid of Robin Hood will require a big increase, and we already have one of the highest rates of any state. Amend the state constitution to allow a statewide property tax? Just try and get a two-thirds favorable vote out of a Republican legislature and then try and get a majority of the voters to approve it. Impose a new business tax? Oh, that’s a swell incentive for new businesses to locate here. There’s a reason this stuff is hard.

It gets harder. All of these scenarios assume a revenue-neutral tax shift. But that is not acceptable to education advocates—particularly superintendents, who have a lot of clout with legislators from midsized towns and rural areas, most of them Republican. Consequently, the usual solidarity of GOP legislators doesn’t exist in school finance. Lawmakers from the suburbs typically want to get rid of Robin Hood and are hesitant to vote for more educational spending, while those from rural Texas, whose districts benefit from Robin Hood, are just the opposite. Their superintendents say that the schools need more money to do the job state law requires them to do.

The recent ruling against the current school finance system backs them up. Noting that the Texas Constitution requires a “general diffusion of knowledge,” state district court judge John Dietz pointed out that the minimum state requirement for student performance on standardized tests—a rating of “acceptable” for schools—requires a passing rate of only 25 percent, which is hardly a “general diffusion.” Dietz also ruled against the state on another key issue: that the high percentage of districts that have reached the maximum property tax rate allowed by state law ($1.50 per $100 of valuation) means that Texas effectively has a statewide property tax, which the state constitution forbids. The Texas Supreme Court will have the final say on the constitutionality of the current system, and it is quite possible that lawmakers will elect to wait for the court to rule before venturing into the political thicket that lies ahead.

The Legislature can afford to play the waiting game, but the governor can’t. No one has more at stake in the 2006 elections than Rick Perry, who vowed to get rid of Robin Hood when he ran and won in 2002. His potential Republican primary opponents include Comptroller Carole Keeton Strayhorn, a frequent Perry critic, and U.S. senator Kay Bailey Hutchison, no Perry admirer herself. The likelihood of a Hutchison challenge soared recently when Congress reinstated a provision that allows funds raised for federal campaigns to be used in state races. The Perry camp has always doubted that Hutchison would run—“Those who are going to run, run,” one Perry insider told me. “Those who aren’t going to run, talk”—but a $6.7 million war chest speaks louder than words. The conventional wisdom among Republicans holds that Strayhorn has hurt herself with the GOP base by her strident attacks on the governor, but the failure of the Legislature to address school finance would give her a formidable issue.

A Perry-Hutchison or a Perry-Strayhorn primary race would be a battle for the soul of the party. Perry has close ties to the GOP ideological and big-donor base, while Hutchison and Strayhorn cast themselves as fiscally conservative but more pragmatic, more supportive of government programs, and more moderate on abortion. Hutchison and Strayhorn have been critical of the governor for supporting cuts in social services that affect children, such as Child Protective Services, where state workers’ caseloads are huge and children have died from abuse, and CHIP, the Children’s Health Insurance Program, where budget cuts caused 159,000 kids to become ineligible for health insurance (although in some instances other health care was available). To overcome Perry’s strength with the Republican base, Hutchison or Strayhorn needs two things to happen: a bigger-than-usual primary turnout, including crossover Democrats and independents, and a two-person race. A three-way battle would likely result in a runoff, in which the low turnout would be dominated by the party faithful who favor Perry.

A decision by Hutchison to challenge Perry would set off a high-stakes game of musical chairs among the major GOP officeholders. Lieutenant Governor David Dewhurst, who has set his sights on succeeding Perry in 2010, would face the decision of whether to stay put, counting on a Perry victory, or run for Hutchison’s Senate seat. That race could also attract George W. Bush’s longtime friend Don Evans, the former Secretary of Commerce, and South Texas congressman Henry Bonilla. Strayhorn could then shift gears and run for Dewhurst’s old job, and Attorney General Greg Abbott might do the same. Agriculture commissioner Susan Combs has already announced her interest in Strayhorn’s current job. What an amazing political year 2006 could turn out to be.

But what happens next year will be heavily influenced by what happens this year. If the upcoming session is like the last one (and the special sessions that followed)—divisive, mean-spirited, and petty, on both sides of the Capitol, on both sides of the aisle—getting something done on the big issues of school finance and tax reform will be all but impossible. It shouldn't take long to find out. Three Republicans who lost to Democrats in November decided to challenge the outcome of their elections in the House of Representatives, where their party holds an 87-63 majority. The most important of these races took place in Houston, where Democrat Hubert Vo upset Republican Talmadge Heflin, the chief House budget writer, by 33 votes. The process calls for a member to be in charge of gathering evidence and for the House to determine the winner or order a new election. The Republicans have the votes to do whatever they want, evidence or no evidence, but Democrats in the Capitol, outside it, and on the Web will go berserk if they think three seats have been stolen from them. How the House—and especially Speaker Tom Craddick, a polarizing figure who had a politically fruitful but personally rocky first session—handles the challenges will go a long way toward determining the success of the session.

The Republicans have mastered the art of campaigning. I hope they do as well with governing, if only so I won’t have to write another column two years from now asking the same old question: Can they govern?

Paul Burka is the senior executive editor of Texas Monthly.


Copyright © 2004 Texas Monthly

Thursday, December 30, 2004

"How's That Workin' For Ya?"

Dr. Phil is a fraud. If this is (fair & balanced) quackery, so be it.

[x New Republic]
Daddy Knows
by Michelle Cottle

It happens at some point on show after show: The eyes widen. The brows arch. The forehead wrinkles. Then the hulking man in the tall chair leans toward his anxious guest and drawls, "Do you wanna hear what I think?"

The pause for a stammered assent is unnecessary. Everyone knows the question is rhetorical. The folks who ascend the studio's stage are flat-out desperate to hear what the famously opinionated Dr. Phil has to say about whatever problem is mucking up their lives. During a representative week in November, America's favorite psychologist offered his televised counsel to a jilted bride, an aspiring bride with a marriage-phobic fiancé, a runaway bride with five broken engagements, two teenage targets of pedophiles, an engaged couple squabbling over the groom's bachelor party plans, a couple with an out-of-control Weimaraner, a wife enraged by her hubby's Web chats with an ex-flame, and a bevy of "real-life 'Desperate Housewives'" harboring shady secrets ranging from alcoholism to kleptomania. Unrequited love, unfulfilled dreams, adultery, addiction, fear of commitment, fear of rejection, parents with violent kids, parents with lazy kids, parents with kids who refuse to wear anything but pajamas--no topic is too serious or too silly for dissection by Dr. Phil.

Since launching his daytime talk show two years ago, Phillip C. McGraw--Oklahoma native, Texas transplant, and self-described "country boy"--has taken the American psyche by storm. His syndicated program is watched by an estimated 6.6 million viewers. (Only Queen Oprah, his mentor, ranks higher in the pantheon of talk-show gods.) In the past five years, five of his books have hit number one on The New York Times best-seller list. He publishes an online newsletter, writes a monthly column for O magazine, and has done celebrity endorsements for weight-loss products. When he goes on speaking tours, tens of thousands of fans, mostly women, often pay upward of $100 apiece to bear witness. He is greeted like a rock star; gals have been known to mail him their undergarments. In 2001, People magazine named him one of its sexiest people--quite an achievement for a lumbering, middle-aged bald guy with a silly moustache. The following year, he made the magazine's list of "25 Most Intriguing People" as well as Barbara Walters's list of the "Ten Most Fascinating People." In the midst of this year's presidential race, McGraw scored sit-downs with both President Bush and challenger John Kerry (and their wives, of course) to discuss the joys and horrors of modern parenting. Around the same time, in an arguably more impressive display of clout, McGraw made a guest appearance on "Sesame Street" with his puppet alter ego, Dr. Feel. If having a Muppet created in your own image doesn't signal cultural dominance in America, what does?

The foundation of Dr. Phil's multimillion-dollar success is his folksy, tell-it-like-it-is, action-oriented brand of self-help. His website commands visitors to "get real. get smart. get going." Although a licensed psychologist, he disdains the touchy-feely, I'm-OK-you're-OK nonjudgmentalism associated with most therapists and talk-show hosts. ("Analysis is paralysis" is a favorite Dr. Philism.) He boasts of having failed in private practice because he had "no patience for my patients." If a guest on his show is living selfishly, self-destructively, or just plain stupidly, Dr. Phil has no problem telling them so. "What were you thinking?"--exclaimed with varying blends of dismay and disgust--has become one of his signature lines. Another, "You either get it or you don't," cleanly divides humanity into those who recognize Dr. Phil's wisdom and those who are beyond hope.

People love the doc's pull-no-punches, good ol' boy shtick. And McGraw's willingness to denounce poor life choices has earned him kudos from conservative quarters. In its spring 2004 issue, the Manhattan Institute's City Journal cheered Dr. Phil's willingness to pass judgment and pointed to his popularity as proof of Americans' "growing thirst for moral direction." Striking a blow for red-staters, the publication crowed that "the host is at his most compelling when addressing Americans' chronic unseriousness about the meaning and obligations of marriage, especially where children are concerned--in short, doing exactly what the elite media has attacked George W. Bush for doing."

But even nonconservatives get a kick out of McGraw's macho sauciness, largely because, let's face it, all but the leftiest of lefties are tired of pretending that all life choices are equally valid. (In Gallup's latest poll of most-admired celebs, McGraw received higher favorability ratings among liberals and moderates than among conservatives.) Truth be told, some folks don't need to learn to love themselves so much as they need to stop being such jackasses. On some level, the joy of watching Dr. Phil is that he does what most of us would love to do: speak truth to idiocy. Thus, when Christie, a singularly talentless Ohio housewife, whined that her husband was thwarting her dream of country music stardom by refusing to move the family to Nashville, what viewer didn't envy Dr. Phil's willingness to tell the yowling twit just how self-deluded she really was? Then there was Bonnie, the Texas pageant mom who boasted about how her four-year-old daughter had proudly worn braces at age two. Dr. Phil clearly spoke for all non-insane parents when he decreed that no tot should be subjected to that sort of creepiness.

So, while critics and other media types periodically grumble about Dr. Phil's penchant for hyping the most salacious elements of his show, they still give him points for dishing up common-sense advice and for not coddling moronic guests. (In a September 2002 editor's note justifying its gushing cover profile of McGraw, Newsweek praised his assault on "the culture of victimology" and noted, "Privately, we all know we have to take responsibility for decisions we control. It may not be revolutionary advice. ... But it's still an important message with clear resonance.") And everyone is simply gaga over his wry wit and colorful sayings. The prevailing sentiment seems to be that, even if the guy can be a bit arrogant and abrasive and bullying--a reputation McGraw clearly relishes--where is the harm?

Oh, where to begin? For starters, McGraw relies on much the same exploitative freak-show format as Jerry Springer or Jenny Jones, with everyone from drug-addicted housewives to love-starved transsexuals spinning their tales of woe for a salivating audience. But to help himself--and his audience--feel less icky about their voyeurism, Dr. Phil exposes America's dark side under the guise of inspiring hope and change. In Dr. Phil's formulation, cheating couples who air every nauseating detail of their sex lives on national television aren't shameless media whores, they are troubled souls courageous enough to seek help. Even in cases so marginal as to have no bearing on 99.9 percent of viewers--such as parents struggling with a child exhibiting homicidal tendencies--Dr. Phil reassures us that the publicity is beneficial to other families because these problems occur "on a continuum": A six-year-old with low-grade behavior problems today could, if left unchecked, turn out to be a serial killer down the road.

Luckily for anxious parents everywhere, Dr. Phil knows exactly how to save a rambunctious child from becoming the next Jeffrey Dahmer. In fact, McGraw has a blueprint for how to overcome virtually every life challenge, the more frightening and complex, the better. His books are subtitled like little flowcharts to happiness--"The seven keys to weight loss freedom"; "A seven-step strategy for reconnecting with your partner"; "Your step-by-step plan for creating a phenomenal family"--and subdivided into a dizzying array of numbered lists, bullet points, charts, and sidebars. This country boy doesn't just "tell it like it is"; he tells people what to do to get the life/love/weight/kids/self-esteem they want. The lure is irresistible: For a nervous, insecure nation, nothing is more seductive than a stern yet benevolent father figure offering to lift the burden of decision-making from our shoulders. Much like Fyodor Dostoevsky's Grand Inquisitor, McGraw has assumed that burden, dispensing direction, certainty, and moral clarity in an increasingly uncertain world. (Think of him as the George W. of daytime television.) The "self-help" label often applied to him is inaccurate: Dr. Phil isn't teaching people to make good decisions so much as he's teaching them to look to him for solutions. This is the real secret to--and the most disheartening aspect of--the Dr. Phil phenomenon. Forget personal responsibility: what McGraw is promoting is sweet submission to his authority. And, as his popularity grows, so do his ego and his ambitions, to the point where it is increasingly hard to tell if Dr. Phil sees himself more as America's daddy or its messiah.



Jennifer may be the quintessential Dr. Phil supplicant. A worn-down, middle-aged divorcée, Jennifer has journeyed to Los Angeles to have McGraw tell her what to do about her son, Tim, an out-of-control 16-year-old who is heavy into pot and porn and has a long history of stealing to support his drug habit.

Like most segments, Jennifer's begins with a pretaped video outlining the basics of her predicament. These "video diaries" include narration by guests, interviews with friends and family members, melodramatic reenactments, and, most disturbing, grainy footage from unmanned cameras that the show installs in people's homes to capture the unguarded "reality" of their daily lives. (As if most Americans could forget for a moment that they're being filmed for television.) Many of the diaries conclude with a guest's plea: "Dr. Phil, can you please help me [insert appeal to fix tawdry, silly, or sad personal problem here]?" Jennifer's ends more dramatically, as she drops her face into her hands and sobs at the thought of her son's grim future.

Next, we flash to Jennifer and Dr. Phil sitting close together in tall, red chairs at the center of his round stage. McGraw runs down a laundry list of Tim's misdeeds--which include slipping a 13-year-old girl into his bedroom and getting arrested the night before the show's crew showed up for taping--and Jennifer's parental failures. As usual, when a particularly juicy tidbit is revealed--such as the fact that Jennifer's last beau liked to knock Tim around--the camera cuts to a visibly shocked member of the studio audience (which tends to be overwhelmingly female and blindingly white). Eventually, McGraw shifts into judgment mode: "You can't be in that denial anymore!" "It isn't about meeting your emotional needs!" "You are in over your head!" As he scolds and dispenses his Solomonic ruling on what must be done--"I can tell you, that kid needs to be in a supportive and therapeutic environment before the sun sets today"--a cow-eyed Jennifer nods obediently, interjecting her acquiescence to McGraw's verdict: "Yes. Yes." "That's exactly it." "I know." "That's exactly what he needs." And, just to ensure that no one misses the import of this TV moment, McGraw repeatedly stresses that "we are fighting to save this young man's life." ("I truly believe that the decisions we make in the next little bit of time here are going to be determinative," he intones.) Don't touch that dial, people! Springer may have strippers and dwarves, but Dr. Phil is saving lives!

Ever the savvy host, McGraw likes to go to commercial with a stay-tuned teaser. ("The question becomes: Is it too late? Can Jennifer's son be saved?") Jennifer's case provides an extra dash of suspense: Unbeknownst to Tim, who is sequestered in the studio's green room, McGraw has arranged to have the boy enrolled in a wilderness-therapy program in North Carolina--by force if necessary. "I'm gonna try to help him see the wisdom of this," McGraw tells Mom. But, if he resists, the program's "transport agent" is "standing by." How exciting! Viewers may experience the thrill of seeing an unruly adolescent, whose every sin they now know by heart, hauled away in restraints--and they can feel good about watching, because this is all being done in the name of helping others.

Toward the end of the segment, the host physically propels his clinging guest toward the moment of truth. The doctor holds Jennifer's hand. He wraps an arm around her shoulder. He leads her (and the cameras, of course) backstage to the waiting Tim, but only after issuing a stern warning: You've shown courage coming this far, he comforts the trembling Jennifer. But, if you can't handle whatever happens next, he chides, "I don't even want you to go back there with me."



Now seems like a good time to pause and address Dr. Phil's particularly disturbing tendency to drag children into the cesspool of daytime television. Despite his oft-professed obsession with protecting kids from adult realities, McGraw is constantly spotlighting the little darlings on his very adult show, frequently featuring them in pretaped or even in-studio interviews about all the yucky things that go on at home. Often, the youngsters are nothing more than props used to up the emotional stakes in parental dramas. Beleaguered wife Kandi, for instance, wanted Dr. Phil's help with a breathtakingly selfish husband, Ed, who had managed to impregnate his mistress/co-worker. Installing cameras in the family's home, Dr. Phil recorded Ed and Kandi shrieking about the affair in front of their three kids; one particularly heart-warming moment featured a small boy running around the house demanding to know what an STD was.

Lest anyone accuse him of exploiting innocents, McGraw justifies such footage as an instructional tool: evidence that parents need to learn to keep the kids out of it. Stressing its sensitivity, the show digitally blurs the faces of most minors shown in the video diaries. But this only applies to full-face frontal shots. These same children are shown in profile, from behind, and in segmented shots (e.g., from the nose up, from the nose down), and their unaltered voices are heard discussing all manner of domestic seediness. Then there's the fact that everyone these children have ever met--friends, teachers, classmates, ministers, scout leaders, grocery clerks--will see (or hear about) their disgusting, pathetic parents on national television, talking about, for instance, how Ed liked to do it with his mistress on his and Kandi's bed when poor Kandi was off tending to her dying father. With that kind of exposure thrust upon youngsters, digital blurring seems beyond pointless.

For children whose problems are featured on the show, the spotlight is even hotter, such as when 14-year-old Chris came in to chat about why he wanted to pursue a relationship with a 28-year-old pedophile. In addition to revealing horrifying sexual details about Chris's childhood, the segment included Dr. Phil's playing a flirty phone conversation between the boy and his sicko suitor that had been secretly taped by Chris's adoptive dad, who happens to be the original "Survivor" winner, pseudo-celebrity Richard Hatch (a man clearly itching for 15 more minutes of fame). Then there was Dr. Phil's now-infamous prime-time special, a parenting-themed program that aired in September in conjunction with the release of McGraw's latest book, Family First. Featured in this two-hour extravaganza was Eric, an out-of-control nine-year-old who, we learned, liked to smear his own feces on the walls and punch his sister in the lip just to see her bleed. Dr. Phil informed Eric's parents that it was time to engage in "commando parenting," to be more consistent with their discipline and to arrange for the attention-starved lad to spend more time with his dad--all reasonable, if not especially inspired, advice. But McGraw also warned the couple that their son exhibited nine of the 14 characteristics associated with serial killers--and that Jeffrey Dahmer had exhibited only seven. (Cue the ominous background music.) Driving the point home, a picture of young Eric appeared on the screen next to a mug shot of Dahmer. How's that for "keeping the kids out of it"?

But back to Jennifer. In the green room, Dr. Phil talks with Tim about his hideous behavior. When Tim admits that he's terrified of going to prison, Jennifer meekly suggests that maybe that won't have to happen. McGraw lunges, berating the now cowering woman for having "rewritten the code" and for "minimizing the situation." "Are you kidding me?" he demands, scolding her like a naughty child. "What are you thinking?" Jennifer is reduced to babbling apologetically, head bowed, eyes downcast in the face of McGraw's scorn--effectively destroying whatever shred of respect Tim might have had for her parental authority. "Well, you can listen to me, or you can listen to your mother," McGraw snaps at the teen. "But you are in a lot of trouble."

And therein lies the core message of Dr. Phil: If you know what's good for you, you'll listen to me. When preparing to voice his opinion, McGraw often self-deprecatingly insists, "I don't expect you to substitute my judgment for yours." But that is precisely what he expects. It is the premise of his entire show. In fact, perhaps the most honest episodes don't involve McGraw giving "advice" so much as simply doling out rewards or threats to get people to clean up their acts. In a segment featuring a compulsive shopper, McGraw offered the young woman a gigantic diamond ring if she would agree not to buy any frivolous items for 30 days. In a more serious vein, he personally arranges for people with drug addictions or emotional problems or eating disorders to be admitted to top-notch treatment facilities that they could otherwise never afford. He often sends these people off with a good-natured warning that, if they "don't do their homework," they'll have him to answer to--the therapeutic equivalent of your dad telling you to clean your room or he'll whip your butt. Hopefully the individuals receiving such intense assistance wind up healthy. But, whatever else they achieve, Dr. Phil's grand interventions sell the idea that what we all really need is a rich, well-connected fairy godfather to swoop in, reorder our lives, and keep us in line--perhaps not the best message for most adults to internalize.



The Dr. Phil phenomenon began humbly enough. McGraw was born "dirt poor" in Vinita, Oklahoma, in 1950. His mother, Terri, was a clerk; his dad, Joe, was a high school football coach turned oil-equipment salesman who went on to earn a psychology degree in his forties. Joe was also an alcoholic, making childhood less than idyllic for Phil and his three sisters. Taking solace in sports, McGraw managed to win a football scholarship to his dad's alma mater, the University of Tulsa. But injury prompted him to transfer to Midwestern State University in northern Texas, from which he graduated in 1975. The following year, McGraw--who had married and divorced young--married his current wife of 28 years, Robin, with whom he fathered sons Jay and Jordon. In 1979, McGraw received his doctorate in clinical psychology from the University of North Texas and went into psychology practice with his dad.

McGraw often talks about how frustrating he found private practice, largely because of his lack of patience with the process. But, in 1988, his career hit a more concrete bump. The Texas State Board of Examiners of Psychologists formally reprimanded McGraw for carrying on an inappropriate relationship with a 19-year-old therapy client. The young woman, whom McGraw had not only counseled but also hired to work in his office (a no-no according to the board), alleged that their relationship had been controlling and, at times, sexually inappropriate. McGraw denied the charges and settled with the board, but, according to the unauthorized biography The Making of Dr. Phil, the board's disciplinary actions included requiring McGraw to take an ethics class and undergo a year of supervision by a licensed psychologist. A year later, McGraw abandoned his practice.

Divorced from psychology, McGraw contracted acute career ADD, fluttering from one field to the next. He ran a pain clinic. He worked in management training. He consulted for airlines. Having frequently testified in court as an expert on the human brain and behavior, in 1989, McGraw co-founded Courtroom Sciences Inc. (CSI) to help prep witnesses for trial. When, in 1996, Oprah Winfrey found herself in legal trouble with Texas cattlemen because of her on-air statements about mad cow disease, she was referred to CSI. As McGraw tells the story in his first book, Life Strategies, the daytime diva was wasting precious time and energy obsessing about why this was happening until, one day, he took her by the hand and commanded, "Oprah, look at me, right now. ... You'd better get over it and get in the game, or these good ol' boys are going to hand you your ass on a platter."

Emerging triumphant, Oprah decided to introduce the world to her new guru, whom she nicknamed Dr. Tell-It-Like-It-Is Phil. McGraw first appeared on Winfrey's show in April 1998. He was soon a regular, and wildly popular, Tuesday feature. At Oprah's urging, McGraw began entertaining offers for a spin-off show, and, in September 2002, "Dr. Phil" debuted on NBC affiliates across the nation, scoring higher ratings that year than any show of its genre since Oprah's freshman season in 1986. The subsequent chapters of McGraw's story read like the turbo-charged fantasy of some dirt-poor kid from rural Oklahoma.

By now, Dr. Phil has gotten so big that he no longer confines his "life makeovers" to individual families. Earlier this year, McGraw adopted the entire town of Elgin, Texas, located a few miles east of Austin. He chose Elgin not because its citizens are so screwed up, but because they are a painfully normal representation of, as the show has labeled it, "Anywhere, USA." But that hasn't stopped Dr. Phil from moving in and lecturing the entire citizenry on how their lives and families are going down the tubes. And it hasn't stopped him from overhyping social problems (the school system was outraged when McGraw inflated teen pregnancy stats) and airing individuals' dirty little secrets--often to the distress of their friends and neighbors. (The episode in which the high school soccer coach admitted to being a wife-beater and a Web porn-cruiser reportedly came as a particular shock to some parents.) More than a few Elginites are annoyed by McGraw's efforts, as one local put it to the Dallas press, "to save us from ourselves." In fact, a recent Web poll by the Elgin Courier found that around half the town would like Dr. Phil to take his salvation services elsewhere. But, to McGraw, such unenlightened grumbling will likely only serve as further proof of how much the town--and all of America, really--needs him.

Perhaps my favorite Dr. Phil guest was Heather, who wanted to know what to do about the fact that her two-and-a-half-year-old son, Connor, was having recurring nightmares in which Dr. Phil crept into his bedroom and put him "in headlocks." The adorable tot was shown on tape (blur-free), recounting the details of how he actually dreams of two Dr. Phils (quelle horreur!): The good Dr. Phil, who is brown and lives in a little house at the local Wal-Mart, and the bad, headlock-prone Dr. Phil, who is blue.

Since Connor's nightmares began around the time his baby sister was born, McGraw posited that the little guy feels threatened by the family newcomer. (What insight!) And since, in addition to spending so much time with the new baby, Heather spends an hour each day glued to the "Dr. Phil" show--which Connor is allowed to watch but during which he must remain appropriately quiet--McGraw ventured that Connor sees him as yet another drain on Mommy's attention. He cautioned Heather and her husband against coddling Connor (allowing him, for instance, to crawl into their bed after a nightmare) but assured them that the dreams will dissipate as Connor adjusts to his new sister.

Perfectly sound advice. Of course, the even more obvious advice would have been to ask what kind of moron lets her two-year-old watch a show that pokes and prods America's nasty underbelly graphically enough to give the average adult nightmares. To quote the good ol' boy himself, "What in the hell is this gal thinking?"

It's a question more of us should be asking about our national surrender to Dr. Phil.

Michelle Cottle is a senior editor at TNR.

Copyright © 2004 The New Republic

Need Something For Your Rear Window?


Window Sticker of the Day
Copyright © 2004 Village Voice

George W. Bush Is The War-Criminal-in-Chief!

Don Imus (the I-Man) routinely refers to the Dickster and the Rumster as war criminals. The I-Man refrains from applying that sobriquet to His Fraudulency, the POTUS. ¡No más! Enough! If this is (fair & balanced) outrage, so be it.

[x NYTimes]
Washington's New Year War Cry: Party On!
By FRANK RICH

On the fourth day 'til Christmas, the day that news of the slaughter at the mess tent in Mosul slammed into the evening news, CBS had scheduled a special treat. That evening brought the annual broadcast of "The Kennedy Center Honors," the carefree variety show in which Washington's top dogs mingle with visitors from that mysterious land known as the Arts and do a passing (if fashion-challenged) imitation of revelers at the Oscars. This year, like any other, the show was handing out medals to those representing "the very best in American culture," as exemplified by honorees like Australia's Dame Joan Sutherland and Britain's Sir Elton John. Festive bipartisanship reigned. Though Sir Elton had said just three weeks earlier that "Bush and this administration are the worst thing that has ever happened to America," he and his boyfriend joined the president and Mrs. Bush in their box. John Kerry held forth in an orchestra seat below.

"The Kennedy Center Honors" is no ratings powerhouse; this year more adults under 50 elected to watch "The Real Gilligan's Island" on cable instead. But I tuned in, curious to see how this gathering of the capital's finest might be affected by the war. The honors had actually been staged and taped earlier in the month, on Dec. 5. That day the morning newspapers told of more deadly strikes by suicide bombers in Mosul and Baghdad, killing at least 26 Iraqi security officers, including 8 in a police station near the capital's protected Green Zone. There were also reports of at least four American casualties in other firefights.

But if anyone at the Kennedy Center so much as acknowledged this reality unfolding beyond the opera house, it was not to be found in the show presented on television. The only wars evoked were those scored by another honoree, John Williams, whose soundtrack music for "Saving Private Ryan" and "Star Wars" was merrily belted out by a military band. (Our delicate sensibilities were spared the sight of an actual "Private Ryan" battle scene, however, lest the broadcast risk being shut down for "indecency.") The razzle-dazzle Hollywood martial music, the what-me-worry Washington establishment, the glow of money and red plush: everything about the tableau reeked of the disconnect between the war in Iraq and the comfort of all of us at home, starting with those in government who had conceived, planned, rubber-stamped and managed our excellent adventure in spreading democracy.

Ordinary people beyond Washington, red and blue Americans alike, are feeling that disconnect more and more. On the same day that CBS broadcast the Kennedy Center special, an ABC News/Washington Post poll found that 70 percent of Americans believed that any gains in Iraq had come at the cost of "unacceptable" losses in casualties and that 56 percent believed the war wasn't "worth fighting" - up 8 percent since the summer. In other words, most Americans believe that our troops are dying for no good reason, even as a similar majority (58 percent) believes, contradictorily enough, that we should keep them in Iraq.

So the soldiers soldier on, and we party on. As James Dao wrote in The New York Times, "support our troops" became a verbal touchstone in 2004, yet "only for a minuscule portion of the populace, mainly those with loved ones overseas, does it have anything to do with sacrifice." Quite the contrary: we have our tax cuts, and a president who promises to make them permanent. Such is the disconnect between the country and the war that there is no national outrage when the president awards the Medal of Freedom to the clowns who undermined the troops by bungling intelligence (George Tenet) and Iraqi support (Paul Bremer). Such is the disconnect that Washington and the news media react with slack-jawed shock when one of those good soldiers we support so much speaks up at a town hall meeting in Kuwait and asks the secretary of defense why vehicles that take him and his brothers into battle lack proper armor.

Much has been made of this incident, yet it hardly constituted big news. It's no secret to anyone, including Donald Rumsfeld, that the troops have often been undersupplied. Dana Priest of The Washington Post heard soldiers asking the defense secretary "similar questions about their body armor" when traveling with him a year ago. In October, 23 members of an Army Reserve unit disobeyed a direct order to deliver fuel, partly because they decided that the vulnerability of their trucks made the journey tantamount to a suicide mission. As far back as last spring, Stars and Stripes was reporting that desperate troops were using sandbags as makeshift vehicle armor. Even now, reports The Los Angeles Times, National Guard soldiers are saying they have been shipped to war from Fort Bliss with "chronic illnesses, broken guns and trucks with blown transmissions."

When Mr. Rumsfeld told Specialist Thomas Wilson in Kuwait that the only reason the troops lacked armor was "a matter of production and capability," he was lying. The manufacturers that supply the armor were quick to respond that they had been telling the Pentagon for months that they could increase production, in the case of one company (ArmorWorks in Arizona) by as much as 100 percent. But that news was quickly drowned out by cable and talk radio arguments over whether Mr. Wilson should or should not have consulted with an embedded reporter about the phrasing of his question. Soon Mr. Rumsfeld was off to Iraq for a P.R. tour (message: I care) in which he used troops as photo-op accessories and thanked a soldier for asking a softball question "not planted by the media." Washington could go back to worrying about more pressing domestic problems, like how to cook the books so that Social Security can be fixed cost-free.

The truth is that for all the lip service paid to supporting the troops, out of sight is often out of mind. Even the minority that remains gung-ho about the war in Iraq is quick to blame the grunts for anything that goes wrong. Specialist Wilson, Rush Limbaugh said, was guilty of "near insubordination" for his question in Kuwait; the poor defense secretary "was set up," whined The New York Post. The same crowd tells us that a few low-level guards are solely responsible for the criminal abuse of prisoners at Abu Ghraib and in Guantánamo Bay, not any policy-setting higher-ups who may be sitting in that audience at Kennedy Center. President Bush even tried to pass the buck for his premature aircraft carrier victory jig to the troops, telling the press months later that "the 'Mission Accomplished' sign, of course, was put up by the members of the U.S.S. Abraham Lincoln, saying that their mission was accomplished." Of course.

Back then, the Pentagon projected that our military occupation of Iraq would end in December 2004. But two days after appearing in the box at the Kennedy Center Opera House, the president donned a snappy muted green "commander in chief" jacket - a casual Friday version of the full "Top Gun" costume he'd worn on the Lincoln - to address marines at Camp Pendleton in California who were going to war, not coming home. (Slate reported this week that "nearly one-quarter of U.S. combat dead in 2004 were stationed in Camp Pendleton.") It was the anniversary of Pearl Harbor, and Mr. Bush drew the expected analogy: "Just as we defeated the threats of fascism and imperial communism in the 20th century, we will defeat the threat of global terrorism." But three years into it, can we win a war that most of the country senses has gone astray in Iraq and that the party in power regards as a lower priority than lower taxes?

The ethos could hardly have been more different during the World War II so frequently invoked by Mr. Bush. As David Brinkley recounted in his 1988 history, "Washington Goes to War," the Roosevelt administration's first big push "was a tremendous voluntary program to reduce the deficit, encourage saving, trim spending and thus curb inflation - the sale of war bonds." Though bonds would not in the end pay for the war - that would require the sacrifice of paying taxes - F.D.R. believed that his campaign "would give the public a sense of involvement in a war being fought thousands of miles away, a war so distant many Americans had difficulty at times remembering it was there at all." Gen. George Marshall, the Army's chief of staff, took it on himself to write notes by hand to the family of each man killed in battle until the volume forced the use of Western Union telegrams.

Well, Mr. Rumsfeld has sworn he'll stop delegating condolence letters to his Autopen. But otherwise the contrast between the Washington that won World War II and the Washington fighting a war in Iraq is so striking it can even be found in the cultural lineage of the Kennedy Center show. That show's producer, as it happens, is George Stevens Jr., the son of the great Hollywood filmmaker George Stevens. In his day, the elder Stevens created his own wartime Washington entertainment: a glorious 1943 romantic comedy, "The More the Merrier" (just out on DVD), set in the newly mobilized capital, that, though fiction, is in itself a striking document of the difference between then and now. While it portrays a patriotic Washington as frivolously beset by party animals, bureaucrats and lobbyists as today's, there's an underlying ethos of shared sacrifice, literally down to the living arrangements necessitated by a housing shortage. It might as well be a different civilization.

Washington's next celebration will be the inauguration. Roosevelt decreed that the usual gaiety be set aside at his wartime inaugural in January 1945. There will be no such restraint in the $40 million, four-day extravaganza planned this time, with its top ticket package priced at $250,000. The official theme of the show is "Celebrating Freedom, Honoring Service." That's no guarantee that the troops in Iraq will get armor, but Washington will, at least, give home-front military personnel free admission to one of the nine inaugural balls and let them eat cake.

Copyright © 2004 The New York Times Company

Wednesday, December 29, 2004

There You Go Again: Never Lose An Argument In 2005!

Arthur Schopenhauer (1788-1860) was one of the greatest philosophers of the 19th century, yet he seems to have had more impact on literature (e.g., Thomas Mann) and on people in general than on academic philosophy. Perhaps that is because, first, he wrote very well, simply and intelligibly (unusual, we might say, for a German philosopher, and unusual now for any philosopher); second, he was the first Western philosopher to have access to translations of philosophical material from India, both Vedic and Buddhist, by which he was profoundly affected, to the great interest of many; and, third, his concerns were with the dilemmas and tragedies, in a religious or existential sense, of real life, not just with abstract philosophical problems. If this is (fair & balanced) disputation, so be it.

[x New Statesman]
The Art of Always Being Right
Arthur Schopenhauer; with an introduction by A C Grayling (2004)
Reviewed by George Walden

Schopenhauer's sardonic little book, laying out 38 rhetorical tricks guaranteed to win you the argument even when you are defeated in logical discussion, is a true text for the times. An exercise in irony and realism, humour and melancholy, this is no antiquarian oddity, but an instruction manual in intellectual duplicity that no aspiring parliamentarian, trainee lawyer, wannabe TV interviewer or newspaper columnist can afford to be without.

The melancholy aspect comes in the main premise of the book: that the point of public argument is not to be right, but to win. Truth cannot be the first casualty in our daily war of words, Schopenhauer suggests, because it was never the bone of contention in the first place. "We must regard objective truth as an accidental circumstance, and look only to the defence of our own position and the refutation of the opponent's . . . Dialectic, then, has as little to do with truth as the fencing master considers who is in the right when a quarrel leads to a duel." Such phrases make us wonder whether his book was no more than a bitter satire, an extension of Machiavellian principles of power play from princes to individuals by a disappointed academic who took 30 years to find an audience for his major work, The World as Will and Idea. Perhaps, but only partly. With his low view of human nature, Schopenhauer is also saying that we are all in the sophistry business together.

The interest of his squib goes beyond his tricks of rhetoric: "persuade the audience, not the opponent", "put his theory into some odious category", "become personal, insulting, rude". Instinctively, we itch to apply it to our times, whether in politics, the infotainment business or our postmodern tendency to place inverted commas, smirkingly, around the very notion of truth. Examples of jaw-dropping sophistry by public figures (my own favourite is Tony Blair defending his quasi-selective choice of school for his son on the grounds that he did not wish to impose political correctness on his children: see Schopenhauer's rule number 26: "turn the tables") are easy enough to find. It is more entertaining to see his theory in the light of our national peculiarities.

The flip side of our "healthy scepticism" can be a disinclination to trouble ourselves with rational discussion at all, and a tediously moderate people can be bored by its own sobriety. So it is that, in debate, we prefer to be stirred by passions, or simply amused. Hence the rampant nostalgia for the old political order, dominated by orators such as Michael Foot or Enoch Powell. Each did real damage to the country, Foot with his patrician self-abasement in the face of trade union power, Powell on race, and both with their culpable fantasies about Russia.

"Well you say that," comes the predictable response - a handy rhetorical trick in itself - "but let's not get into their policies; we could go round that buoy for ever" (see trick number 12: "choose metaphors favourable to your proposition"). "The point is that they were such wonderfully passionate, col-ourful and entertaining debaters, compared to the managerial drabness of the House of Commons today." (Trick 29 recommends diversion from the point at issue.) The pay-off line follows quickly (draw your conclusions smartly, says trick 20). "If only we had Boris as Tory leader, it would perk the place up no end!" (This is not wholly invention. Tory and Labour columnists have both written in this vein.)

Perhaps because Schopenhauer was so very un-British, his 38 points overlooked our favourite rhetorical trick: coming up with "quirky" or "original" responses to serious questions. (The nearest he gets is trick number 36: "bewilder your opponent".) In Britain, a willed eccentricity, the cheapest form of distinction, works because it is part of our top-down ethos. The game is to dodge the issue in such a way as to show yourself above it - for example, by throwing off dandyish opinions. Take any premise ("Boris Johnson is not a serious contender for prime minister"), invert it, toss it to the herd with a supercilious smile - and the herd will warm to you, because we do so love a maverick, don't we? For similar reasons, "controversialists" (that is, vulgar cynics who argue positions they do not necessarily believe, the better to astound the impressionable masses) are a very British phenomenon.

The anti-intellectualism all this implies is not, however, a uniquely British trait, and is covered in Schopenhauer's list. "If you know that you have no reply to the arguments your opponent advances . . . declare yourself to be an incompetent judge: 'What you say passes my poor powers of comprehension.'" Your opponent stands instantly convicted of pretension, a crime without appeal in democracies, of which Schopenhauer was no admirer. Truth and logic, he comes close to saying, get you nowhere in a mass society. "The only safe rule, therefore, is [to dispute] only with those of your acquaintance of whom you know that they possess sufficient intelligence and self-respect not to advance absurdities."

In a frequently light-hearted book, this is the least amusing message. The suggestion is that the audiences for serious discussion are doomed to shrink - and remember that Schopenhauer never experienced the sophistry of TV images, whose deliberate or, more frequently, casual mendacity a mere 38 points would not suffice to explain. Yet has his lugubrious prediction proved true? Or do we rather get a feeling, not of an absolute decline in standards of public debate, but of missed potential - something even the BBC has apparently begun to recognise? How many times have we listened to a radio or TV debate on art or politics or literature and asked ourselves, even as we are lulled by the undemanding discussion: are these the best people they can come up with? The answer is yes and no. Yes because in media terms they are the best: practised "communicators" with every crowd-pleasing response at the ready. And no because we have all read or heard or known people far more interesting and far more informed about the disciplines in question. Sadly, they tend to be folk who are not up to speed on their 38 points and who think the truth matters, and so, communication-wise, they are deemed useless. Still, they exist.

If your preference is nevertheless for Schopenhauer's tragic vision of a world in thrall to debate that is indifferent to the truth, examples are not lacking, not just in art or politics, but in the allegedly objective and internationalist scientific world. A brief period as minister for science taught me that when it comes to rubbishing a rival's research or inveigling funds for your own, objectivity is out, and foreigners become a joke. Now I hear neo-Darwinian atheists lambasting as primitive and irrational every religion except the most populous and, in its extreme form, the most dangerous. Why are scientists so intellectually dishonest? For the same reason that the Archbishop of Canterbury hides behind procedural sophistry (needless commissions of inquiry and the like, when the need for liberalism is clear) in dealing with homosexuality in the Church: politics, dear boy. Which does rather diminish the right of scientists and churchmen to look down on politics as a scurvy trade.

The palm for rhetorical shamelessness must nevertheless go to US presidents. "There you go again," said Ronald Reagan, annihilating with a grin the very concept of rational debate, and the right loved him for it. "I did not have sexual relations with that woman," Bill Clinton assured us, with his emetic sincerity, and the left - especially women - adore him still. And not even the melancholic German predicted that the world's most powerful democracy would one day be run by a president who cannot be accused of sophistry chiefly because he cannot talk at all. And they say Schopenhauer was a pessimist.

George Walden is the author of The New Elites: Making a Career in the Masses (Penguin). This review first appeared in the New Statesman.

© New Statesman 2004

An Economic Tsunami?

While W rearranges the deck chairs on the Titanic, we face the economic equivalent of the disaster that recently struck Southeast Asia, India, and the Horn of Africa. Sobering stuff as we slouch into 2005. If this is (fair & balanced) dread, so be it.

[x Washington Post]
The Next Economy
By Robert J. Samuelson

We are undergoing a profound economic transformation that is barely recognized. This quiet upheaval does not originate in some breathtaking technology but rather in the fading power of forces that have shaped American prosperity for decades and, in some cases, since World War II. As their influence diminishes, the economy will depend increasingly on new patterns of spending and investment that are still only dimly apparent. It is unclear whether these will deliver superior increases in living standards and personal security. What is clear is that the old economic order is passing.

By any historical standard, the record of these decades -- despite flaws -- is remarkable. Per capita income (average income per person) is now $40,000, triple the level of 60 years ago. Only a few of the 10 recessions since 1945 have been deep. In the same period, unemployment averaged 5.9 percent. The worst year was 9.7 percent in 1982. There was nothing like the 18 percent of the 1930s. Prosperity has become the norm. Poverty and unemployment are the exceptions.

But the old order is slowly crumbling. Here are four decisive changes:

• The economy is bound to lose the stimulus of rising consumer debt. Household debt -- everything from home mortgages to credit cards -- now totals about $10 trillion, or roughly 115 percent of personal disposable income. In 1945, debt was about 20 percent of disposable income. For six decades, consumer debt and spending have risen faster than income. Home mortgages, auto loans and store credit all became more available. In 1940, the homeownership rate was 44 percent; now it's 69 percent. But debt can't permanently rise faster than income, and we're approaching a turning point. As aging baby boomers repay mortgages and save for retirement, debt burdens may drop. The implication: weaker consumer spending.

• The benefits from defeating double-digit inflation are fading. Remember, in 1979, inflation peaked at 13 percent; now it's 1 to 3 percent, depending on the measure. The steep decline led to big drops in interest rates and big increases in stock prices (as interest rates fell, money shifted to stocks). Stocks are 12 times their 1982 level. Lower interest rates and higher stock prices encouraged borrowing and spending. But these are one-time stimulants. Mortgage rates can't again fall from 15 percent (1982) to today's 5.7 percent. Nor will stocks soon rise twelvefold. The implication: again, weaker consumer spending.

• The welfare state is growing costlier. Since the 1930s, it has expanded rapidly -- for the elderly (Social Security, Medicare), the poor (Medicaid, food stamps) and students (Pell grants). In 2003, federal welfare spending totaled $1.4 trillion. But all these benefits didn't raise taxes significantly, because lower defense spending covered most costs. In 1954, defense accounted for 70 percent of federal spending and "human resources" (aka welfare), 19 percent. By 2003, defense was 19 percent and human resources took 66 percent. Aging baby boomers and higher defense spending now doom this pleasant substitution. Paying for future benefits will require higher taxes, bigger budget deficits or deep cuts in other programs. All could hurt economic growth.

• The global trading system has become less cohesive and more threatening. Until 15 years ago, the major trading partners (the United States, Europe and Japan) were political and military allies. The end of the Cold War and the addition of China, India and the former Soviet Union to the trading system have changed that. India, China and the former Soviet bloc have also effectively doubled the global labor force, from 1.5 billion to 3 billion workers, estimates Harvard economist Richard Freeman. Global markets are more competitive; the Internet -- all modern telecommunications -- means some service jobs can be "outsourced" abroad. China and other Asian countries target the U.S. market with their exports by fixing their exchange rates.

Taken at face value, these are sobering developments. The great workhorse of the U.S. economy -- consumer spending -- will slow. Foreign competition will intensify. Trade agreements, with more countries and fewer alliances, will be harder to reach. And the costs of government will mount.

There are also global implications. The slow-growing European and Japanese economies depend critically on exports. Until now, that demand has come heavily from the United States, which will run an estimated current account deficit of $660 billion in 2004. But if American consumers become less spendthrift -- because debts are high, taxes rise or benefits are cut -- there will be an ominous collision. Diminished demand from Europe, Japan and the United States will meet rising supply from China, India and other developing countries. This would be a formula for downward pressure on prices, wages and profits -- and upward pressure on unemployment and protectionism.

It need not be. China and India are not just export platforms. Billions of people remain to be lifted out of poverty in these countries and in Latin America and Africa. Ideally, their demands -- for raw materials, for technology -- could strengthen world trade and reduce reliance on America's outsize deficits. If so, exports (and manufacturing) could become the U.S. economy's next great growth sector. Already, the dollar has depreciated 15 percent since early 2002; that makes U.S. exports more price-competitive.

What's at issue is the next decade, not the next year. We know that the U.S. economy is resilient and innovative -- and that Americans are generally optimistic. People seek out new opportunities; they adapt to change. These qualities are enduring engines for growth. But they will also increasingly have to contend with new and powerful forces that may hold us back.

Robert J. Samuelson, a contributing editor of Newsweek, has written a column for The Washington Post since 1977. His column generally appears on Wednesdays.

© 2004 The Washington Post Company

The Fraudulent Four

Historians remind me of French foreign minister Charles Maurice de Talleyrand's meditation on the Bourbon kings: "They have learned nothing and forgotten nothing." Stephen E. Ambrose was a thief. Doris Kearns Goodwin is a thief. Michael Bellesiles is a thief. Joseph Ellis, while not convicted of plagiarism (yet), is a liar. David Greenberg styles these historians "The Fraudulent Four." If this is (fair & balanced) flimflam x 4, so be it.

[x History News Network]
The Lessons of the History Scandals
By David Greenberg

In 2002, I stumbled across an act of plagiarism by the historian Stephen E. Ambrose that had gone undiscovered, or at least unmentioned, in the reams of pages then being devoted to his scholarly transgressions. In the third volume of his Nixon biography, Ambrose wrote, "Two wrongs do not make a right, not even in politics, but they do make a precedent." It was a clever aphorism—uncommonly clever, I now realize, for a man normally given to brown-bag prose. The real author was Richard Nixon's longtime pal and apologist Victor Lasky, who in his 1977 best seller It Didn't Start With Watergate had written, "Granted that two wrongs don't make a right, but in law and politics, two wrongs can make a respectable precedent."

At the time, Ambrose was under fire for numerous similar instances of using other people's words without giving credit. But I saw no point in piling on. Ambrose had been sufficiently exposed—stolen phrases were surfacing in book after book—and he wasn't budging from his defense that as a popular historian, he wasn't bound by scholarly rules. And why should he? In academia, Ambrose had become a joke for his mass production of feel-good war stories before the plagiarism, which only sealed his reputation; outside academia, he remained beloved even after the imbroglio. (I did mention the Lasky-Ambrose incident in my book Nixon's Shadow, but to make a larger point.)

Concurrent with the Ambrose scandal, historian Doris Kearns Goodwin was found to have committed similar (though fewer) acts of plagiarism, albeit unintentionally. (Contrary to popular belief, plagiarism needn't be deliberate to warrant the name.) Also that winter, Emory University began investigating charges that Michael Bellesiles, a historian on its faculty, had invented or grossly distorted data to support the controversial argument, advanced in his prize-winning Arming America: The Origins of a National Gun Culture, that guns weren't prevalent in the antebellum United States. The previous summer, Mount Holyoke historian Joseph Ellis had admitted to lying about his past to students and others, fabricating tales about having served in Vietnam.

Occurring so soon after one another, these flaps struck many commentators as related symptoms of some deeper affliction gripping the historical profession or the academy. Some saw an expression of postmodernism's dangerous relativizing of truth; others discerned a cautionary tale about the perils of writing popular history. Now come two intelligent books about these affairs that implicitly agree that the coincidence of these scandals says something about the state of the profession. Peter Charles Hoffer's Past Imperfect: Facts, Fictions, Fraud—American History From Bancroft and Parkman to Ambrose, Bellesiles, Ellis, and Goodwin and Ron Robin's Scandals and Scoundrels: Seven Cases That Shook the Academy (which omits the Goodwin case but addresses four other flaps involving nonhistorians, such as the "Sokal Hoax" and the fabrications of Nobel Peace Prize-winner Rigoberta Menchú) both try to put these events in historical perspective.

Hoffer frames the scandals as the culmination of long-brewing tensions in the historical profession. Reviewing the history of professional history, he recounts how the New Left scholars of the 1960s overthrew the so-called "consensus history" of their forebears, demolishing myths of a harmonious American past and discrediting the history-as-hero-worship on which generations were weaned. But while the New Left historians won out in academia, they never brought most lay history readers around to their viewpoint. Most of the public not only continues to regard history as a discrete, verifiable body of facts—about presidents, wars, great events, and the like—but also likes its history to portray America as, in one witticism, having been born perfect and improving ever since. The gulf between these two conceptions of history remains: The public tends to prefer affirmative tales of political and military triumph, while scholars like skeptical, critical accounts, often focused on the slighted stories of women, African-Americans, and other minorities.

Hoffer is right to highlight this gulf between two conceptions of history. But it's not clear how that gulf produced these recent brouhahas. Sometimes Hoffer seems to fault the post-1960s historians who, he says unpersuasively, "did not have the same motivation as their predecessors for shielding established historical writers ... from criticism." At other times he talks of a "conservative backlash" eager to trash these historians. And on still other occasions he seems to endorse the facile explanation that in their eagerness to win fame, readers, and wealth, these historians fatally cut corners.

The four historians Hoffer discusses not only committed very different offenses; they were "popular" in very different ways. Goodwin won celebrity not by churning out best sellers—she has spent years on each of her books—but through her sunny punditry on PBS and NBC News. Until Ellis snagged a Pulitzer Prize with his best-selling Founding Brothers in 2000, his work, although elegantly written and published by trade presses, hardly resembled pop history. Bellesiles, despite the critical acclaim initially afforded to Arming America, never attained superstardom within the profession or substantial recognition outside it. Only Ambrose might be fairly accused of jettisoning standards to sell books, but the discovery, by Forbes's Mark Lewis, that Ambrose's plagiarism habit began way back in 1964 with his Ph.D. dissertation, published by Louisiana State University Press, suggests his motives were far more complex. Besides, scores of historians, inside and outside the academy, succeed every year in writing history that finds general readers without sacrificing scholarly rigor. Hoffer's popular/scholarly dichotomy is too simplistic.

More unfortunately, Hoffer turns censorious toward the end of his book, praising what he rightly describes as an "auto-da-fé, complete with stake and faggots" perpetrated by opinion-mongers in the media. As a former member of the American Historical Association's Professional Division, Hoffer is understandably peeved that the organization chose to stop its practice of adjudicating charges like those leveled at the Fraudulent Four. But in a book premised on the idea that this quartet of concurrent scandals stemmed from causes deeper than individual character, his solution—rebukes doled out by a professional body—seems naive. It was wise for the AHA to remove itself from the impossible business of resolving these disputes about culpability and instead to try to spread awareness of what good scholarship entails.

In contrast to Hoffer's stern conclusion, Robin's Scandals and Scoundrels is refreshingly free of moralism and alarmism—a must-read for anyone now fuming that Goodwin is back on television or Ellis back on the best-seller list. Though not condoning his subjects' behavior, Robin is more analytical than judgmental, more interested in understanding the meaning of these offenses than in administering another slap to their sorry culprits. "I find," he notes, "that the debates on academic impropriety discussed in this book suggest vibrancy rather than trauma." They demonstrated, he argues, the continuous process of establishing norms for the profession.

There's no reason to believe that acts of academic impropriety are any more common today than they used to be. What changed is the adjudication of wrongdoing, a task that the popular media appropriated from academia. By 2002, the popularity of Ambrose, Bellesiles, Ellis, and Goodwin had placed them under the watchful eye of an increasingly scandal-obsessed and intolerant media. These authors' "popularity" is relevant not because writing for the public somehow encourages shoddiness—it doesn't—but because their prominence allowed reporters and pundits to inflate their acts of wrongdoing into national scandals.

Arbiters in the media rushed in to enforce norms of behavior when they believed that academics were becoming lax. But where scholars tend to resolve disputes through careful, drawn-out deliberation, the media incline toward sensationalism and black-and-white verdicts. Moreover, in the last decade many Americans, including journalists, have adopted a primitive zero-tolerance moralism—a punitive code that encourages the trying of minors as adults, three-strikes-you're-out sentencing, the Borking of Cabinet nominees for minor mistakes, the regular-as-clockwork feeding frenzies in presidential campaigns, and the impeachment of a president for lying about sex. And they have relished the schadenfreude of the downfall of a famous historian, politician, or other celebrity.

For all the media hysteria that standards had fallen, it should be noted that Bellesiles was stripped of his job, Ellis suspended for a year, and Goodwin bounced from the PBS NewsHour and the Pulitzer Prize board. These were all perfectly appropriate punishments. Ambrose, as an author who simply didn't care about his scholarly reputation anymore and who could get paid handsomely for cookie-cutter best sellers, seemed distressingly beyond penalty. But, a lifelong smoker who had testified in court on behalf of big tobacco, he died of lung cancer in October 2002.



[Sidebar]
The point was that Nixon's apologists, in unflaggingly trumpeting the case for Nixon's innocence, contributed to a dynamic that helped Nixon: If they persuaded seemingly neutral arbiters such as Ambrose of their positions, those arbiters would then be invoked by the apologists to claim victory. So, for example, Rabbi Baruch Korff, one of Nixon's more colorful defenders from his Watergate days, wrote his memoir in 1995, citing the line, "Two wrongs do not make a right … but they do make a precedent." Korff was able to credit the aphorism to Nixon's authoritative "biographer Stephen Ambrose," rather than to his old crony Lasky. Korff used Ambrose's standing to back up his own belief that "the justification for defending Nixon is all the stronger given … what we now know other presidents did." Needless to say, Ambrose hadn't convinced Korff of that belief; he had held it all along and was simply using Ambrose to enhance the credibility of his statement.





[Sidebar]
"Consensus history," a term coined by the late Johns Hopkins historian John Higham, is commonly used by professional historians to refer to a view of American history that was dominant in the 1950s. Unlike the Progressive historians, such as Charles and Mary Beard, who preceded them, or the New Left historians who followed them, consensus historians saw in the American past more unity than conflict. Willing to posit distinctive national traits, they accepted notions of American exceptionalism and an American character. Some consensus historians, such as Daniel Boorstin, celebrated this unity, while others, such as Richard Hofstadter, lamented it. But they generally agreed on the possibility of writing master narratives about a unitary American people, focused on familiar highlights such as the American Revolution and the Civil War.





[Sidebar]
Hoffer agrees with a distinction I made in Slate in March 2002 that Goodwin's behavior—particularly her response to the revelations—was more honorable than Ambrose's. He also recognizes that Bellesiles's wrongs were of a different order of dishonesty than any of the others', and that Ellis's were the mildest. Yet his framework of "popular" versus "scholarly" history unwisely forces all of them into the same category of unscrupulous historians on the make and in search of stardom, thereby neglecting distinctions he otherwise duly notes.



David Greenberg is the author of Nixon's Shadow: The History of an Image (2003). He teaches history at Rutgers University and writes the History Lesson column for Slate.

This piece first ran in Slate and is reprinted with permission of the author.


Copyright © 2004 David Greenberg


Tuesday, December 28, 2004

The Celluloid Trickster

The sole film—of which I am aware—featuring W is Michael Moore's "Fahrenheit 9/11." Now, all of the films that somehow use the Trickster have been brought together in one essay. Someday, W will get his from Hollywood, too. If this is (fair & balanced) expectancy, so be it.

[x Slate]
Tricky Dick Flicks
The trouble with Nixon movies (including Sean Penn's new one).
By David Greenberg

The films in which Richard Nixon has been a character, an icon, a point of reference, or a joke include Nixon, Dick, Secret Honor, All the President's Men, The Killing Fields, The Parallax View, The Ice Storm, Forrest Gump, The Buena Vista Social Club, Point Break, Maid in Manhattan, Tricia's Wedding, Sleeper, Missing, The Big Lebowski, Shampoo, and Star Trek VI: The Undiscovered Country. To name but a few.

Not many of these movies have featured Nixon as a main character. And among those that have done so, only a few—notably Philip Baker Hall's volcanic Nixon in Secret Honor (1984)—have risen above David Frye-style mimicry or hackneyed villainy. Why should this be? In his virtuosic new book Nixon at the Movies, Boston Globe reporter Mark Feeney nails down part of the problem: "As Oliver Stone found out, the movies can hardly do justice to Nixon, for nothing they can show can provide weirder or more compelling images than did the man's own overwhelming actuality." (Philip Roth made a similar point about Nixon in 1961, using him as an example of someone who was "so fantastic, so weird and astonishing, that I found myself beginning to wish I had invented" him.)

It's not surprising, then, that the 37th president most often figures in cinema not as a character but as a touchstone of an era of frustration and corruption in which the American Dream seemed to be grinding to a halt. One classic example is Constantin Costa-Gavras' Missing (1982), about the plight of a left-wing American journalist killed in Chile in 1973 by a U.S.-supported right-wing junta. In the film, a large official portrait of Nixon hangs prominently in the climactic scene, as the U.S. ambassador baldly lies to the missing journalist's father. In Hal Ashby's Shampoo (1975), which is set on Election Day, 1968, Nixon's image pops up on TV screens throughout the film as an omen of America's darkening future. It is in this vein that Tricky Dick appears in The Assassination of Richard Nixon, a movie that opens this week.

Thankfully, Feeney doesn't focus narrowly on Nixon in the movies, so he never gets bogged down explaining the symbolism obvious in some of these films. Instead, Feeney construes his subject far more broadly—hence, Nixon at the movies. Feeney uses many films in which Nixon isn't referenced at all, from Double Indemnity to The Conversation, as lenses for interpreting the president and his times. And most originally, he ponders Nixon's infatuation with the silver screen, revealing the loner president to be a compulsive moviegoer who watched more than 500 pictures while in office. "The moviegoer's fundamental yearning and loneliness," he writes, "… find an unmistakable embodiment in Nixon."

The Assassination of Richard Nixon, the debut film from director Niels Mueller, does traffic in Nixonian cliché, but it would surely have provided some rich material for Feeney. For just as in Shampoo, Nixon's flickering visage on the TV screen recurs as a trope in Assassination, signaling a distant and menacing power whose influence permeates the characters' world.

Assassination clearly owes many debts to Martin Scorsese's Taxi Driver (1976). For starters, there's the last name of Sean Penn's Sam Bicke, the film's aspiring assassin. And although Nixon didn't figure explicitly in Scorsese's film, critic David Thomson, for one, glimpsed his shadow in it. "Put it like this," Thomson wrote (in a quotation Feeney reprints), "two people—Richard Nixon and Travis Bickle—got away with things in the mid-seventies that should not have passed." In Assassination, similarly, Penn's Bicke is nearly a Nixon doppelgänger, a struggling, friendless loser who can't get over his resentment of those who have it easier than he does. Both men, in a bid for immortality, tape themselves, only to have their recordings serve as the ultimate self-incrimination.

Both are bad salesmen, too. Where Nixon inspired the joke, "Would you buy a used car from this man?" Bicke is a down-on-his-luck, inept office-furniture huckster going through a divorce who can't accept or cope with the mounting setbacks and small humiliations in his life: his wife's wish to be rid of him; the lies he must tell customers to sell his wares; the Small Business Administration's rejection of his application for a loan to start his own company. Even his boss's demand that he shave his mustache stings. So he decides to hijack a plane and fly it into the White House to kill the president.

Yet if Assassination's story is trite, it does successfully evoke the dead-end frustration that Nixon's presidency embodied in the early 1970s—a time now safely enough in the past to move beyond nostalgia and into history. The despair of the years from 1963 to about 1975—the years from Dallas to Watergate—seemed to induce in many Americans a shocking readiness to turn to assassination as an answer to personal or political problems. Everyone remembers or knows about the shootings of JFK, RFK, Martin Luther King, and George Wallace, and probably also the attempts on Gerald Ford's life in September 1975. But Nixon had his would-be assassins—and not only Samuel J. Byck, as his name was really spelled. In November 1968, three Yemeni men were arrested for conspiring to kill the newly elected president, and in August 1973 the Secret Service discovered a scheme to murder him on a visit to New Orleans.

The frequency of such assassination plots in these years wasn't mere coincidence; it was one of the scarier symptoms of the erosion of the traditional bonds of political authority that led to Nixon's ouster. On both political extremes, violence seemed like the best way out of a bad fix. The paranoid Nixon and his paranoid staffers mirrored the fringes of the antiwar left: Each side dreaded attacks from the other, and each used that fear to justify its own embrace of violence. White House aides, even professorial types like Daniel Patrick Moynihan, understandably capitulated to doomsday scenarios involving radicalism—"We have simply got to assume that in the near future there will be terrorist attacks on … members of the Cabinet, the Vice President, and the President himself," he wrote in a memo—while underground papers brimmed with blasts at Nixon, the FBI, and the police for employing force as a routine instrument of political repression. Violence was as American as apple pie—on this much H. Rap Brown and Nixon could agree.

The Assassination of Richard Nixon only touches on the period's broader political climate, as when, in one sequence, Bicke flirts with joining the Black Panthers as an outlet for his frustration. But if the film's politics are underdeveloped, Penn's portrayal of Bicke is rewarding as a study in psychology. Penn makes palpable the loneliness and desperation that overcome Bicke as his life falls apart, making violence an appealing option. In one dark scene, Bicke readies himself for what he deludedly hopes will be his history-making feat by breaking into his ex-wife's house, finding his trusty old golden retriever—at this point his sole remaining friend—and shooting him.

In Nixon at the Movies, Feeney describes how the 1999 comedy Dick underscores Nixon's friendlessness by showing him unable to get his own dog's name right. (He calls King Timahoe "Checkers.") Recalling Harry Truman's quip that those who want a friend in Washington should get a dog, Feeney observes that Nixon couldn't even find companionship in man's time-honored best friend. Nixon, he writes, "was alone, so alone." Sam Bicke is just as lonely but far less powerful. Like a lot of Nixon movies, Assassination relies too much on easy symbolism. Yet in Sam Bicke, a wretched and deranged furniture salesman, Mueller has nonetheless found a fitting metaphor for Richard Nixon's America, a landscape of isolation, violence, and encroaching despair.

When Nixon himself does appear in movies, as Feeney notes, it's usually as a sight gag (a poster of Nixon bowling in The Big Lebowski) or a punch line ("Whenever he used to leave the White House, the Secret Service counted the silverware," says Woody Allen in Sleeper)—or, alternatively, as a stock symbol for political corruption (as in Shampoo), criminality (robbers wear Nixon masks in Best Seller and Point Break), or the crimes of the American system (in Missing). The instant associations that Nixon triggers are so numerous and rich that an intelligent and subtle film like The Ice Storm can use him to invite a range of readings—to betoken debased authority in a universe where parents have affairs while moralizing to their children; to emphasize the characters' inability to be intimate; to underscore the pervasive inauthenticity of the lives of its anomic suburbanites, who have become estranged from their political environment, their families, and even themselves.

David Greenberg writes the "History Lesson" column and teaches at Rutgers University. He is the author of Nixon's Shadow: The History of an Image.

Copyright © 2004 Slate

Why The Hell Not?

If the Terminator can be elected governor, why not the Kinkster? Instead of "I work for a Jewish carpenter," I would rather have a bumper sticker that reads: "My governor is a Jewish cowboy." And that beats hell out of "My governor is a Nazi on steroids." If this is (fair & balanced) fantasy, so be it.

[x Texas Monthly]
Dome Improvement
by Richard ("Kinky") Friedman

As your governor, I’ll tell the truth, give power to young people, and send Californians packing. Have I mentioned my plan to invade Oklahoma?

MANY OF US REMEMBER, in that dim and distant corridor of childhood, a book titled If I Ran the Zoo. What’s that? You don’t remember reading it? Okay. Push pause.

There once was a zoo that some folks liked to call Texas politics. In this zoo were doves and hawks, bulls and bears, crocodiles and two-legged snakes, and lots and lots and lots of sheep. But the ones who ran the zoo were not really animals. They were people dressed up in elephant and donkey suits who’d lined their pockets long ago and now went around lying to everybody and making all the rules. Even as a child, I knew I never wanted to be one of them, a perfunctory political party hack. This did not stop me, of course, from growing up to be a party animal.

Unless you’ve been living in a double-wide deer blind, you know I’m running for governor in 2006. Well, I’m a rather indecisive person, so I’m not entirely sure I’m running yet. I have to weigh the impact the race may have on my family. You may be thinking, “The Kinkster doesn’t have a family.” But that’s not quite right, folks: Texas is my family. And I intend to give Texas a governor who knows how to ride, shoot straight, and tell the truth, a governor as independent-thinking and as colorful as the state itself.

By running as an independent, I plan to demonstrate that even if the governor doesn’t really do any heavy lifting, he can still do some spiritual lifting. There’s a place above politics that has nothing to do with bureaucracy, where good things can get done by an outsider who is in time and in tune with the music flowing from that old, beautiful instrument: the voice of the people. Unfortunately, where I come from, that instrument is an accordion.

Being independent is what Texas is all about. Here, someone running from the outside may, in fact, have more of a chance than elsewhere of being taken seriously. In Minnesota, few people took Jesse Ventura seriously until, in the wink of an eye, he put them in a reverse figure-four leg lock. His confrontational style, however, did not serve him well, and he lasted only one term. He never figured out that wrestling is real and politics is fixed.

Arnold Schwarzenegger is another story. Even far into his campaign, he was written off by a good part of the electorate. Then something happened to change all that: The people of California sensed that the world was watching. (They were right. We were watching Scott Peterson.) Now they’re talking about running Arnold for president. Many Texans are taking note, thinking that if they can get rid of politicians in California, maybe we can get rid of Californians in Texas.

So how does an independent candidate get taken seriously, particularly one who believes that humor is one of the best ways of getting to the truth? And the truth is, if we don’t watch out, Guam is going to pass us in funding of public education. Beyond its obsession with shaking down lobbyists, the Texas Legislature has proved that it is neither a visionary nor an efficient institution. The Fraternal Order of the Bulimic Moose could probably do a better job. A good spay-and-neuter program may be the answer. As my father always said, “Treat children like adults and adults like children.” But for God’s sake, whether you have to go around them or over them, let’s get something done, even if it means invading Oklahoma so we can move up to number 48 in affordability of health care.

Here is where the spiritual lifting comes in. Though the governor of Texas holds a largely ceremonial position, he must be able to inspire people, especially young people, to become more involved in the welfare of our state. As I drive around in my Yom Kippur Clipper—which sports a bumper sticker that reads “My Governor Is a Jewish Cowboy”—I am constantly impressed by the young people I run into, sometimes literally.

Take the fellow I met recently. He was a grocery clerk in Del Rio, and he seemed quite bright. As I was talking to him, a customer walked up and asked for half a head of lettuce. The clerk said he’d have to check with the manager, and he walked to the back of the store. Unnoticed by the clerk, the customer walked back there too, just in time to hear him say to the manager, “Some a—hole wants to buy half a head of lettuce.” The clerk, suddenly seeing the customer standing next to him, then turned and said, “But this kind gentleman has offered to buy the other half.” Later, the manager, complimenting the clerk on his fast thinking, told him that a large Canadian chain was buying the store and suggested he might climb the ladder quickly. “Everyone in Canada,” responded the clerk, “is either a hooker or a hockey player.”

“Just a minute, young man,” said the manager. “My wife is from Canada.”

“No kiddin’,” said the clerk. “Who’s she play for?”

Young people like this, I believe, can be an inspiration to us all. When I’m governor, many of them will be running the place, and no doubt I’ll be running after many of them. Everyone knows we’re not going to get any action or inspiration from career politicians. They’re so busy holding on to their power they never have time to send the elevator back down.

If my petition drive to get on the ballot in March 2006 is successful, I’ll become the first independent candidate to run for governor since they dragged a heavily monstered Sam Houston out from under the bridge. At the very least, I plan to be a candidate people can vote for rather than against. Speaking of which, I was showing a friend around Austin recently, and he was very impressed with what he saw. “That’s a beautiful statue of Rick Perry you all put up,” he said.

“That is Rick Perry,” I said.

Richard ("Kinky") Friedman is a polymath: singer, songwriter, novelist, humorist, and gubernatorial candidate.

Copyright © 2004 Texas Monthly




The Emperor Has No Brain!

This is the guy who proclaims that Social Security is broken? The village of Crawford, TX, is missing an idiot (most of the time). Weisberg publishes verifiable W quotes, not hearsay. If this is (fair & balanced) buffoonery, so be it.

[x Slate]
Bushism of the Day
By Jacob Weisberg

"It's a time of sorrow and sadness when we lose a loss of life."—Washington, D.C., Dec. 21, 2004

Jacob Weisberg is the editor of Slate magazine and of three previous editions of Bushisms.

Copyright © 2004 Slate



Breaking Up Is Hard To Do

Tom Teepen tells it like it is about the Social Security crisis. W's Chicken Little impersonation is galling. As if he has to worry about living in retirement. If this is (fair & balanced) quackery, so be it.

[x Cox Newspapers]
Social Security ain't broke; why are they trying to fix it?
by Tom Teepen

If it ain't broke, don't fix it.

And all the alarms to the contrary aside, Social Security ain't broke. True, within a dozen or so years it will have to tap its surplus to make up the difference between what it owes retirees and what current payroll taxes bring in, and by about 2042 it will have to start fudging benefits if it is to stay solvent.

This prospect plainly calls for preventive action, but it does not present a catastrophe that demands, in effect, beginning to dismantle the system altogether.

And don't be misled. Dismantling it in the long run is the real agenda behind President Bush's push to bankroll personal, private investments with part of the Social Security taxes. The resulting Nirvana that we are encouraged to imagine has today's younger workers retaining Social Security's fail-safe benefits while icing the cake with swag from their years in the stock market, a get-rich-slowly scheme born of an ideological animosity, not of necessity.

The political quarters drumming this plan are the current version of the same politics that denounced Social Security originally in the 1930s and have lived with it all these years only with gritted teeth and quiet fuming.

The game now is to misuse the accumulating stress on the system, which is indeed real, to create fear of a total collapse and to sell younger workers on the notion that they somehow should have a right to make investments for their personal gain, which they are led to fancy would be cannier and more remunerative than investments the government makes.

But Social Security is not an investment scheme and was never meant to be. It is a social insurance program, and as such it has succeeded beautifully. Where poverty had been the common condition of old age, it is now a relative rarity, and Social Security operates with awesome efficiency, with only about 1 percent overhead. The real case the right has against Social Security is not that it has failed but that it has worked. The right just hates that.

Social Security can continue working, but substantial adjustments are necessary. We need to move the retirement age up incrementally to reflect modern longevity. Incrementally, too, payroll taxes should be extended to higher earnings. And annual Social Security increases should be pegged to an index that would slow them.

Painful, to be sure, but there is no freebie at hand. Bush's radical proposal would have the government borrowing one to two trillion dollars to float the transition — and that, of course, is in addition to the deep deficits he has already run up and that he is using to begin cutting federal support for education, scientific research, environmental protection and other productive investments in our future.

Privatizing Social Security would turn retirement into a crapshoot, with the brokerage houses the only sure winners. A cohort with the bad luck to be retiring in a recession would face a financially cramped old age or would have to turn to the government in hopes of being rescued by federal revenues after all.

Social Security has confronted shortfalls in the past and each has been forestalled by adjusting the system rather than by overthrowing it. A truly conservative president would learn from that past rather than dismissing it.

Tom Teepen writes an editorial page column for Cox News Service twice a week. He is also a contributing columnist to the publication Liberal Opinion Week. He was editorial page editor of Cox Newspapers' Atlanta Constitution for 10 years until his move to Cox News Service in 1992. Teepen had earlier been editorial page editor of Cox's Dayton Daily News.

Copyright © 2004 Cox Newspapers



Monday, December 27, 2004

'Twas The Night Before Kwanzaa

Huey Freeman is a black nationalist in "Boondocks." However, Huey does not buy into Kwanzaa. In fact, no member of Aaron McGruder's "Boondocks" family buys into Kwanzaa. Neither does McGruder. If this is (fair & balanced) dubiety, so be it.


Aaron McGruder's take on Kwanzaa in "Boondocks": Granddad, Riley, and Huey Freeman at home in Woodcrest (a fictitious Chicago suburb).
Copyright © 2004 Aaron McGruder


[x Slate]
Kwanzaa
By Carol M. Beach

"I'm still having trouble with Hanukkah," a Texaco executive says on that controversial tape recording. "Now we have Kwanzaa." Although his expression of concern may have been extreme, that executive is not the only American confused about Kwanzaa. Each year we hear more and more about this holiday. What is it, and where did it come from?

Kwanzaa is a holiday honoring African-American heritage and culture. Celebrated from Dec. 26 through Jan. 1, the holiday was created after the 1965 Watts riot by Maulana Ron Karenga, a graduate student and black nationalist, who observed that black Americans had no holiday of their own. In the 1960s, Karenga feuded openly with other black leaders, and some of his followers were convicted in a plot to assassinate members of the Black Panthers. In the 1970s, Karenga himself was imprisoned for ordering and directing the torture of a young woman. Now 54 years old, he is chair of Black Studies at California State University in Long Beach. Last year, apparently rehabilitated in the eyes of many African-American leaders, Karenga served on the national executive committee for the Million Man March.

Karenga's early attempts to popularize the holiday were directed at a relatively small group of activists. Beginning in the late 1970s, however, Kwanzaa began to attract coverage in the mainstream press. The attention grew as Karenga modified his rhetoric to appeal to a broader audience, and as interest in multiculturalism burgeoned in the late 1980s. Kwanzaa was taken up by many in the expanding black middle class, whose buying power has supported such marketing ventures as the "Kwanzaa Expos," convention-center gatherings at which Afrocentric goods and art are sold.

Karenga took the holiday's name from the Swahili phrase matunda ya kwanza, meaning "first fruits." Swahili words were chosen because the language, a hybrid of Arabic and Bantu tongues, is tied to no particular African tribe. Although there are no directly analogous African holidays, Karenga drew his inspiration from various African harvest festivals. From these he extracted seven principles--unity, self-determination, collective work, cooperative economics, purpose, creativity, and faith. Each of the holiday's seven days is meant to symbolize one of these principles.

The central Kwanzaa ritual is candle lighting. First, the mkeka (straw mat) is placed on the table, along with the kinara (candleholder) and the mishumaa saba (seven candles). The three candles on the right in the kinara are red, symbolizing the blood of the African people; the three on the left are green, symbolizing the hope of new life; and the black candle in the center represents the African people. Around the candles are placed the mazao (fruits), the vibunzi (an ear of corn for each child in the family), the zawadi (gifts, preferably handmade), and the kikombe cha umoja (cup) for shared juice or water.

On each day of the celebration, a child lights the appropriate candle, and the principle for that day is discussed. The highlight is the karamu, or feast, on Kuumba, Dec. 31. It celebrates creativity, and is "an opportunity for a confetti storm of cultural expression: dance and music, readings, remembrances," according to Eric Copage, author of Kwanzaa: An African-American Celebration of Culture and Cooking. The food can be highly symbolic. Angela Shelf Medearis, author of a Kwanzaa cookbook, recommends the following karamu menu: Jambalaya Salad, Moroccan Honey Chicken, New-Style Collard Greens, and Fruits of Africa Pie.

How widely celebrated is Kwanzaa? In Karenga's own words, "It's widespread but not mainstream." Some enthusiasts, such as writer Linn Washington Jr., claim that Kwanzaa has as many as 13 million celebrants, and the Detroit News reports that researchers estimate the Kwanzaa-related market at $500 million annually. Despite Karenga's original intention, Kwanzaa takes on a more commercial flavor every year. Hallmark makes Kwanzaa cards, and there are Kwanzaa posters, books, CDs, and mass-produced kinaras. But Kwanzaa was not intended to replace Christmas, and many African-American families celebrate both holidays.

And Hanukkah? It too, if not invented in the United States, has taken on a different shape and gained importance here. Although Hanukkah has been on the Jewish calendar for more than two millennia, it was, until recently, a relatively minor holiday. The pressures of Christmas, however, have elevated Hanukkah for many Jewish families to eight days of celebration and gift-giving. (A new book maintains that even Christmas is a trumped-up holiday. See "Summary Judgment.")

Hanukkah commemorates the victory in 165 B.C. of a small band of Jews, led by Judas Maccabaeus, over the Greeks who ruled Palestine at the time. The Jews reclaimed the Temple from the Greeks, and rededicated it as their place of worship. But they had only one day's supply of oil to light the flame, which was supposed to burn constantly. Miraculously, the flame burned for eight days and nights, until the oil supply was replenished. Following the rebellion, the kingdom of Israel was restored for 200 years.

Some Kwanzaa rituals, most notably the focus on candles, seem to have been borrowed from Hanukkah. The center of the eight-day Hanukkah celebration is the nightly family gathering to light candles in a candelabra known as the "menorah." Oily foods, particularly latkes (potato pancakes), are served during dinner to symbolize the Temple miracle. By tradition, family members play with the dreidel, a four-sided spinning top, and children receive Hanukkah gelt (chocolate coins covered with gold foil) and other presents.

Unlike Kwanzaa, Hanukkah enjoys no agreed-upon spelling in English. The most common variant is "Chanukah," reflecting the proper pronunciation of the opening consonant, which is like the "ch" in "Bach." The spelling employed in this article is from The Associated Press Stylebook and Libel Manual, Slate's guide in such matters.

References:

If you'd like to know more about Kwanzaa, you can read The Complete Kwanzaa: Celebrating Our Cultural Harvest, by Dorothy Winbush Riley; A Kwanzaa Keepsake: Celebrating the Holiday With New Traditions and Feasts, by Jessica B. Harris; or Merry Christmas, Baby: A Christmas and Kwanzaa Treasury, edited by Felix H. Liddell and Paula L. Woods. Karenga, Kwanzaa's creator, has also written two books on the celebration, Kwanzaa: Origin, Concepts, Practice and The African American Celebration of Kwanzaa: A Celebration of Family, Community & Culture. And Anna Day Wilde describes how the holiday gained popularity in "Mainstreaming Kwanzaa," in Public Interest, No. 119, Spring 1995.

Carol Beach produces "The Diane Rehm Show" for WAMU-FM (88.5) in Washington, D.C., and National Public Radio.

Copyright © 1996 Slate