Monday, January 31, 2005

The Real Radical Is Dub, Not FDR!

Why is Dub taking on Social Security? He's trying to finish what Dutch didn't get done in the 1980s. Like Barry Goldwater 20 years before him, Dutch wanted to undo the New Deal. Now, Dub is trying to sell hysteria about the Social Security crisis. No one has ever gone broke by underestimating the intelligence of the public in the United States. The State of the Union Address ought to be interesting. We'll hear a gloat over the election in Iraq and then we'll hear another cry of wolf about Social Security. Dub didn't serve in 'Nam, but he will tell us that we need to destroy Social Security to save it. If this is (fair & balanced) demagogy, so be it.

[x Slate]
He's Still "That Man": The Bushies' war on Franklin Roosevelt.
By Daniel Gross

Why are today's Republicans so hellbent on changing Social Security? Clearly they're not driven by concern over government deficits. After all, they've engineered a taxing and spending regime that intentionally created record deficits. And it can't be that they oppose entitlement programs as a matter of principle. Medicare has an unfunded liability larger than Social Security's, and they just expanded it a couple of years ago with the prescription drug benefit.

Maybe it's because Social Security is an opportunity to refight—and perhaps win—a series of arguments the Republicans lost badly 70 years ago. To put it another way, it's a chance to knock down Franklin Roosevelt, finally. "For the first time in six decades, the Social Security battle is one we can win," Peter Wehner, Bush's director of strategic initiatives, wrote in a memo to supporters in early January. In a column advocating the dismantling of Social Security, George Melloan of the Wall Street Journal editorial page last week wrote that "The Social Security Act of 1935 was the worthy achievement of the New Deal—almost the only one of any permanence—that gave relief to a Depression-battered nation." (In an interview, Melloan said he's aware that the Securities and Exchange Commission, the Federal Deposit Insurance Corp., the Tennessee Valley Authority, and Triborough Bridge are all New Deal products of permanence as well.) The Cato Institute, which has been leading the charge against Social Security, includes among its many distinguished fellows Jim Powell, author of the deeply ahistoric history FDR's Folly: How Roosevelt and His New Deal Prolonged the Great Depression.

Dead going on 60 years, FDR still makes self-styled champions of American-style capitalism fulminate, much the same way their counterparts in the 1930s raged against "That Man." Why? The New Deal era reminds national greatness Republicans like Wehner of their party's futility in a time of true national greatness. I also suspect that many Republicans are simply unable to forgive Roosevelt for what may have been his greatest and longest-lasting achievement: saving American capitalism through regulation. And since they can't tear down the Triborough Bridge or the Hoover Dam, these guys act out by going after Social Security.

The economy FDR inherited in March 1933, delivered to him by 12 years of Republican laissez-faire rule, was a shambles. Gross domestic product fell by more than a quarter between 1929 and 1933. "One out of every four American workers lacked a job," writes Arthur M. Schlesinger Jr. in his magisterial The Crisis of the Old Order. "Hunger marchers, pinched and bitter, were parading cold streets in New York and Chicago." Only a small percentage of the unemployed received relief. Americans suffered a degree of long-term financial distress that is almost unimaginable.

The distinguishing features of America's economic system—its capital markets and credit system—had broken down. The Dow Jones industrial average fell 90 percent from its 1929 peak. Meanwhile, Senate hearings led by Ferdinand Pecora were exposing rampant abuses. The New York Stock Exchange resisted any reform. "The Exchange is a perfect institution," said Richard Whitney, the head of the NYSE, who would be arrested in 1938 for embezzlement.

The U.S. model of market capitalism combined with democratically elected representative government was under assault. The challenge came from all directions. At home, redistributionist Sen. Huey Long and Father Coughlin, whose National Union for Social Justice called for "control of private property for the common good," threatened from the populist left. (Such was the allure of left-of-center ideologies that Theodore Bilbo, the revanchist Mississippian, confessed, "in fact, I'm getting a little pink myself.") On the right, Sen. David A. Reed of Pennsylvania declared: "I do not often envy other countries their governments, but I say that if this country ever needed a Mussolini, it needs one now." Abroad, communism and fascism were ascendant in significant economies: Germany, Russia, Italy, and Japan.

FDR's task was to negotiate a course between illiberal solutions in a time of unprecedented stress. "The Securities Act of 1933 was a conservative response to the economic crisis known as the depression," wrote Joel Seligman in his definitive history of the SEC, The Transformation of Wall Street. It called for issuers of new securities to register them with the government and provide financial statements. The FDIC, created by the Banking Act of 1933, established insurance on deposits up to $2,500. The Social Security Act established an initial 1 percent payroll tax on both employers and employees to fund a guaranteed minimum income for seniors. A national minimum wage of 25 cents an hour was established in 1938.

In each instance, FDR angered critics on the left—and in his own party—who wanted the government to go further. On the right, New Deal opponents—the editors of Forbes, Republican congressmen, the U.S. Chamber of Commerce—continually charged that FDR was turning the United States into Amerika. They routinely engaged in appalling moral equivalency between FDR on the one hand and Stalin and Mussolini on the other. Some of FDR's current critics do the same. Powell's book is larded with quotes like this: "The New Deal was the American version of the collectivity trend that became fashionable around the world, so it perhaps shouldn't be surprising that New Deal utterances by FDR and his advisers sometimes sounded similar to fascist doctrines associated with Italian dictator Benito Mussolini."

The theory that new taxes and regulation would inevitably hamper economic growth and destroy America exerted a powerful hold on the minds of the business establishment and the economic right in the 1930s—just as it does today. FDR's proposals seemed to fly in the face of everything these experts knew about how the economy works. In particular, FDR upended the hallowed equation: taxes and regulation equals tyranny and depression.

But a funny thing happened on the road to serfdom. FDR may have gone too far on occasion. He was great, not perfect. And the consumer-based economy that defines our age emerged only after World War II. But the economy did come back to life. Gross domestic product rose 90 percent between 1933 and 1941. Far from turning the United States into a Western version of the Soviet Union or Nazi Germany, the New Deal allowed the United States to function as the world's bulwark against both. The institutions that stood at the heart of the American experiment—representative democracy, the separation of powers, a system of managerial capitalism, liquid capital markets—survived in a world gone mad.

It's difficult to discern any short-term political gain for Republicans in trying to dismantle Social Security now. So the payoff must be more psychological or intellectual. Now that they indisputably control all three branches of government, Republicans finally have the opportunity to slay some of the liberal demons that have been bedeviling them for so long.

For 70 years, conservatives have been telling us that the American economy—whether it's in recession or whether it's booming—is laboring under the shackles of the burdensome taxation and misguided regulation placed upon it by FDR and his successors. Somehow, stocks would do better if the SEC were weaker and we'd all be wealthier if seniors weren't guaranteed a minimum income, funded through payroll taxes. But America's economic mastery since 1945 has served as an ongoing and constant refutation of their most dearly held beliefs. It still does today. As George Melloan concedes, "The New Deal basically expanded the reach of government, and things worked out OK." Actually, they worked out great. Some people still can't get over it.

Daniel Gross writes Slate's "Moneybox" column.

Copyright © 2005 Slate Magazine

Sunday, January 30, 2005

Shhhhhh! The Kinkster Is An Atheist?

When I encountered Natalie Angier this morning, I had no idea that she would turn up on a Weblist of celebrity non-believers. I knew that Robert G. Ingersoll was the celebrity atheist of the late 19th century and that Clarence Darrow was the celebrity atheist of the early 20th century and that Madalyn Murray O'Hair was the celebrity atheist of the late 20th century. However, I was shocked to see the Kinkster in the F-list of atheists. This is going to make for a difficult gubernatorial (goobernatorial?) campaign in early 21st-century Texas. The Kinkster is not only weird, but he's an atheist? The Republican Right is going to have a field day. If this is (fair & balanced) doubt, so be it.

[x Celebrity Atheists Web Site]
THE ATHEIST AND THE MATERIALIST
Those who have no need for gods and some who have no need for the supernatural

Forrest J. Ackerman
Phillip Adams
Brandy Alexandre
Woody Allen
Shulamit Aloni
Thomas J. Altizer
Natalie Angier
Liv Arnesen
Madison Arnold
Peter William Atkins

Russell Baker
Iain M. Banks
Clive Barker
Dan Barker
MC Paul Barman
Dave Barry
Richard Bartle
Steve Benson
Ingmar Bergman
Björk
Bill Blass
Jim Bohanan
Sir Herman Bondi
Pierre Boulez
T. Coraghessan Boyle
Nathaniel Branden
Marlon Brando
Richard Branson
Rodney Brooks
Andrew Brown
Peter Buck
Warren Buffett
John Byrne

Dean Cameron
George Carlin
John Carmack
Adam Carolla
John Carpenter
Asia Carrera
Fidel Castro
Dick Cavett
Stephen Chapman
Vic Chesnutt
Noam Chomsky
Mohammed Choukri
Chumbawamba
Paul and Patricia Churchland
Alexander Cockburn
John Conway
Alex Cox
Francis Crick
David Cronenberg
David Cross
Alan Cumming
Justin Currie

Ron Dakron
Julia Darling
William B. Davis
Richard Dawkins
Daniel Dennett
David Deutsch
Ani DiFranco
Micky Dolenz
Amanda Donohoe
Roddy Doyle
Paul Draper
Patrick Duffy

Dean Edell
Paul Edwards
Greg Egan
Barbara Ehrenreich
Paul Ehrlich
Albert Ellis
Warren Ellis
Harlan Ellison
Garth Ennis
Brian Eno
Diane Farr
David Feherty
Jules Feiffer
Larry Fessenden
Harvey Fierstein
Nuno Filipe
Filter
Bob Fingerman
Antony Flew
Larry Flynt
Dario Fo
Dave Foley
James Forman
Jodie Foster
John Fowles
Robin Lane Fox
Kinky Friedman

Janeane Garofalo
Bill Gates
Bob Geldof
Jack Germond
Ira Glass
Jean Luc Godard
Al Goldstein
Nadine Gordimer
Greg Graffin
Spalding Gray
Seth Green
Stephen Greenblatt
Rachel Griffiths

Joe Haldeman
Alan Hale
Kathleen Hanna
Harry Harrison
Nina Hartley
Roy Hattersley
James A. Haught
Bill Hayden
Judith Hayes
Nat Hentoff
Katharine Hepburn
Paul Hester
Christopher Hitchens
General Choi Hong-Hi
Nicholas Humphrey
Derek Humphry

Stephan Jenkins
Penn Jillette
Angelina Jolie
Neil Jordan

Joachim Kahl
Jonathan Katz
Kawaljeet Kaur
Ludovic Kennedy
Margot Kidder
Florence King
Neil Kinnock
W. P. Kinsella
Michael Kinsley
Melvin Konner
Frank Kozik
Kramer
Paul Krassner
Milan Kundera
Paul Kurtz

Ring Lardner Jr.
Mr. Lavanam
Richard Leakey
Alexander I. Lebed
Tom Lehrer
Mike Leigh
Stanislaw Lem
Gerda Lerner
Michael Lewis
Tom Leykis
John Lydon

John Malkovich
Barry Manilow
Shirley Manson
Michael Martin
Nick Mason
John McCarthy
Malachy McCourt
Ian McEwan
Todd McFarlane
Montana McGlynn
Sir Ian McKellen
Alexander McQueen
Jonathan Meades
Antonio Mendoza
Tom Metzger
Arthur Miller
Mike Mills
Marvin Minsky
Warren Mitchell
Momus
John Money
Hans Moravec
Max More
Henry Morgentaler
Desmond Morris
James Morrow
John Mortimer
Frank Mullen

Taslima Nasrin
Ramendra Nath
Ted Nelson
Randy Newman
Mike Nichols
Jack Nicholson
Kai Nielsen
Oscar Niemeyer
Robert Nozick
Gary Numan
Ronald Numbers

Bob Odenkirk

Camille Paglia
Andy Partridge
Robert Patrick
Mark Pauline
Leonard Peikoff
Paul Pfalzner
Julia Phillips
Ferdinand Piech
Katha Pollitt
Paula Poundstone
Vladimir Pozner
Terry Pratchett
Paul Provenza

Ted Rall
James Randi
Ron Reagan Jr.
Christopher Reeve
Rick Reynolds
Griff Rhys-Jones
Mordecai Richler
Matt Ridley
Brian Ritchie
Brad Roberts
Chris Robinson
Neil Rogers
Richard Rorty
Arundhati Roy
Jane Rule
Salman Rushdie

Mona Sahlin
Sebastião Salgado
Robert Sapolsky
José Saramago
Pamela Sargent
John Sayles
Eugenie Scott
Captain Sensible
Nick Seymour
Robert I. Sherman
Michael Shermer
Claude Simon
Slayer
J.J.C. Smart
George H. Smith
Robert Smith
Lee Smolin
Steven Soderbergh
Ed Sorel
Annika Sörenstam
George Soros
Richard Stallman
Peter Steele
Bruce Sterling
Howard Stern
J. Michael Straczynski
Ken Stringfellow
Donald Sutherland
Julia Sweeney
Matthew Sweet

Teller
Studs Terkel
Tool
Linus Torvalds
Ted Turner

Eddie Vedder
Gore Vidal
Kurt Vonnegut Jr.
Sarah Vowell

Matt Wagner
Annika Walter
James Watson
Steven Weinberg
Joss Whedon
Harland Williams
Ian Wilmut
Lewis Wolpert
Steve Wozniak
Bruce Wright

Zarkov
Nick Zedd

Copyright © 2005 Celebrity Atheists

Caveat Lector

I heard Madalyn Murray O'Hair speak at UT-Austin in the early 1970s when I was doing dissertation research. The most famous atheist of her time was witty and articulate. Natalie Angier is even more witty and articulate. Read her speech, delivered to the Ethical Culture Society on a Sunday in January, if you dare. Nothing is unworthy of thoughtful consideration. Angier won me over by quoting Lewis Black in her discourse. If this is (fair & balanced) skepticism, so be it.

[x Center For Inquiry]
Atheism and children
By Natalie Angier

Thank you, and it’s an honor to be speaking here at the Ethical Culture Society, in what I understand to be the Ceremonial Hall. According to my beloved American Heritage Dictionary, ceremonial means “formal or ritual,” and though I don’t go in for terribly many rituals, I did start the holiday season with the ritual viewing of the atheist’s favorite Christmas movie, “Coincidence on 34th Street.”

This is also the time of year, of course, when Jesus invariably screws up and commits some sort of felony. How else to explain why so many people seem to find him in jail?

You see? This is what happens when they flush people like me out of our foxholes. And because I’m here to talk about raising healthy, 100 percent guaranteed god free children, I will happily give full credit for the aforementioned remarks, and all that is to follow, to my eight-year-old daughter, Katherine. Yes, this is an atheist’s idea of responsible parenting. Can you see the horns growing out of the top of my head? Actually, last night I was reading in the New England Journal of Medicine about a condition in which people grow these horn-like projections from the top of their head, benign tumors called cylindromas. And just to show you how ecumenical the condition can be, in some cases, the doctors wrote, the cylindromas may “coalesce to form a hat-like growth, giving rise to the term ‘turban tumor.’”

But seriously. I’m here to talk about why my husband and I are raising our daughter as an atheist. The short, snappy answer is, We don’t believe in god. The longer, self-exculpating answer that is the theme du noir is, We believe it is the right thing to do. First, let me talk a little bit about why I use the term atheist rather than a more pastel-inflected phrase like agnostic or secular humanist, or the latest offering, Bright. Now when it comes to any of the mainstream deities proposed to date, I am absolutely atheistic. I can understand the literary and metaphoric value of any number of characters from mythology and religion. During this last election, we all felt like Sisyphus, we pushed that boulder and pushed and pushed, and we were just about at the top of the mountain, well, you know the rest. Or maybe we were Prometheus, with the vulture forever pecking away at our liver, or Job, or the dry run for the Lazarus bit. Yet however legitimate it may be to view any of our religious books as we would the works of Shakespeare or Henry James , I don’t take them seriously as descriptions of how the universe came to be or how any of us will re-be in some posthumous setting, or what god is or wants or whines about. So I am an unalloyed atheist by the standards of the mainstream sects.

Nevertheless, what of the hugeness of the universe, and of the possibility that there are other universes beyond this one, or even that the universe in some sense desires to know itself, and that we are the I and the eyes of the universe? This idea has philosophical appeal, and it certainly offers me some inspiration, a belief that we have a moral imperative, if you will, to understand the universe to the furthest extent our brains can manage. I was moved recently by a letter I read in “Freethought Today,” published by the Freedom from Religion Foundation. It was a response to some questions by a Navy ensign, from none other than Albert Einstein. “I have repeatedly said that in my opinion the idea of a personal God is a childlike one,” Einstein wrote. But rather than be billed as a “professional atheist,” Einstein added, “I prefer an attitude of humility corresponding to the weakness of our intellectual understanding of nature and of our own being.”

So, yes, of course, humility in the face of cosmic grandeur is always warranted; but let us not forget that Einstein sought to the very end of his long life to honor that grandeur by seeking to understand it, bit by bit, with his weak little intellect. How much better, in my view, is that approach, of humility crossed with an unslakable curiosity to delve the majesties of nature; over the sort of hooey humility that we benighted and defeated “liberals” are supposed to be mastering, that preached by the evangelical superstar John Stott, who, according to David Brooks, does not believe that “truth is something humans are working toward. Instead, Truth has been revealed.” As Stott writes:


"It is because we love Jesus Christ [that] we are determined…to bear witness to his unique glory and absolute sufficiency. In Christ and in the biblical witness to Christ God's revelation is complete; to add any words of our own to his finished work is derogatory to Christ."


Just as Lewis Black said on “The Daily Show” about the proposal that gays should be barred from teaching, “Well, there goes the school play!” so with Stott we can bid the NSF, the NIH, MIT goodbye. Who needs Heisenberg’s uncertainty or Einstein’s relativity when we’ve got two ox, two mules and the nativity?

Oy vey, these are values? These and a subway token won’t get you on the subway.

And so, to me, atheism means what it says – without god or gods, living your life without recourse to a large chiaroscuro of a supreme being to credit or to explain or to excuse. Now I’ll be the proud mother and say that my daughter understands this. A couple of days ago, in preparation for this talk, I was interviewing her, asking her a few questions about how she viewed her heathen heritage. First I asked her if she believed in god. She crinkled up her nose at me like I had mentioned something distasteful, like spinach and liver, or kissing a boy, and said, No! I asked her if she was sorry she’d been raised as an atheist, and she said no, she liked it. I asked why. First, she said, you don’t have to waste Sundays going to pray. Also I’d rather do things myself than have somebody else do them for me. If somebody gets sick, I wouldn’t just pray to god he or she gets better, I would try to buy some medicine for them, to help them get better.

Oh, I liked that answer. I couldn’t help it. This sounded to me like, what do you call it, a value system. She also said that she likes to see things for herself before believing in them. If a friend told me, guess what, I’ve got a flying dog, I’d say, can I see it. Katherine said she has friends who claim they’ve seen god. One of her close friends told her she’s seen bright lights in the middle of the night that she knows were signs from Jesus. So Katherine asked her if she could do a sleepover, to check out the light for herself. Oh, you’d never see it, her friend replied. Only people who believe in god can see it.

As Richard Dawkins has said, “With religion, there’s always an escape clause.”

Admittedly, Katherine is lucky. She lives in a very liberal community, Takoma Park, Maryland, which went 91.8% for Kerry; and a lot of other kids, she told me, share her views about god. A couple of times she’s been told she’s going to go to hell – or, as she phrased it, the opposite of heaven; she’s remarkably curse-averse – but she says she doesn’t care because she doesn’t believe in either destination anyway. But in some places in the United States, it’s extremely tough to be an atheist, even fatal. Last October, in Taylor, Michigan, a former Eagle Scout shot another man to death because, he said, the man was “evil; he was not a believer.” We all know the sort of tolerance they teach in the Boy Scouts and Eagle Scouts of America, of course. No gays allowed – guess you don’t expect them to be very good at pitching tents and tying knots, right? – and no atheists. They kicked out Darrell Lambert, a model scout if there ever was one, because he refused to say he believed in God, remember? At which point, I’m proud to say, my husband, who was a boy scout and an eagle scout and learned many skills as a scout and had earned many patches and badges, decided to send back his eagle scout medal to the Boy Scouts of America; and he wrote a beautiful essay about his decision for the Washington Post. The director of public affairs at the organization sent him an answer, saying, We accept your decision, but we hope that someday, you will come to be more open-minded in your views.

So, what advice do I have for nonbelievers trying to raise their children in a rigidly religious, small town environment? Move.

I kid you not. I went to high school in a small Michigan town, very religious, lots of baptists, also lots of drunk drivers, and believe me, they were the worst four years of my life. Move to a big city in just about any state, or move to a medium-sized city in a blue state, move to Takoma Park, or move to Canada if you can stay awake. Move to a university town. Because there are plenty of secularists out there, oh yes. Sure, we’ve been told repeatedly, we’ve been beaten practically comatose, with the notion that we live in an extremely religious country.

We’ve all read the statistics on how people would elect as president a member of any other oppressed group – a woman, a Jew, a Muslim, even that very same gay person they’d rather not see in their schools and certainly not at the wedding altar – before they’d vote for an atheist. Anywhere from 90 to 95 percent of Americans say they believe in god. But how meaningful are these statistics? Are they any more reliable than the poll result I saw recently, apocryphal I hope, which showed that 55% of American Christians believed Noah to be a relative of Joan of Arc? As John Horgan pointed out in Sunday’s New York Times, a Harvard University study has found that the number of Americans with no religious affiliation has grown sharply over the past 10 years, to as many as 39 million, twice the number of Muslims, Jews, Buddhists, Hindus and Episcopalians combined. Yes, the secularists are out there, but they tend to prefer large cities and other places with an active cultural and intellectual life. Which brings me to why I think raising a child as an atheist, or a committed secularist, is the right thing to do, and should be done without apology, indeed with pride.

I’m a science writer. I’m fond of evidence, and I’m a serious devotee of the scientific method, and the entire scientific enterprise. Let me tell you, scientists as individuals can be as petty, insecure, vain, arrogant and opinionated as the rest of us. The myth of the noble, self-sacrificing scientist should never have been allowed to grow beyond the embryonic stem cell stage, and most scientists will tell you as much. But science as a discipline weeds out most of the bluster and blarmy, because it asks for proof. “One of the first things you learn in science,” one Caltech biologist told me, “is that how you want it to be doesn’t make any difference.” This is a powerful principle, and a very good thing, even a beautiful thing. This is something we should embrace as the best part of ourselves, our willingness to see the world as it is, not as we’re told it is, nor as our confectionary fantasies might wish it to be. Science is also extraordinarily unifying. You go to a great lab or to a scientific meeting, and you will see scientists from around the world, talking to each other and forming international collaborations. This is something we should be proud of, even if we ourselves are not scientists – that our species, our collective minds, our heads knocked together, are capable of making sense of the universe. So to me, this, more than anything, is what being an atheist means, an ongoing devotion to exploration, a giving of pride of place to evidence. And much to my dismay, religion often is at odds with the evidence-based portrait of reality that science has begun, yes, only just begun, fleshing out. The biggest example of this is in the ongoing debate over evolution. This is like Rasputin, or the character from the horror movie Halloween – it refuses to die. The statistics are appalling. This year, according to the Washington Post, some 40 states are dealing with new or ongoing challenges to the teaching of evolution in the schools. Four-fifths of our states. According to a recent CBS poll, 55 percent of Americans believe that god created humans in their present form – and that includes, I’m sorry to say, 47 percent of Kerry voters. Only 13 percent of Americans say that humans evolved from ancestral species, no god involved. Only 13 percent. The evidence that humans evolved from prehominid primates, and they from earlier mammals, and so on back to the first cell on earth some 3.8 billion years ago is incontrovertible, is based on a Himalayan chain’s worth of data. The evidence for divine intervention is, to date, non-existent. Yet here we have people talking about it as though they were discussing whether they prefer chocolate praline ice cream or rocky road, as though it were a matter of taste.

To me, this borders on being, well, unethical. And to me, instilling in my daughter an appreciation for the difference between evidence and opinion is a critical part of childrearing. So when I tell my daughter why I’m an atheist, I explain it is because I see no evidence for a god, a divinity, a big bearded mega-king in the sky. And you know something – she gets that. She got it way back when, and I think once you get it, it’s pretty hard to lose it. People sometimes say to me, jokingly or otherwise, just you wait. She’s going to grow up and join a cult, be a moonie or a jew for jesus. But in fact the data argue against it. The overwhelming majority of people who join cults, more than three-quarters, were raised as one or another type of Christian, including Methodists, Episcopalians, Baptists, the works; and no greater percentage of atheists than in the general population. I’m sure Katherine will figure out a way to drive me nuts some day, but I don’t think the Rahjneeshi route is it.

Ah, but what of values, of learning the difference between right and wrong, good and bad? What about tradition, what about ritual, what about the holidays that children love so much? How will a child learn to be good without religious training? Well, damn. Do you really need formal religion to teach a child to be good, to be honest, to try not to hurt other people’s feelings, to care about something other than yourself? These are all variants on the golden rule, and there is nothing more powerful, in my experience, than sitting down with your kid and saying, how would you feel if somebody did that to you? There is a growing body of scientific research that demonstrates we are by nature inclined to cooperate, to trust others, even strangers, to an extraordinary degree. Even strangers we can’t see, over the internet, and even strangers that we’ll never meet again. None of this owes anything to the ten commandments. Which of those commandments tell you to help a stranger who looks lost, or jump into a river to help save a drowning kid, or donate blood, maybe even a kidney or a slice of liver? Sure, people also do terrible things, scam you, betray you, steal from you, on and on. But sheesh, Rush Limbaugh was and for all I know still is a junkie, and priests abuse choir boys, and on and on.

I’ve talked to Katherine about the struggles we all go through, a desire to hurt others, to get revenge. She wrote a book report recently in which she talked about wanting to get revenge on people who do bad things to her, but that, alas, it’s not always easy. And when I saw that, whoa, we had a mother of a conversation. About how the two most powerful human impulses are love and revenge, and how one is a great strength that we should nurture, and the other one is a natural feeling, and we all have it, but we must fight against it with everything we can muster. Because when we don’t, we get wars, wars that can go on for years, for centuries, and we reviewed the story of Romeo and Juliet, which she loves, and that got to her, I think, that made it come alive.

And as one who believes strongly in peace, I’ve taken her on march after march, before the Iraq war, during the republican convention. I had her miss her first day of third grade this year, so she could participate in a ceremony downtown, the reading of the names of people who have died in the Iraq war. She read the names of the children. I know I’m sounding pious here, and I’m sorry about that, but these are just some of the examples of things I’ve tried to do to make her a good person, to give her a sense of meaning larger than herself. And yes, we celebrate the holidays. We buy and decorate a Christmas tree, light the menorah, our house is encrusted with lights, including a big peace sign. I’ve told Katherine about how Christmas predates jesus, and how people have long felt the need, in the darkest, coldest time of the year, to battle the blackness with lights, music, family, the evergreen tree to symbolize life, and, oh, yes, presents. None of this seems like hypocrisy to me. It’s common sense. It is magic, it is ours, and godness has nothing to do with it.

I’d like to make one final point, an admission of the biggest challenge we faced when we decided to go the godfree route: what to talk about when you talk about death. For a while, Katherine was terrified about death. We’d be driving along in the car, and all of a sudden she’d start screaming in the back seat. What’s wrong, what’s wrong? we’d ask, thinking we had to pull over for a medical emergency. I’ve just been thinking about death! she’d cry. I don’t want to just disappear! To die forever and that’s all, that’s the end. This happened a few times, each time, out of nowhere, she’d start to wail. We’d tell her whatever we could to comfort her, that she will live a long, long time, and that they’re inventing new drugs that will, by the time she grows up, help her live even longer, a couple of hundred years, who knows; she’d live until she was pig-sick of it. And we’d tell her that nothing really disappears, it just changes form, and that she could become part of a dolphin, or an eagle, or a cheetah, a praying mantis. She’d have none of it. She knew she wouldn’t be aware of her new incarnation. She knew she probably wouldn’t remember her life as Katherine, and that loss of self she found impossibly sad. As do I, the loss of her, the loss of myself. As do all of us. Learning how to die is one of the greatest tasks of life, and it’s one that most of us never quite get the hang of, until we realize, whoops, not much of a trick here, is there. Not much of a choice, either.

Still, I didn’t go with the stories, of the angels, of the harps, the eternal reciting of that old Monty Python routine, o lord you are so big, so absolutely huge. We’re all really impressed down here, Lord, I can tell you that. And lately Katherine seems to have gotten past those terror jags. She hasn’t had an outburst for the past year or two.

I don’t know the answer to fear of death, surprise surprise. But I find it interesting that religious people, who talk ceaselessly of finding in their religion a larger sense of purpose, a meaning greater than themselves, at the same time are the ones who insist their personal, copyrighted souls, presumably with their 70-odd years of memory intact, will survive in perpetuity. Maybe that’s the real ethic of atheism. By confronting the inevitability of your personal expiration date, you know there is a meaning much grander than yourself. The river of life will go on, as it has for nearly 4 billion years on our planet, and who knows for how long and how abundantly on others. Matter is neither created nor destroyed, and we, as matter, will always matter, and the universe will forever be our home.

Natalie Angier has been a science reporter at the New York Times since 1990. Angier specializes in what she calls "conceptual breakthrough stories." She follows professional science journals closely until she sees the same subject mentioned in four or five different studies. "Then I go to work," she says. Angier has won both Pulitzer and American Association for the Advancement of Science-Westinghouse Writing prizes.


Copyright © 2005 Center For Inquiry

Saturday, January 29, 2005

Another Brick In The Wall

Professor James B. Twitchell teaches at the University of Florida, an Enormous State University. Every ESU looks much the same: football factory, recipient of endless federal research grants, huge enrollment, and huge undergraduate classes. Twitchell creates a metaphor for the ESU: the university as department store. I labored for three decades and change in a discount store. The business terminology (of the big stores) was still rampant at the cut-rate level: mission statements, strategic plans, enrollment management, and on and on. A discount store mimics the top-line department store, but it's still a tacky place. The Collegium Excellens had no alma mater, no iconic symbols, no sense of community. It is one of the great ironies of higher education that the so-called community colleges have no sense of community. Two-year college campuses are ghost towns in the afternoon. The emptiness is worse on weekends. Choose your poison: football factory or discount outlet. If this is (fair & balanced) alienation, so be it.

[x Wilson Quarterly]
Higher Ed, Inc.
by James B. Twitchell

In the early afternoon of December 2, 1964, Mario Savio took off his shoes and climbed onto the hood of a car. Savio was a junior majoring in philosophy at the University of California, Berkeley, and he was upset that the administration of the university had arrested a handful of students and forbidden student groups to set up tables promoting various political and social causes. So he put himself “upon the gears” of the machine:

If this is a firm, and if the Board of Regents are the board of directors, and if President Kerr in fact is the manager, then I’ll tell you something: The faculty are a bunch of employees, and we’re the raw material! But we’re a bunch of raw material[s] that don’t mean to have any process upon us, don’t mean to be made into any product, don’t mean to end up being bought by some clients of the university, be they the government, be they industry, be they organized labor, be they anyone! We’re human beings!

In the four decades since Savio’s expression of defiance, Higher Ed, Inc., has become a huge business indeed. And as is typical of absorbent capitalism, it does not deny its struggles so much as market them. Mario Savio died in 1996. To honor his activism and insight, the academic senate at Berkeley agreed to name a set of steps in Sproul Plaza, the site of many political speeches, the Savio Steps. In an interesting bit of corporate assimilation, Savio became a lasting part of his own observations: He himself got branded.

Although Mario Savio didn’t mention it, the success story of Higher Ed, Inc., is based foursquare on the very transformation that allowed him access to Berkeley. For each generation since World War II, the doors to higher education have opened wider. Unquestionably, university education is the key component in a meritocracy, the sine qua non of an open market. A university degree is the stamp that says—whether it’s true or not—this kid is educated, qualified, smart. The more prestigious the university, in theory, the smarter the kid. And increased access to university life has succeeded beyond anyone’s wildest expectation. In fact, the current dilemma is the price of success. There are too many seats, too much supply, and not enough Marios. The boom is over. Now the marketing begins.

Counting everything but its huge endowment holdings, Higher Ed, Inc., is a $250 to $270 billion business—bigger than religion, much bigger than art. And though no one in the business will openly admit it, getting into college is a cinch. The problem, of course, is that too many students want to get into the same handful of nameplate colleges, making it seem that the entire market is tight. It most certainly is not. Here’s the crucial statistic: There are about 2,500 four-year colleges in this country, and only about 100 of them refuse more applicants than they accept. Most schools accept 80 percent or more of those who apply. It’s the rare student who can’t get in somewhere.

The explosive growth of Higher Ed, Inc., is evident in increasing enrollments, new construction, expanding statewide university systems, more federal monies, and changes in the professoriate. In the 1950 census, for example, there were 190,000 faculty members. A decade later, shortly before Savio took to the hood of the car, there were 281,000. In 1970, when I entered the ranks, there were 532,000, and in 1998, the latest year for which figures are available from the U.S. Department of Education, some 1,074,000. And remember, what distinguishes the academic world is a lifetime hold on employment. About 70 percent of today’s faculty have tenured or tenure-track jobs. Even ministers get furloughed. Museum directors get canned. But make it through the tenure process, and you’re set forever.

At the turn of the 20th century, one percent of high school graduates attended college; that figure is now close to 70 percent. This is an industry that produces a yearly revenue flow more than six times the revenue generated by the steel industry. Woe to the state without a special funding program (with the word merit in it) that assures middle-class kids who graduate in the upper half of their high school class a pass to State U. College has become what high school used to be, and thanks to grade inflation, it’s almost impossible to flunk out.

If real estate’s motto is “location, location, location,” higher education’s is “enrollment, enrollment, enrollment.” College enrollment hit a record level of 14.5 million in fall 1998, fell off slightly, and then reached a new high of 15.3 million in 2000. How did this happen, when the qualified applicant pool remained relatively stable? Despite decreases in the traditional college-age population during the 1980s and early 1990s, total enrollment increased because of the high enrollment rate of students who previously had been excluded. What has really helped Higher Ed, Inc., is its ability to open up new markets. Although affirmative action was certainly part of court-mandated fair play, it was also a godsend. It insulated higher education from the market shocks suffered by other cultural institutions. In addition, universities have been able to extend their product line upward, into graduate and professional schools. Another growth market? Foreign students. No one talks about it much, but this market has been profoundly affected by 9/11. Foreign students have stopped coming. There are enough rabbits still in the python that universities haven’t been affected yet. But they will be.

What makes this enrollment explosion interesting from a marketing point of view is that Savio’s observations (“the faculty are a bunch of employees, and we’re the raw material”) have been confirmed. What he didn’t appreciate is that instead of eating up raw material and spitting it out, Higher Ed, Inc., has done something far more interesting. As it has grown, its content has been profoundly changed—dumbed down, some would say. There’s a reason for that. At the undergraduate level, it’s now in the business of delivering consumer satisfaction.

I teach at a large public university, the University of Florida. As I leave the campus to go home, I bike past massive new construction. Here’s what’s being built. On my distant left, the student union is doubling in size: food court, ballrooms, cineplex, bowling alley, three-story hotel, student legal services and bicycle repair (both free), career counseling, and all manner of stuff that used to belong in the mall, including a store half the size of a football field with a floor devoted to selling what is called spiritware (everything you can imagine with the school logo and mascot), an art gallery, video games, an optical store, a travel agency, a frame store, an outdoor outfitter, and a huge aquarium filled with only orange and blue (the school colors) fish. On a normal day some 20,000 patrons pass through the building. The student union is looking eerily like a department store. So is the university.

On my immediate left, I pass the football stadium. One side of it is being torn apart to add a cluster of skyboxes. Skyboxes are a valuable resource, as they are almost pure profit. The state is not paying for them. The athletic department is. They will be rented mainly to corporations to allow their VIPs air-conditioned splendor high above the hoi polloi. The skyboxes have granite countertops, curved ceilings, and express elevators. In a skybox, you watch the football game on television. Better yet, the skyboxes allow what’s forbidden to the groundlings: alcohol. How expensive are these splendid aeries? There are 347 padded 21-inch seats in the Bull Gator Deck. They’ll run you $14,000 a person, and you get only four games in the box. For the other four, you’re in the stands. Don’t worry about doing the math. The boxes are already sold out. I teach in a huge building that looks like the starship Enterprise. It houses classrooms and faculty offices and cost $10 million when it was built a few years ago. These skyboxes and some club seats are coming in at $50 million. Everyone agrees, the skyboxes are a good idea. They’ll make money. Better yet, they’ll build the brand.

Across from the football stadium, at the edge of the campus on my right, is the future of my institution. I pass an enormous new building with a vast atrium of aggressively wasted space. This building houses the headquarters of the University of Florida Foundation. The foundation funnels millions of dollars of private money the state will never know about into and through various parts of the university. I don't complain. No one does. Two decades ago, the foundation gave nothing to the English department; now, about a hundred grand a year comes our way. In front of the foundation, where a statue of some illustrious donor or beloved professor would stand at an elite school, is a bronze statue of the athletic department's trademarked mascots, Albert and Alberta Alligator.

On this side of campus, enrollment, enrollment, enrollment is becoming endowment, endowment, endowment. Americans donate more money to higher education than to any other cause except religion. And Florida, with its millions of retirees looking for “memorial opportunities,” is a cash cow just waiting for the farmer’s gentle hands. The residents of Florida have almost no interest in funding education, especially not K-12 education, which really is in dire shape. But there are wads of money to fund bits and pieces of the campus in exchange for good feelings and occasional naming rights.

American colleges and universities raise about $25 billion a year from private sources. Public universities are new to this game, but they’ve learned that it’s where the action is. Private dollars now account for about 30 percent of the University of Illinois’ annual budget, about 20 percent of Berkeley’s, and about 10 percent of Florida’s. In a sense, tuition-paying undergrads are now the loss leaders in the enterprise. What used to be the knowledge business has become the business of selling an experience, an affiliation, a commodity that can be manufactured, packaged, bought, and sold. Don’t misunderstand. The intellectual work of universities is still going on and has never been stronger. Great creative acts still occur, and discoveries are made. But the experience of higher education, all the accessories, the amenities, the aura, has been commercialized, outsourced, franchised, branded. The professional manager has replaced the professor as the central figure in delivering the goods.

From a branding point of view, what happens in the classroom is beside the point. I mean that literally. The old image of the classroom as fulfillment of the Socratic ideal is no longer even invoked. Higher Ed, Inc., is more like a sawmill. A few years ago, Harvard University started a small department called the Instructional Computing Group, which employs several people to videotape about 30 courses a semester. Although it was intended for students who unavoidably missed class, it soon became a way not to attend class. Any enrolled student could attend on the Web, fast-forwarding through all the dull parts. This is “distance education” from a dorm room, at an advertised $37,928 a year.

Elite schools are no longer in the traditional education business. They are in the sponsored research and edutainment business. What they offer is just one more thing that you shop for, one more thing you consume, one more story you tell and are told. It’s no accident that you hear students talking about how much the degree costs and how much it’s worth. That’s very much how the schools themselves talk as they look for new sources of research or developmental funding. In many schools there’s even a period called shopping around, in which the student attends as many classes as possible looking for a “fit,” almost like channel surfing.

So we do college as we do lunch or do shopping or do church. That's because for most students in the upper-tier schools the real activity is getting in and then continuing on into the professional schools. No one cares what's taught in grades 13–16. How many times have I heard my nonacademic friends complain that there's no coherence in the courses their kids are exposed to? Back in the 1950s, introductory courses used the same textbooks, not just intramurally but extramurally. So Introduction to Writing (freshman English) used the same half-dozen handbooks all across the country. No longer. The writing courses are a free-for-all. Ditto the upper-level courses. Here are some subjects my department covers in what used to be English 101, the vanilla composition course: attitudes toward marriage, business, bestsellers, carnivals, computer games, fashion, horror films, The Simpsons, homophobia, living arrangements, rap music, soap operas, Elvis, sports, theme parks, AIDS, play, and the ever-popular marginalization of this or that group.

But cries that the classroom is being dumbed down or politicized miss the point. Hardly anyone in Higher Ed, Inc., cares about what is taught, because that is not our charge. We are not in the business of transmitting what E. D. Hirsch would call cultural literacy; nor are we in the business of teaching the difference between the right word and the almost right word, as Mark Twain might have thought important. We’re in the business of creating a total environment, delivering an experience, gaining satisfied customers, and applying the “smart” stamp when they head for the exits. The classroom reflects this. Our real business is being transacted elsewhere on campus.

The most far-reaching changes in postsecondary education are not seen on the playing fields or in the classroom or even in the admissions office. They're inside the administration, in an area murkily called development. If you don't believe it, enter the administration building of any school that enrolls more than 10,000 students (the 10 percent of campuses that size or larger now account for a shade less than 50 percent of all students) and ask for the university development office. You'll notice how, on this part of the campus, the carpets are thick, the wainscoting is polished, and the lights are dimmed. Often, the development office has a new name picked up from the corporate model. Sometimes it's hidden inside Public Affairs, or, more commonly, Public Relations. My favorite: University Advancement. The driving force at my university is now the University of Florida Foundation.

Development is both PR and fundraising, the intersection of getting the brand out and the contributions in, and daily it becomes more crucial. That’s because schools like mine have four basic revenue streams: student tuition, research funding, public (state) support, and private giving. The least important is tuition; the most prestigious is external research dollars; the most fickle is state support; and the most remunerative is what passes through the development office. Leaf through The Chronicle of Higher Education, the weekly journal of the industry, and you’ll see how much newsprint is devoted to the comings and goings of development. Consider where the development office is housed on most campuses, often right beside the president’s office, and note how many people it employs.

At many schools, there’s also a buried pipeline that connects the development office with the admissions office. Most academic administrators prefer that it be buried deep, but from time to time someone digs it up. In The Wall Street Journal for February 3, 2003, Daniel Golden reported on how the formal practice of giving preference to students whose parents are wealthy—called “development admits”—has profound implications not just for affirmative action but for the vaunted academic ideal of fair play.

Remember the scene in the third season of The Sopranos when Carmella has a lunch meeting with the dean of Columbia University’s undergraduate school? She thinks the lunch is about her daughter Meadow, but the dean wants a little development money. Carmella listens to his charming patter before being hit with the magic number of $50,000. She goes to Tony, who protests that the Ivy League is extorting them and says he won’t give more than five g’s. But the dean eventually gets his 50 g’s; Tony, the consummate shakedown artist, has met his match.

When enrollments began to escalate in the 1960s, what used to be a pyramid system—with rich, selective schools at the top (read Ivy League and a handful of other elites) and then a gradation downward through increasing supply and decreasing rigor to junior and community college systems at the base—became an hourglass lying on its side. There's now a small bubble of excellent small schools on one side (Ivy League schools qualify as small) that are really indistinguishable, and, on the other, a big bubble of huge schools of varying quality. The most interesting branding is occurring on the small-bubble side, as premier schools vie for dominance, but the process is almost exactly the same, although less intense, for the big suppliers.

Good schools have little interest in the bachelor’s degree. In fact, the better the school, the less important the terminal undergraduate degree. The job of the student is to get in, and the job of the elite school is to get the student out into graduate school. The schools certify students as worthy of further education, in law, medicine, the arts, or business.

Premier schools have to separate their students from the rest of the pack by generating a story about how special they are. We have the smart ones, they say. That’s why they care little about such hot-button issues as grade inflation, teaching quality, student recommendations, or even the curriculum. It’s not in their interest to tarnish the brand by drawing distinctions among their students. These schools essentially let the various tests—LSAT, MCAT, GRE—make the distinctions for them. And, if you notice, they never divulge how well their students do on those tests to the outside world. They have this information, but they keep it to themselves. They’re not stupid; they have to protect the brand for incoming consumers because that’s where they really compete.

In one of the few candid assessments of the branding of Higher Ed, Inc., Robert L. Woodbury, former chancellor of the University of Maine system, noted the folly of the current institutional U.S. News and World Report rankings:

When Consumer Reports rates and compares cars, it measures them on the basis of categories such as performance, safety, reliability, and value. It tries to measure “outputs”—in short, what the car does. U.S. News mostly looks at “inputs” (money spent, class size, test scores of students, degrees held by faculty), rather than assessing what the college or university actually accomplishes for students over the lives of their enrollment. If Consumer Reports functioned like U.S. News, it would rank cars on the amount of steel and plastic used in their construction, the opinions of competing car dealers, the driving skills of customers, the percentage of managers and sales people with MBAs, and the sticker price on the vehicle (the higher, the better).

The emphasis on “inputs” explains why the elite schools aren’t threatened by what others fear: the much-ballyhooed “click” universities, such as the University of Phoenix and Sylvan Learning Systems, Inc., because those schools generate no peer effects. So, too, there’s no threat from corporate universities, such as those put together by Microsoft, Motorola, and Ford, or even from the Open University of England and The Learning Annex. The industrial schools have not yet made their presence felt, though they will. The upper tier on the small side of the hourglass is not threatened by “learning at a distance” or “drive-through schools,” because the elites are not as concerned with learning as they are with maintaining selectivity at the front door and safe passage to still-higher education at the back door.

So what’s it like at the upper end among the deluxe brand-name schools, where Harry Winston competes with Tiffany, where Louis Vuitton elbows Prada, where Lexus dukes it out with Mercedes? In a word, it’s brutal, an academic arms race.

How did the competition become so intense? Until 1991, the Ivy League schools and the Massachusetts Institute of Technology met around a conference table each April to fix financial aid packages for students who had been admitted to more than one school. That year, after the Justice Department sued the schools, accusing them of antitrust violations, the universities agreed to stop the practice. As happened with Major League Baseball after television contracts made the teams rich, bidding pandemonium broke out. Finite number of players + almost infinite cash = market bubble. Here's the staggering result. Over the past three decades, tuition at the most select schools has increased fivefold, nearly double the rate of inflation. Yet precious few students pay the full fare. The war is fought over who gets in and how much they're going to have to be paid to attend.

The fact of the matter is that the cost of tuition has become unimportant in the Ivy League. Like grade inflation, it’s uncontrollable—and hardly anyone in Higher Ed, Inc., really cares. As with other luxury providers, the higher the advertised price, the longer the line. The other nifty irony is that, among elite schools, the more the consumer pays for formal education (or at least is charged), the less of it he or she gets. The mandated class time necessary to qualify for a degree is often less at Stanford than at State U. As a general rule, the better the school, the shorter the week. At many good schools, the weekend starts on Thursday.

Ask almost anyone in the education industry what’s the most overrated brand and they’ll tell you “Harvard.” It’s one of the most timid and derivative schools in the country, yet it has been able to maintain a reputation as the über-brand. Think of any important change in higher education, and you can bet (1) that it didn’t originate at Harvard, and (2) that if it’s central to popular recognition, Harvard now owns it. Why is Harvard synonymous with the ne plus ultra? Not because of what comes out of the place but because of what goes in: namely, the best students, the most contributed money, and, especially, the deepest faith in the brand. Everyone knows that Harvard is the most selective university, with a refusal rate of almost 90 percent. But more important, the school is obscenely rich, with an endowment of almost $20 billion. Remember that number. It’s key to the brand. The endowment is greater than the assets of the Dell computer company, the gross domestic product of Libya, the net worth of all but five of the Forbes 400, or the holdings of every nonprofit in the world except the Roman Catholic Church.

In a marketing sense, the value of the endowment is not monetary but psychological: Any place with that many zeros after the dollar sign has got to be good. The huge endowments of the nameplate schools force other schools, the second-tier schools, to spend themselves into penury. So your gift to Harvard does more harm than good to the general weal of Higher Ed, Inc. It does, however, maintain the Harvard brand.

With the possible exception of Harvard, the best schools are about as interchangeable as the second-tier ones. All premier schools have essentially the same teaching staff, the same student amenities, the same library books, the same wondrous athletic facilities, the same carefully trimmed lawns, the same broadband connection lines in the dorms. Look at the websites for the most selective schools, and you’ll see almost exactly the same images irrespective of place, supposed mission, etc. True, they may attempt to slide in some attention-getting fact (“If you use our library, you may notice our Gutenberg Bible,” or “The nuclear accelerator is buried beneath the butterfly collection”), but by and large the websites are like the soap aisle at Safeway.

If you really want evidence of the indistinguishability of the elites, consider the so-called viewbook, the newest marketing tool sent to prospective applicants. The viewbook is a glossy come-on, bigger than a prospectus and smaller than a catalog, that sets the brand. As with the websites, what you see in almost every view is a never-ending loop of smiling faces of diverse backgrounds, classrooms filled with eager beavers, endless falling leaves in a blue-sky autumn, lush pictures of lacrosse, squash, and rugby (because football, basketball, and baseball are part of the mass-supplier brands), and a collection of students whose interests are just like yours. From a branding point of view, the viewbook is additionally interesting because it illustrates how repeating a claim is the hallmark of undifferentiated producers. Here’s what Nicolaus Mills, an American studies professor at Sarah Lawrence College, found a decade ago, just as the viewbook was starting to become standardized. Every school had the same sort of glossy photographs proving the same claim of diversity:

“Diversity is the hallmark of the Harvard/Radcliffe experience,” the first sentence in the Harvard University register declares. “Diversity is the virtual core of University life,” the University of Michigan bulletin announces. “Diversity is rooted deeply in the liberal arts tradition and is key to our educational philosophy,” Connecticut College insists. “Duke’s 5,800 undergraduates come from regions which are truly diverse,” the Duke University bulletin declares. “Stanford values a class that is both ethnically and economically diverse,” the Stanford University bulletin notes. Brown University says, “When asked to describe the undergraduate life at The College—and particularly their first strongest impression of Brown as freshmen—students consistently bring up the same topic: the diversity of the student body.”

In this kind of marketing, Higher Ed, Inc., is like the crowd in Monty Python’s Life of Brian. Graham Chapman as Brian, the man mistaken for the Messiah, exhorts a crowd of devotees: “Don’t follow me! Don’t follow anyone! Think for yourselves! You are all individuals!” To which the crowd replies in perfect unison, “Yes, Master, we are all individuals. We are all individuals. We are all individuals.”

The elite schools have to produce an entering class that’s not just the best and brightest they can gather, but one that will demonstrate an unbridgeable quality gap between themselves and other schools. They need this entering class because it’s precisely what they will sell to the next crop of consumers. It’s the annuity that gives them financial security. In other words, what makes Higher Ed, Inc., unlike other American industries is that its consumer value is based almost entirely on who is consuming the product. At the point of admissions, the goal is not money. The goal is to publicize who’s getting in. That’s the product. Who sits next to you in class generates value.

So it’s to the advantage of a good school to exploit the appearance of customer merit, not customer need. But how to pay for this competitive largesse if tuition is not the income spigot? At four-year private colleges and universities, fully three-quarters of all undergraduates get aid of some sort. In fact, 44 percent of all “dependent” students, a technical term that refers to young, single undergraduates with annual family incomes of $100,000 or less, get aid. What elite schools lose on tuition they recover elsewhere. Take Williams College, for example. The average school spends about $11,000 a student and takes in $3,500 in tuition and fees; Williams, a superbrand, spends about $75,000 per student and charges, after accounting for scholarships and other items, a net of $22,000. Why? Because Williams figures that to maintain its brand value, to protect its franchise, it can superdiscount fees and make up the difference with the cash that’s to come in the future. In theory, if an elite school could get the right student body, it would be in its best interest to give the product away: no tuition in exchange for the very best students. (That’s a policy not without risk, as Williams found last year when Moody’s lowered its credit rating because the college had dipped too deeply into endowment to fund its extraordinary incoming class.)

How does the brand sensitivity of the elite institutions affect the quality of the educational experience for the rest of us? How dangerous is it that schools follow the corporate model of marketing? The prestige school has other money pots than tuition. Every two weeks, for example, Harvard’s endowment throws off enough cash to cover all undergraduate tuition. But what happens to schools below the privileged top tier? They, too, have to discount their sticker prices to maintain perceived value. So competition at the top essentially raises costs everywhere, though only some schools have pockets deep enough to afford the increase. The escalation in competitive amenities is especially acute in venues where a wannabe school is next to an elite one.

Things get worse the further you move from the top. To get the students it needs to achieve a higher ranking in annual surveys—and thereby draw better students, who boost external giving, which finances new projects, raises salaries, and increases the endowment needed for getting better students, who’ll win the institution a higher national ranking, which . . . etc.—the second-tier school must perpetually treat students as transient consumers.

Really good schools have all those so-called competitive amenities, all those things that attract students but have nothing to do with their oft-stated lofty mission and often get little use—Olympic-quality gyms, Broadway-style theaters, personal trainers, glitzy student unions with movie theaters, and endless playing fields, mostly covered with grass, not athletes. This marketing madness is now occurring among the mass-supplier institutions. So the University of Houston has a $53 million wellness center with a five-story climbing wall; Washington State University has the largest Jacuzzi on the West Coast (it holds 53 students); Ohio State University is building a $140 million complex featuring batting cages, ropes courses, and the now-essential climbing wall; and the University of Southern Mississippi is planning a full-fledged water park. These schools, according to Moody’s, are selling billions of dollars of bonds for construction that has nothing whatsoever to do with education. It’s all about branding.

The commercialization of higher education has had many salutary effects: wider access, the dismantling of discriminatory practices, increased breadth and sophistication in many fields of research, and an intense, often refreshing, concern about customer relations. But consider other consequences for a place such as the University of Florida, which is a typical mass-provider campus. To get the student body we need for a respectable spot in the national rankings, we essentially give the product away. We have no choice. Other states will take our best students if we don’t. Ivy League monies come from endowment and have the promise of being replenished if the school retains its reputation. But state universities are heavily dependent on the largesse of state legislatures, and to keep the money coming they need to be able to boast about their ability to attract the state’s best and brightest. So about half of them have been sucked into simple-minded plans that are essentially a subvention of education for middle-class kids. Everyone admits that most of these kids would go to college anyway. But would they go to the state system? Who wants to find out the hard way?

Mario Savio was right. Before all else, the modern university is a business selling a branded product. “The Age of Money has reshaped the terrain of higher education,” writes David Kirp, of the Goldman School of Public Policy at the University of California, Berkeley. “Gone, except in the rosy reminiscences of retired university presidents, is any commitment to maintaining a community of scholars, an intellectual city on a hill free to engage critically with the conventional wisdom of the day. The hoary call for a ‘marketplace of ideas’ has turned into a double-entendre.”

Administrators and the professoriate have not just allowed this transformation of the academy; they’ve willingly, often gleefully, collaborated in it. The results have not been all bad. But the fact is that we’ve gone from artisanal guild to department store, from gatekeeper to ticket taker, from page turner to video clicker. This commodification, selling out, commercialization, corporatization—whatever you want to call it—is what happens when marketing becomes an end, not a means.

Universities are making money by lending their names to credit card companies, selling their alumni lists, offering their buildings for “naming rights,” and extending their campuses to include retirement communities and graveyards. It’s past time for the participants in Higher Ed, Inc., to recall what Savio said years ago: The university is being industrialized not by outside forces but by internal ones. Rather like the child who, after murdering his parents, asks for leniency because he’s an orphan, universities grown plump feeding at the commercial trough now complain that they’ve been victimized by the market. This contention of victimization is, of course, a central part of the modern Higher Ed, Inc., brand. The next words you’ll hear will be “Please give. We desperately need your support!”

James B. Twitchell is a professor of English and advertising at the University of Florida, Gainesville. He is the author of many books, including most recently Living It Up: America’s Love Affair with Luxury (2002). This essay is drawn from his forthcoming book Branded Nation, to be published by Simon & Schuster. Printed by permission.

Copyright © 2004 Wilson Quarterly

This article may not be resold, reprinted, or redistributed for compensation of any kind without prior written permission from the author.

Friday, January 28, 2005

Auschwitz And Holocaust Denial ("Revisionist History")

It was the 60th anniversary of the liberation of Auschwitz by the Red Army this week. The most prominent Holocaust denier in this country is Pat Buchanan. It is bad enough that Tim Russert and Don Imus both give Doris K. Goodwin, the plagiarist, a pass and feature her prominently on their shows. However, it is even worse that Imus and MSNBC give program time to Pat Buchanan. Whatever her faults, Doris K. Goodwin is neither a Nazi apologist nor a Holocaust denier. That honor belongs to Pat Buchanan. I have long been uneasy with Buchanan's defense of Nazi war criminals. Recently, one of his chief advisers from his ill-fated presidential effort spoke at a major Holocaust denier/revisionist history conference. Birds of a feather. If this is (fair & balanced) guilt by association, so be it.

[x Wikipedia]
Holocaust Revisionism

Holocaust deniers prefer to be called Holocaust revisionists. Most people contend that the latter term is misleading. Historical revisionism is the reexamination of accepted history, with an eye towards updating it with newly discovered, more accurate, and/or less biased information. Broadly, it is the approach that history as it has been traditionally told may not be entirely accurate and should be revised accordingly. Historical revisionism in this sense is a well-accepted and mainstream part of history studies. It may be applied to the Holocaust as well, as new facts emerge and change our understanding of its events.

Holocaust deniers maintain that they apply proper revisionist principles to Holocaust history, and therefore the term Holocaust revisionism is appropriate for their point of view. However, their critics disagree and prefer the term Holocaust denial. Gordon McFee writes in his essay "Why Revisionism isn't" that:



"Revisionists" depart from the conclusion that the Holocaust did not occur and work backwards through the facts to adapt them to that preordained conclusion. Put another way, they reverse the proper methodology [...], thus turning the proper historical method of investigation and analysis on its head."


In general, the term Holocaust denial fits the description at the beginning of this article, while Holocaust revisionism ranges from outright denial to the belief that only minor corrections are required to Holocaust history. However, because the latter term has become associated with Holocaust deniers, mainstream historians today generally avoid using it to describe their own work. Thus Holocaust revisionism has come to be understood as revisionist history, rather than historical revisionism.

Copyright © 2005 Wikipedia


[x Frontpage Magazine]
Pat Buchanan, His Fans, and Anti-Semitism
By Jamie Glazov

MY RECENT REVIEW of Pat Buchanan’s new book, The Death of the West, has triggered some angry letters from Buchanan supporters.

Offended at various remarks that I made, my critics are mostly upset at my implication that Buchanan is a racist. One reader writes to me,

"Your paranoid feelings are coming out. I read Buchanan’s book, The Death of the West, and I do not get out of it any racial feelings."

For a person to read The Death of the West and not "get out of it any racial feelings" is unquestionably quite a feat. This is like spending an entire day hanging around with members of the flat earth society and never getting the hint that something might be a little bit, well, not altogether right.

I have studied Pat Buchanan’s philosophy of life for quite a while. Aside from his anti-communism and Catholicism, both of which I deeply respect, his views on other issues do more than just raise my eyebrows. There is one particular realm of Buchanan’s world vision that troubles me the most. I would like to take this opportunity to offer all the Buchanan supporters a summary of this realm. It will probably serve as a great inspiration to them.

Let’s begin with an illuminating fact: if you read the criticisms of my review in the Go Postal section, you will find that several Buchanan supporters keep accusingly inquiring if I am a Jew. What does this say about them?

Let me give you a clue:

Buchanan wrote a real charming book before The Death of the West. In A Republic, Not An Empire, he denied that Adolf Hitler had any malicious intentions toward the West, let alone toward the Jews living there. He also argued that Hitler was forced into pursuing the Final Solution because of British and American intervention in the war. Buchanan’s implication, in other words, was that Hitler wasn’t really responsible for what he did.


Buchanan has described Hitler as a "genius" and "an individual of great courage, a soldier's soldier in the Great War."

What feelings or beliefs would motivate a person to make such a tribute to Hitler?

Buchanan’s words have always implied that, as long as Hitler entertained designs only on Eastern European Jews for his Final Solution and American interests were not affected, America had no obligation to intervene on purely humane grounds. That’s what Buchanan’s "America First" policy is all about.

I can’t help wondering: what exactly is Buchanan saying about the Holocaust?

Buchanan has also shown an obsessive predilection for defending accused Nazi war criminals, every one of whom somehow appears to be innocent in his eyes.

What rests behind a man’s passion to distinguish himself in this light?

During his infamous defense of John Demjanjuk, Buchanan claimed that Demjanjuk was not the guard he was alleged to be at Treblinka. Buchanan turned out to be right: Demjanjuk was a guard in a different concentration camp.

That no apology from Buchanan was forthcoming on Demjanjuk implied that he believed he had actually won on this issue.

During his defense of Demjanjuk, Buchanan made the intriguing statement that the diesel gas fumes used at Treblinka could not have killed anyone. These diesel gas fumes were used not only at Treblinka, but also at a number of other death camps. Hundreds of thousands of Jews died in these camps. If these victims did not die from diesel gas fumes, then how and why did they die? Would Buchanan be willing to expose his family members, as well as himself, to the same fumes in order to demonstrate his point?

During Ronald Reagan’s presidential visit to the Bitburg cemetery in Germany, Buchanan wrote, for Reagan's controversial speech, that the Germans buried there, who included members of SS units and Nazis who participated in Hitler's extermination of the Jews, were "victims of the Nazis just as surely as the victims in concentration camps."

Fascinating.

Buchanan has also compared the Nazi camps with those set up by Gen. Eisenhower for German prisoners of war. This is a comparison between POWs held because they were enemies in war and a group of people liquidated because of their race.

Buchanan has drawn a parallel between Andrei Sakharov, the great Soviet dissident who was persecuted for, among other things, his courage in standing up for human rights in a totalitarian regime, and Arthur Rudolph, a German rocket scientist who admitted his involvement with slave labor and other atrocities of the Nazi regime.

Why would Buchanan do this?

During the Gulf War, Buchanan charged that the American intervention was caused by a Jewish conspiracy, which consisted of American Jews conspiring with the Israeli Defense Ministry. On other occasions, he has talked about the "Holocaust survivor syndrome," which, in his view, involves "group fantasies of martyrdom and heroics." With these particular interpretations, he put himself in the same league as Holocaust deniers and Holocaust perpetrators by using their favorite vocabulary.

Holocaust deniers consistently talk about the "Jewish conspiracy," that pathological fantasy that involves the Jewish control of the media and the banks, the Jewish assault on culture, the Jewish poisoning of the Aryan race, etc. We've heard this all before: in Mein Kampf and in the terminology of Nazi spokesmen who engineered Auschwitz, Dachau, Buchenwald and, yes, Treblinka.

What is it that possesses a man to use this vocabulary when he knows full well the ugly context in which it has already been used?

After being confronted about the anti-Semitic implications of his words, Buchanan has stated, several times: "I don't retract a single word."

Not a single word? Not even a single one?

Why?

Perhaps Buchanan’s fans can enlighten me.

Jamie Glazov is Frontpage Magazine's managing editor. He holds a Ph.D. in History with a specialty in Soviet Studies. He edited and wrote the introduction to David Horowitz’s new book, Left Illusions. He is also the co-editor (with David Horowitz) of the new book The Hate America Left and the author of Canadian Policy Toward Khrushchev’s Soviet Union (McGill-Queens University Press, 2002) and 15 Tips on How to be a Good Leftist.

Copyright © 2002 Frontpage Magazine


Steal This Copyrighted Stuff!

Abbie Hoffman demonstrated contempt for "the System" by entitling his 1971 book, Steal This Book! The perennial question I am asked about this blog is whether I am violating copyright by posting the work of other writers here without permission. There's an old Texas aphorism: "It's easier to beg forgiveness than to ask for permission." Pre-blog, I was invariably forwarding stuff I read on the Internet by e-mail to long-suffering friends because I thought that the stuff would be interesting, irritating, or both. Along came the blogging movement and I thought, Bada bing! I will post the stuff I was e-mailing to a blog and my friends can go to the blog and read it. That is what I have been doing since June 2003. Google (owner of Blogger.com) allows me to post as much or as little as I like. I can post visual material. I can post audio material. Readers can post comments to this blog (unused to date) or send me e-mail from the blog (rarely used). So, here I am: cyber-outlaw. I do include the copyright of every item posted to this blog. If this is (fair & balanced) thievery, so be it.

[x BookForum]
Righting Copyright: Fair Use and "Digital Environmentalism"
by Robert S. Boynton

Who owns the words you're reading right now? If you're holding a copy of Bookforum in your hands, the law permits you to lend or sell it to whomever you like. If you're reading this article on the Internet, you are allowed to link to it, but are prohibited from duplicating it on your web site or chat room without permission. You are free to make copies of it for teaching purposes, but aren't allowed to sell those copies to your students without permission. A critic who misrepresents my ideas or uses some of my words to attack me in an article of his own is well within his rights to do so. But were I to fashion these pages into a work of collage art and sell it, my customer would be breaking the law if he altered it. Furthermore, were I to set these words to music, I'd receive royalties when it was played on the radio; the band performing it, however, would get nothing. In the end, the copyright to these words belongs to me, and I've given Bookforum the right to publish them. But even my ownership is limited. Unlike a house, which I may pass on to my heirs (and they to theirs), my copyright will expire seventy years after my death, and these words will enter the public domain, where anyone is free to use them. But those doodles you're drawing in the margins of this page? Have no fear: They belong entirely to you.

While it was once believed that Marxism would overhaul notions of ownership, the combination of capitalism and the Internet has transformed our ideas of property to an extent far beyond the dreams of even the most fervent revolutionary. Which is not to say that anything resembling a collectivist utopia has come to pass. Quite the opposite. In fact, the laws regulating property—and intellectual property, in particular—have never before been so complex, onerous, and rigid.

Copyright protection has been growing in fits and starts since the early days of the Republic. In 1790, a copyright lasted for fourteen years and could be renewed once before the work entered the public domain. Between 1831 and 1909, the maximum term was increased from twenty-eight to fifty-six years. It was extended several more times during the twentieth century until 1998, when the Sonny Bono Copyright Term Extension Act added twenty additional years (to both existing and future intellectual property), increasing copyright protection to seventy years after the death of an author.

Some of the most significant changes in intellectual property law took place in the Copyright Act of 1976, after which it was no longer required to register one's work in order to protect it. Anything "fixed in a tangible medium"—e-mail messages, those doodles in the margins of this magazine—automatically became copyrighted. Recent laws—like the 1998 Digital Millennium Copyright Act, which increased protection of copyrighted material on the Internet, and the Sonny Bono Act—have elevated intellectual property's status to such a degree that many courts and corporations often treat it in virtually the same way as they do physical property.

This is a category mistake, and one explicitly forbidden by Article 1, Section 8 of the Constitution, which gives Congress the authority to "promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries." Unlike European law, which centers on the "moral rights" of the author to control his creation, American copyright law has always had the strictly utilitarian goal of providing just enough incentive for someone to create. Copyright is a bargain: The government grants a limited right to profit from your intellectual property in exchange for your agreement to give the public limited access to it during that period (such as the "fair use" right of a teacher to make class copies of an essay), and, eventually, for it to lapse into the public domain.

But as copyright terms lengthened and intellectual property became a larger part of American industry, the logic of incentive has been overshadowed by the logic of reward, the thinking being that if my work continues to have value, why shouldn't I profit from it for as long as I want? "In our tradition, intellectual property is an instrument. It sets the groundwork for a richly creative society but remains subservient to the value of creativity," writes Stanford law professor Lawrence Lessig in his most recent book, Free Culture: How Big Media Uses Technology and the Law to Lock Down Culture and Control Creativity. "Yet the current debate has this turned around. We have become so concerned with protecting the instrument that we are losing sight of the value."

But if we have fallen into what New York University communications professor Siva Vaidhyanathan calls "the property-talk trap," it has had the unintended effect of mobilizing citizens by demonstrating the stake we all have in the debate over how intellectual property should be considered. Once an arcane part of the American legal system, intellectual property law is now at the center of major disputes in the arts, sciences, and politics. People are increasingly aware of the role intellectual property plays in their everyday lives; they bump up against it every time they discover they can't print a passage from an e-book or transfer a song from their computer to their iPod. These days, it is not uncommon to hear people casually conversing about legal concepts like "fair use" and the "first sale doctrine."

Much of this awareness results from the well-publicized lawsuits the Recording Industry Association of America has brought against music downloaders. This is unfortunate, because it has created the impression that those in favor of liberalizing copyright law condone the theft of intellectual property. Leaving aside questions about the appropriate legal remedies for, and the economic implications of, downloading, taking copyrighted material for which one has not paid is simply illegal. The fact that illegal downloading is a mass phenomenon indicates that our intellectual property laws aren't working, in much the same way that the speakeasies of the '20s and '30s pointed out the irrationality of Prohibition. Neither downloading nor drinking, however, was made any more legal by its popularity.

It is in more common—and only marginally illegal—pursuits that ordinary citizens are realizing they have a legitimate stake in the debate over the scope of copyright law. As the price of digital video cameras and editing software plummets, the number of people who sync home movies to music, splice together clips from favorite television shows, and even produce documentaries has soared. TiVo and other digital video recorders have made it possible to trade programs over the broadband Internet connections that are finding their way into homes across the country. Young fathers are practically required to transplant images of their newborns into great works of art by way of Photoshop.

In December 2004, Google announced "Google Print," a project to bring millions of easily searchable, digitized books to the Internet. The project, which has already begun and may take a decade to complete, will further heighten awareness of our vexed relationship to intellectual property. After digitizing the entire holdings of Stanford and the University of Michigan libraries (as well as sections of the libraries of Harvard, Oxford, and the New York Public Library), Google Print will search the texts of these books—although one will only be able to read the entire text of those works whose copyright has lapsed and which are therefore in the public domain. As for copyrighted titles, one will be able to search their text for names and key phrases but won't be allowed to read the books themselves (a function like Amazon's helpful, but similarly limited, "Search inside this book" service). Instead, one will be directed to a library or bookstore where the book can be located.

As amazing an effort as Google Print is (creating nothing less than a virtual "universal library of knowledge"), its logical goal—giving readers full access to the entire contents of that library—will be undercut by our intellectual property laws. It is an inherently unstable situation, and it is only a matter of time before someone (Amazon? Random House?) develops software to link this vast cache of literature to a convenient print-on-demand service (for which the hardware already exists). When it becomes possible to hold an inexpensive, physical copy of one of Google's digitized titles in one's hands—but only if it was first published prior to 1923 and is therefore in the public domain—people will begin to understand the implications of having something so obviously beneficial (universal access to universal knowledge) tethered to laws from another era. Google Print may be the Trojan Horse of the copyright wars.
* * *

While a range of copyright-infringing technologies has been changing the way we interact with our culture, critics of excessive copyright protection have been forging a coalition to demand that the law be brought more in line with the capabilities of these technologies. The challenge is considerable. Individual intellectual property rights are often in conflict with one another, and the only groups with a common interest in the direction of such laws are those corporations who want to lock up culture in perpetuity (or "forever minus a day," as former Motion Picture Association of America head Jack Valenti once suggested). Even following the twists and turns of the debate is difficult, since negotiations are seldom held in public. "This cultural war is almost invisible," writes David Bollier in Brand Name Bullies: The Quest to Own and Control Culture. "It is happening quietly and incrementally—in rulings by distant courts, in hearing rooms on Capitol Hill and obscure federal agencies, in the digital code that Hollywood and record labels surreptitiously implant into DVDs and CDs."

One of the most suggestive responses to this dilemma has come from Duke University law professor James Boyle, who, in his landmark book Shamans, Software and Spleens: Law and the Construction of the Information Society (1996), diagnosed the problem succinctly. "What we have right now is an exponentially expanding intellectual land grab, a land grab that is not only bad but dumb, about which the progressive community is largely silent, the center overly sanguine, and the right wing short-sighted." Boyle's subsequent work is an extended plea that we value the public domain. "Our art, our culture, our science depend on this public domain every bit as much as they depend on intellectual property,'' he writes.

Boyle is one of the founders of "digital environmentalism," the movement that is fashioning a new understanding of what the public domain—the "commons," as Boyle and others have called it—might be. The great achievement of the environmental movement, from which Boyle draws inspiration, was its ability to convince a swath of the population—consumers and industrialists alike—that they all had a stake in this thing called "the environment," rather than just the small patch of land where they lived. Similarly, digital environmentalists are raising our awareness of the intellectual "land" to which people ought to feel entitled.

Digital environmentalism is a two-pronged movement, with one group raising awareness of the cultural stakes of intellectual property among everyday citizens, and the other pressing for legislative and legal change. The difference between the two is one of emphasis, with each participating in the battles of the other. Neither group is anarchist or utopian; rather, both see themselves as conservatives in the traditional sense of the term. "The point is not that copyright and trademark law needs to be overthrown," writes Bollier. "It is that its original goals need to be restored. Individual creators need to be empowered more than ever. The volume and free flow of information and creativity need to be protected. The public's rights of access and use must be honored. We must strike a new balance of private and public interests that takes account of the special dynamics of the Internet and digital technology."

For those in the legal camp, the central event of recent years was Eldred v. Ashcroft, the 2002 Supreme Court case that challenged the constitutionality of the 1998 Sonny Bono Copyright Term Extension Act. Appearing before the court, Lessig argued that perpetually extending the term of copyright violated the Constitution's stipulation that copyright exist for only "a limited time." The court rejected Lessig's position by a vote of seven to two, holding that while the extension was perhaps unwise on policy grounds, it was still within Congress's constitutional authority. A second legal challenge, which Lessig brought in 2004, went nowhere.

Developments on the legislative front have been, if anything, more discouraging. Laws that strengthen copyright and increase penalties for infringement are introduced, and reintroduced, in Congress every year. In 2004, the Induce Act, a bill so broadly drawn that it would have held manufacturers of TiVo and iPods legally responsible if their customers used them for infringing copyright, died in committee, but it is only a matter of time before a similar piece of legislation passes.

The cultural prong of digital environmentalism has had somewhat more success. Represented by writers like Bollier, Vaidhyanathan (Copyrights and Copywrongs: The Rise of Intellectual Property and How It Threatens Creativity and The Anarchist in the Library: How the Clash Between Freedom and Control is Hacking the Real World and Crashing the System), Kembrew McLeod (Freedom of Expression: Overzealous Copyright Bozos and Other Enemies of Creativity), and others, this camp advocates the path of activism and resistance. Working within existing law, they propose that artists and authors aggressively exercise their intellectual property rights in the face of threats and legal challenges from overbearing copyright holders. Bollier, for one, perceives the work of digital environmentalists as benefiting from the momentum generated by legal challenges like Lessig's. "Acts of civil disobedience against the antisocial, personally intrusive claims of copyright law have only grown since the Eldred ruling, in part because of it," he writes.

Their premise is that, like a muscle, intellectual rights grow stronger only when exercised. "For the most part, we don't need any new legislation. Fair use is a great solution, but for it to have any real impact on our culture we need to vigorously and confidently (though not carelessly) employ this legal doctrine in daily life," writes McLeod. The problem, they contend, is less the laws than the lawyers. Lawyers representing copyright holders encourage their clients to limit access to their intellectual property as much as possible. "The lawyers tell us 'You may gaze upon and buy the products of American culture,'" Bollier writes in Brand Name Bullies. "'But don't be so naïve as to think that you can actually use them for your own purposes. We own them.'" And the lawyers representing creators (artists, writers, and filmmakers, for example) who want access to copyrighted material for their work have decided that the transaction cost of boldly exercising fair-use rights is simply too high. Their primary goal is to avoid confrontation, even when they know that the outcome—should the case come to court—would favor their clients. The strategy of the cultural digital environmentalists is twofold. First, they challenge the lawyers at cultural institutions, whether they are book publishers, Internet providers, or movie distributors. Second, they spread the word about how poorly the current intellectual property system balances the rights of individuals and society.

This tactic has given birth to the genre of the "copyright horror story." These are tales of intellectual property laws run amok: The artist who receives a cease-and-desist letter from the Vatican for using an image from the Sistine Chapel in a collage titled "The Sistine Bowl-Off." The company that was sued for devising software to teach tricks to a robot dog. McDonald's claim to own phrases like "Play and fun for everyone" and "Hey, it could happen." An Adobe e-book of Alice's Adventures in Wonderland that bears a warning forbidding one to read it aloud.

In telling such stories, digital-environmentalist writers are trying to do for intellectual property what muckrakers like Lincoln Steffens did for corrupt governments and Eric Schlosser did for fast food: Go behind the curtain to reveal how something we take for granted—in this case, the cultural commons—really works. "We, as citizens, own these commons. They include resources that we have paid for as taxpayers and resources that we have inherited from previous generations," Bollier writes in his previous book, Silent Theft: The Private Plunder of Our Common Wealth. "They are not just an inventory of marketable assets, but social institutions and cultural traditions that define us as Americans and enliven us as human beings."

Some copyright horror stories read like science fiction, depicting life in an anticommons in which everything is owned: letters of the alphabet, familiar phrases, and popular songs like "God Bless America" and "Happy Birthday" (which won't enter the public domain until 2030). And like the best science fiction, these stories pose a serious question: To what extent do we already live in such a place? Is our world an intellectual property version of The Matrix where, despite the illusion of freedom, we are little more than digital sharecroppers, licensers of a culture we mistakenly assume is ours?

The science-fiction metaphor helps explain a tension central to the intellectual property wars. We do, in a sense, live in the space between two competing realities: According to the letter of the law, intellectual property is well protected, but legitimate access to it (by artists, parodists, critics) is guaranteed. In practice, however, our rights to access are ambiguously drawn and, as a result, prohibitively expensive to exercise. The difference in views between the commons and the anticommons is one of perspective. Can an artist who spends a fortune in legal fees successfully defending his legitimate fair use of a copyrighted image really be said to have won? "Fuck fair use," Lessig is fond of saying. "Fair use in America simply means the right to hire a lawyer to defend your right to create."
* * *

The line between science fiction and reality is often difficult to discern, as exhibited by the case of the college student who received trademark #2,127,381 for the phrase "freedom of expression." Fortunately, the student was Kembrew McLeod, who applied for it in order to make a point. McLeod, now professor of communication studies at the University of Iowa, is no stranger to using media pranks to exploit the absurdities of the system. In fact, he even once sold his soul in a glass jar on eBay.

McLeod may be the most optimistic of the digital environmentalists. "We can fight back and win, especially because many recent court decisions have upheld free-speech rights in the age of intellectual property," he writes. Getting people to exercise those rights is another issue. "The problem is that many individuals and companies either don't know this or don't want to take a risk." McLeod's and Bollier's books are full of inspirational stories of those who have taken such risks and successfully faced down the corporations who have improperly used their copyrights, such as artist Tom Forsythe (creator of "Food Chain Barbie"), who was awarded $1.8 million in legal fees after Mattel pursued an "unreasonable and frivolous" suit against him.

In September 2003, a group of Swarthmore College students posted on the Internet damning copies of internal memos written by employees of Diebold, the largest producer of electronic voting machines. The memos detailed various security flaws in Diebold's machines, and it wasn't long before the students received cease-and-desist letters demanding that they remove the memos from their websites. Although Diebold withdrew its legal threats in the wake of bad publicity, the students sued the company for falsely accusing them of copyright infringement. On September 30, 2004, a judge agreed that Diebold had deliberately misrepresented its copyright claims and awarded the students legal fees and damages.

This past summer, director Robert Greenwald made "fair use" of a substantial amount of Fox News footage in order to document its conservative bias in his documentary Outfoxed: Rupert Murdoch's War on Journalism. Fox grumbled about the movie but never sued Greenwald for copyright infringement.

In 2004, underground hip-hop artist DJ Danger Mouse edited together the vocals from Jay-Z's Black Album with selections of the Beatles' White Album to produce The Grey Album. Despite a flurry of cease-and-desist letters from EMI/Capitol (which owns the copyright to The White Album), over 170 websites continued to host The Grey Album in support of DJ Danger Mouse's right to create. It went on to become one of the most frequently downloaded independent albums of all time. The Boston Globe called it "the most creatively captivating" album of the year.

If anything, Bollier's "bullies" and McLeod's "bozos" are their own worst enemies. "As we look back twenty years from now, Mattel and other businesses like Fox News may ironically be remembered as some of the greatest promoters of fair use," writes McLeod. "Virtually every time these companies try to step on freedom of expression® in court they end up expanding the parameters of fair use in case law, and they also intensify the backlash against this kind of behavior."

Recent stirrings in legal theory may give some comfort to the activist wing of digital environmentalism. Granting that the problem is less the letter of intellectual property law than the spirit in which it is interpreted, Richard Posner, a federal appeals judge and prolific legal theorist, and others have suggested some ways to remedy it.

Foremost among them is the doctrine of "copyright misuse." In his California Law Review article "Fair Use and Statutory Reform in the Wake of Eldred," Posner argues that it is more valuable, and feasible, to strengthen fair-use practices than to lobby for new copyright laws. The problem with the current system, according to Posner, is that copyright owners systematically make improperly broad claims to their rights. The book, DVD, or baseball-game broadcast that comes with a notice stating that no part of the work may be copied without permission is, in fact, in violation of the doctrine of fair use (for which one doesn't need permission). Posner argues that when a copyright holder affixes a warning on copies of his work that "grossly and intentionally exaggerates the copyright holder's substantive or remedial rights, to the prejudice of publishers of public-domain works, the case for invoking the doctrine of copyright misuse" has been made.

The copyright misuse doctrine is attractive for a number of reasons. It is a flexible approach to protecting the public-policy goals underlying copyright law (promoting "the progress of science and useful arts") without having to pass new laws every time a technical innovation—radio, movies, television, copy machines, VCR, the Internet—creates a new set of challenges for copyright holders. And it is especially valuable to users of copyright because it is "one of the only copyright-limiting doctrines that arise from actions taken by the copyright holder," writes Kathryn Judge in her Stanford Law Review article "Rethinking Copyright Misuse." Aside from the possibility of being sued, the primary problem for those who want to make fair use of copyrighted material is the uncertainty of their position; while the law seems to support them, their backers and/or insurers may deem the cost of exercising their rights excessive. The doctrine of copyright misuse might provide a mechanism for a creator to address that uncertainty. For example, employing the principle of copyright misuse, an artist who believes he has a legitimate right to make fair use of a copyrighted work can proactively challenge a copyright holder who he believes is claiming protection for that work broader than copyright law allows. While such a maneuver wouldn't necessarily guarantee that the artist will prevail (he might of course be wrong), copyright misuse is one way the claims of the copyright holder might be tested without enduring an expensive lawsuit.

Copyright misuse isn't as satisfying as a Supreme Court victory or the passing of a new set of intellectual property laws. And it isn't clear that it is robust enough to protect fair use in the way that Posner and others want it to. But perhaps by bolstering the practices of everyday people it will help reclaim a familiar cultural landscape. Because in the end, the goal of digital environmentalism is quite modest: a world in which, as McLeod writes, the digital future looks "a lot like the analog past."

Robert S. Boynton is director of New York University's magazine journalism program. His new book, The New New Journalism: Conversations with America's Best Nonfiction Writers on Their Craft, is being published this month by Vintage.

Copyright © 2005 BookForum