The Amarillo fishwrap carries the latest effort by Berke Breathed (lame) in the Sunday funnies (Opus just ain't funny) and eschews Doonesbury. Give me Mark Slackmeyer lambasting Rush Limbaugh (hypocritical, pill-popping, lard mountain) any time. Spare me an unfunny penguin. If this be (fair & balanced) redress, so be it.
Sunday, December 21, 2003
Faith & Reason
Faith & Reason is the basic issue of our time. The followers of Faith v. the followers of Reason. Which shall prevail? If this be (fair & balanced) ambivalence, so be it.
Reason and Faith, Eternally Bound
By EDWARD ROTHSTEIN
One might have expected the forces of Reason to be a bit weary after a generation of battling postmodernism and having their power and authority under constant scrutiny. Reason's battles, though, continue unabated. Only now it finds its opposition in the more unyielding claims of religious faith. This latest conflict is over seemingly incompatible ways of knowing the world. It is a conflict between competing certainties: between followers of Faith, who know because they believe, and followers of Reason, who believe because they know.
This battle echoes others taking place between fundamentalist terror, which claims the authority of Faith, and Western modernity, which claims the authority of Reason. But some of Reason's combatants — as if reading from the postmodernist strategy book — are also challenging the heritage of the West, arguing that it, too, has been riddled with absolutist faith, that the reasoned achievements of the Enlightenment are still under threat and that a new understanding of the past must take shape, in which Reason's martyrdom and trials take center stage.
One motivation for Reason's latest salvos is political. A Gallup poll last year said that about 40 percent of Americans considered themselves evangelicals or born-again Christians. They include the president, the attorney general, the speaker of the House and the House majority leader.
Critics of the Bush administration's policies sometimes cite such beliefs as evidence of the administration's potential fundamentalism and intolerance. In the recent book "A Devil's Chaplain" (Houghton Mifflin, $24), for example, Richard Dawkins, the Oxford University evolutionary biologist, worries about American responses to the attacks of 9/11 because "the United States is the most religiose country in Christendom, and its born-again leader is eyeball to eyeball with the most religiose people on Earth."
Mr. Dawkins has long been a harsh critic of religion, which he considers a form of infectious virus that readily replicates, spreading its distortions. Last summer he lobbied in The Guardian for adopting "bright" as a noun to mean atheist (as in "I'm a bright. You're a bright").
The philosopher Daniel C. Dennett echoed his urgings in an Op-Ed article in The New York Times. Mr. Dawkins and Mr. Dennett argue that brights are a beleaguered group confronting a growing religious right; they urge brights to emerge from their closet and boldly proclaim their identity.
"So, what's the opposite of a bright?" Mr. Dawkins imagines someone asking, "What would you call a religious person?"
"What would you suggest?" he coyly responds.
There are of course approaches that are less blunt and more liberal-minded, but the sense of embattlement and polemic has become familiar. In the recent book "The Closing of the Western Mind" (Knopf, $30), for example, Charles Freeman argues that Western history has to be retold. Over the course of centuries, he points out, the ancient Greeks recognized the importance of reason, giving birth to the techniques of modern science and mathematics, and establishing the foundations of the modern state. But then, he writes, came "the closing of the Western mind."
In the fourth and fifth centuries, he writes, the Greek intellectual tradition "was destroyed by the political and religious forces which made up the highly authoritarian government of the late Roman empire," particularly with the imposition of Christian orthodoxy. For a millennium doctrine ruled. Reason became heresy.
It is precisely this sort of heresy that Jennifer Michael Hecht celebrates in "Doubt: A History" (HarperSanFrancisco, $27.95), which outlines the views of those who rejected dominant doctrines of faith or proclaimed disbelief in the existence of God. Her loosely defined roster of doubters ranges from the ancient Greeks to Zen Buddhists, along with such familiar figures as Galileo, Hobbes, Gibbon, Tom Paine and Thomas Jefferson.
Ms. Hecht is more generous than Mr. Dawkins, noting that just as there are believers who "refuse to consider the reasonableness of doubt," so, too, there are nonbelievers who "refuse to consider the feeling of faith." But her sympathies are committed to the doubters, including such unusual figures as the Islamic philosopher and physician Abu Bakr al-Razi (854-925) and Annie Besant, who wrote a "Gospel of Atheism" in 1876, helped reform London schools with free meals and medical care, and later in life became a theosophist and a translator of the Bhagavad-Gita.
Ms. Hecht's goal is to provide an affirmative history for doubters. "To be a doubter," she writes, "is a great old allegiance, deserving quiet respect and open pride."
What, though, is the nature of this doubt? Its demarcation from faith is not as precise as these descriptions suggest. Doubt can become a rigid orthodoxy in its own right. In contemporary life, as Ms. Hecht seems to know, doubt has become almost axiomatic (as if it were a matter of faith).
Meanwhile faith itself is riddled with doubt. As Ms. Hecht points out, many religious texts (like Job or Augustine's "Confessions") are also accounts of doubt.
Yet in these arguments faith is often portrayed as monolithic, a host for intolerance and inquisition. And while that has been part of many religions' history — and is, as Mr. Freeman shows, part of the history of Christianity — the nature of faith is far more complex.
In his recent book, "The Transformation of American Religion" (Free Press, $26) for example, the sociologist Alan Wolfe suggests that evangelical Christians in the United States cannot be thought of as they once were. Religion, he argues, has been transformed by American culture to become therapeutic, individualistic and less interested in doctrine than in faith.
Nor is faith always unreasonable. Religious beliefs were fundamental to the abolition of slavery in the 19th century and to the civil rights movement in the 20th. Faith may even be latent in some of science's triumphs, inspiring such figures as Newton and Kepler. The conviction that there is an order to things, that the mind can comprehend that order and that this order is not infinitely malleable: such scientific beliefs may include elements of faith.
Reason also has its own problems. Isaiah Berlin argued that the Enlightenment led to the belief that human beings could be reshaped according to reason's dictates. And out of that science of human society, he argued, came such totalitarian dystopias as the Soviet Union.
Reason, then, has its limits. The philosopher Robert Fogelin's new book, "Walking the Tightrope of Reason" (Oxford, $22), is subtitled "The Precarious Life of a Rational Animal" because, he argues, reason's own processes negotiate a precipice. Mr. Fogelin quotes Kant, who described a dove that, "cleaving the air in her free flight, and feeling its resistance, might imagine that its flight would be still easier in empty space."
Failing to understand what keeps her aloft and taking a leap of faith, the dove might set off in "empty space" — a vacuum — and plummet. But reason might lead to the same end: if something offers resistance then logically can't one proceed more easily if it is eliminated? So why not try?
The problem is that the bird can never fully comprehend the medium through which it experiences the world. In many ways, Kant argued, neither could the mind. Reason is still the only tool available for certain knowledge, but it also presents questions it is unable to answer fully.
Some of those questions may remain even after contemporary battles cease: how much faith is involved in the workings of reason and how much reason lies in the assertions of faith?
Edward Rothstein is cultural critic at large for The New York Times and writes the paper's "Connections" column on technology. He is the author of "Emblems of Mind: The Inner Life of Music and Mathematics."
Copyright © 2003 The New York Times Company
What Color Is Your Paradigm?
More Americans use file-sharing software than voted for W in the 2000 presidential election. It's the economy, stupid! was the campaign mantra of 1992. Will It's the Internet, stupid! be the campaign mantra of 2004? If this be (fair & balanced) geekiness, so be it.
[x NYTimes]
Napster Runs for President in '04
By FRANK RICH
Even after Saddam Hussein was captured last weekend, all that some people could talk about was Howard Dean. Neither John Kerry nor Joe Lieberman could resist punctuating their cheers for an American victory with sour sideswipes at the front-runner they still cannot fathom (or catch up to). Pundits had a nearly unanimous take on the capture's political fallout: Dr. Dean, the one-issue candidate tethered to Iraq, was toast — or, as The Washington Post's Tom Shales memorably put it, "left looking like a monkey whose organ grinder had run away."
I am not a partisan of Dr. Dean or any other Democratic candidate. I don't know what will happen on Election Day 2004. But I do know this: the rise of Howard Dean is not your typical political Cinderella story. The constant comparisons made between him and George McGovern and Barry Goldwater — each of whom rode a wave of anger within his party to his doomed nomination — are facile. Yes, Dr. Dean's followers are angry about his signature issue, the war. Dr. Dean is marginalized in other ways as well: a heretofore obscure governor from a tiny state best known for its left-wing ice cream and gay civil unions, a flip-flopper on some pivotal issues and something of a hothead. This litany of flaws has been repeated at every juncture of the campaign thus far, just as it is now. And yet the guy keeps coming back, surprising those in Washington and his own party who misunderstand the phenomenon and dismiss him.
The elusive piece of this phenomenon is cultural: the Internet. Rather than compare Dr. Dean to McGovern or Goldwater, it may make more sense to recall Franklin Roosevelt and John Kennedy. It was not until F.D.R.'s fireside chats on radio in 1933 that a medium in mass use for years became a political force. J.F.K. did the same for television, not only by vanquishing the camera-challenged Richard Nixon during the 1960 debates but by replacing the Eisenhower White House's prerecorded TV news conferences (which could be cleaned up with editing) with live broadcasts. Until Kennedy proved otherwise, most of Washington's wise men thought, as The New York Times columnist James Reston wrote in 1961, that a spontaneous televised press conference was "the goofiest idea since the Hula Hoop."
Such has been much of the reaction to the Dean campaign's breakthrough use of its chosen medium. In Washington, the Internet is still seen mainly as a high-velocity disseminator of gossip (Drudge) and rabidly partisan sharpshooting by self-publishing excoriators of the left and right. When used by campaigns, the Internet becomes a synonym for "the young," "geeks," "small contributors" and "upper middle class," as if it were an eccentric electronic cousin to direct-mail fund-raising run by the acne-prone members of a suburban high school's computer club. In other words, the political establishment has been blindsided by the Internet's growing sophistication as a political tool — and therefore blindsided by the Dean campaign — much as the music industry establishment was by file sharing and the major movie studios were by "The Blair Witch Project," the amateurish under-$100,000 movie that turned viral marketing on the Web into a financial mother lode.
The condescending reaction to the Dean insurgency by television's political correspondents can be reminiscent of that hilarious party scene in the movie "Singin' in the Rain," where Hollywood's silent-era elite greets the advent of talkies with dismissive bafflement. "The Internet has yet to mature as a political tool," intoned Carl Cameron of Fox News last summer as he reported that the runner-up group to Dean supporters on the meetup.com site was witches. "If you want to be a Deaniac," ABC News's Claire Shipman said this fall, "you've got to know the lingo," as she dutifully gave her viewers an uninformed definition of "blogging."
In Washington, the only place in America where HBO's now-canceled "K Street" aroused histrionic debate, TV remains all. No one knew what to make of the mixed message sent by Dr. Dean's performance on "Meet the Press" in June: though the candidate flunked a pop quiz about American troop strength (just as George W. Bush flunked a pop quiz about world leaders in 1999), his Internet site broke its previous Sunday record for contributions by a factor of more than 10. More recently, the dean of capital journalists, David Broder, dyspeptically wrote that "Dean failed to dominate any of the Democratic candidate debates." True, but those few Americans who watched the debates didn't exactly rush to the candidate who did effortlessly dominate most of them, Al Sharpton. (Mr. Sharpton's reward for his performance wasn't poll numbers or contributions but, appropriately enough, a gig as a guest host on "Saturday Night Live.")
"People don't realize what's happened since 2000," said Joe Trippi, the Dean campaign manager, when I spoke to him shortly after Al Gore, the Democrats' would-be technopresident, impulsively crowned Dr. Dean as his heir. "Since 2000, many more millions have bought a book at Amazon and held an auction on e-Bay. John McCain's Internet campaign was amazing three years ago but looks primitive now." The Dean campaign, Mr. Trippi explained, is "not just people e-mailing each other and chatting in chat rooms." His campaign has those and more — all served by countless sites, many of them awash in multi-media, that link the personal (photos included) to the political as tightly as they link to each other.
They are efficient: type in a ZIP code and you meet Dean-inclined neighbors. Search tools instantly locate postings on subjects both practical (a book to give as a present to a Dean supporter?) and ideological. The official bloggers update the news and spin it as obsessively as independent bloggers do. To while away an afternoon, go to the left-hand column of the official blogforamerica.com page and tour the unofficial sites. On one of three Mormon-centric pages, you can find the answer to the question "Can Mormons be Democrats?" (Yes, they can, and yes, they can vote for Howard Dean.) At www.projectdeanlight.com, volunteers compete at their own expense to outdo each other with slick Dean commercials.
But the big Dean innovation is to empower passionate supporters to leave their computer screens entirely to hunt down unwired supporters as well and to gather together in real time at face-to-face meetings they organize on their own with no help from (or cost to) the campaign hierarchy. Meetup.com, the for-profit Web site that the Dean campaign contracted to facilitate these meetings, didn't even exist until last year. (It is not to be confused with the symbiotic but more conventional liberal advocacy and fund-raising site, MoveOn.org.) Its success is part of the same cultural wave as last summer's "flash mob" craze (crowds using the Internet to converge at the same public place at the same time as a prank) and, more substantially, the spike in real rather than virtual social networks, for dating and otherwise, through sites like match.com and friendster.com. From Mr. Trippi's perspective, "The Internet puts back into the campaign what TV took out — people."
To say that the competing campaigns don't get it is an understatement. A tough new anti-Dean attack ad has been put up on the Dean campaign's own site, where it's a magnet for hundreds of thousands of dollars in new contributions. The twice-divorced Dennis Kucinich's most effective use of the Web thus far has been to have a public date with the winner of a "Who Wants to Be a First Lady?" Internet contest. Though others have caught up with meetup.com, only the Wesley Clark campaign is racing to mirror Dr. Dean's in most particulars. The other Democratic Web sites are very 2000, despite all their blogs and other gizmos.
"The term blog is now so ubiquitous everyone has to use it," says the author Steven Johnson, whose prescient 2001 book "Emergence" is essential reading for anyone seeking to understand this culture. On some candidates' sites, he observes, "there is no difference between a blog and a chronological list of press releases." And the presence of a poll on a site hardly constitutes interactivity. The underlying principles of the Dean Internet campaign "are the opposite of a poll," Mr. Johnson says. Much as thousands of connected techies perfected the Linux operating system's code through open collaboration, so Dean online followers collaborate on organizing and perfecting the campaign, their ideas trickling up from the bottom rather than being superimposed from national headquarters. (Or at least their campaign ideas trickle up; policy is still concentrated at the top.) It's almost as if Dr. Dean is "a system running for president," in Mr. Johnson's view, as opposed to a person.
In that sense, the candidate is a perfect fit for his chosen medium. Though his campaign's Internet dependence was initially dictated by necessity when he had little organization and no money, it still serves his no-frills personality even when he's the fund-raising champ. Dr. Dean runs the least personal of campaigns; his wife avoids the stump. That's a strategy befitting an online, not an on-TV, personality. Dr. Dean's irascible polemical tone is made for the Web, too. Jonah Peretti, a new media specialist at Eyebeam, an arts organization in New York, observes that boldness is to the Internet what F.D.R.'s voice was to radio and J.F.K.'s image to television: "A moderate message is not the kind of thing that friends want to e-mail to each other and say, 'You gotta take a look at this!'"
Unlike Al Gore, Dr. Dean doesn't aspire to be hip about computers. "The Internet is a tool, not a campaign platform," he has rightly said, and he needn't be a techie any more than he need pilot his own campaign plane. But if no tool, however powerful, can by itself make anyone president, it can smash opponents hard when it draws a ton of cash. Money talks to the old media and buys its advertising. Dr. Dean's message has already upstaged the official Democratic Party and its presumed rulers, the Clintons. Thanks to the Supreme Court's upholding of the McCain-Feingold campaign finance reform, he also holds a strategic advantage over the Democratic National Committee in fund-raising, at least for now.
Should Dr. Dean actually end up running against President Bush next year, an utterly asymmetrical battle will be joined. The Bush-Cheney machine is a centralized hierarchy reflecting its pre-digital C.E.O. ethos (and the political training of Karl Rove); it is accustomed to broadcasting to voters from on high rather than drawing most of its grass-roots power from what bubbles up from insurgents below.
For all sorts of real-world reasons, stretching from Baghdad to Wall Street, Mr. Bush could squish Dr. Dean like a bug next November. But just as anything can happen in politics, anything can happen on the Internet. The music industry thought tough talk, hard-knuckle litigation and lobbying Congress could stop the forces unleashed by Shawn Fanning, the teenager behind Napster. Today the record business is in meltdown, and more Americans use file-sharing software than voted for Mr. Bush in the last presidential election. The luckiest thing that could happen to the Dean campaign is that its opponents remain oblivious to recent digital history and keep focusing on analog analogies to McGovern and Goldwater instead.
Copyright © 2003 The New York Times Company
No Standing O By Default
I attended a community theater performance of Man of La Mancha recently. The audience leapt to its feet and gave the cast a standing O. I didn't think the production was even good high school work, but I got to my feet along with everyone else. Mob psychology? Cliché? Cheapened taste? Philistinism? Pity? If this be (fair & balanced) candor, so be it.
[x NYTimes]
The Tyranny of the Standing Ovation
By JESSE McKINLEY
A few weeks back, just after "Taboo" opened to harsh reviews, lackluster ticket sales and rumors of its imminent demise, a press representative for Rosie O'Donnell, the show's producer, proudly announced that the production was a success.
The proof? "We've played 21 performances," the press rep said. "And have received 21 standing ovations."
Well, not to be a Grinch, but that and two bucks would get Ms. O'Donnell on the subway. Go to nearly any Broadway house, any night, and you can catch a crowd jumping up for the curtain call like politicians at a State of the Union address. And just as in politics, the intensity of the ovation doesn't necessarily reflect the quality of the performance.
The phenomenon has become so exaggerated, in fact, that audiences now rise to their feet for even the least successful shows. Recent Broadway flops like Jackie Mason's "Laughing Room Only," which closed in less than two weeks, "The Oldest Living Confederate Widow Tells All," which closed on opening night, and "Bobbi Boland," which closed in previews, all received standing ovations.
This sort of effusive praise can also be witnessed far from Broadway. Opera fans have never shied away from huzzahs (not to mention boos), but lately even classical music crowds have been getting in on the act — and out of their seats. Modern dance and ballet fans might be a tad more discerning, but they regularly rise up when the curtain falls. Even British audiences, who used to insist that they "only stand for the Queen," have been seen leaping to their feet on the West End like junior high school drama students on a class trip to "Cats."
"It's gotten totally out of hand," said Chita Rivera, the 70-year-old phenom who made her Broadway debut in 1955, when she sweated out curtain call after curtain call without ever seeing the audience rise. "It's become a bit of audience participation. What does it mean anymore?"
Liz Smith, the syndicated columnist who has seen a few shows in her day, agrees. "Now the standing ovation is de rigueur," she said. "They would give a standing ovation to 'Moose Murders' if they could revive it," she added, referring to the infamous 1983 Broadway flop, which — just for the record — should never, ever be revived.
Most Broadway veterans trace the change to the steep rise, over the last decade or so, in the cost of a ticket. "I guess the audience just feels having paid $75 to sit down, it's their time to stand up," said the playwright Arthur Miller. "I don't mean to be a cynic but it probably all changed when the price went up."
Just how those rising prices produce rising audiences is not, however, an easy question. John Lahr, the theater critic for The New Yorker magazine, sees a complex psychological dynamic at work. "I think it's generally an attempt by the audience at self-hypnosis," he said. "They think if they go to a show and stand at the end they've had a good time. They're trying to give themselves the experience they thought they should have."
If it starts out as a kind of personal therapy, though, it quickly exerts an effect on group psychology. Kathy Duprey, who was visiting from Dallas, saw "The Phantom of the Opera" on a recent Monday, and was surprised to find herself on her feet during the curtain call. "I saw other people standing up, so I did too," she said. Her friend Samantha Hall, also from Dallas, agreed: "The emotion of the curtain call just gets to you."
Emotion, or maybe just plain old peer pressure. "Unless I'm terribly bowled over I don't want to stand," said Jonathan Sale, 30, a New York actor. "At the same time, I'm also not going to be the only one who sits down. That makes a statement." And dooms one, of course, to spend long minutes staring at a stranger's backside rather than a favorite performer.
Those who admit to being first on their feet, however, see no need for theorizing or socioeconomic speculation. They say they give standing ovations for one very simple reason: to show their appreciation. "A good audience can help a show," said Doug Friedman, 40, who in addition to seeing a lot of shows, has appeared in "Tommy" and "A Chorus Line." "A standing ovation can really make it easier to do eight shows a week."
It can also be a way to single out an exceptional performance. Kevin Kline, now appearing in "Henry IV," and Bernadette Peters, in "Gypsy," regularly get audiences to their feet, but so does Hugh Jackman, whose show, "The Boy From Oz," got mediocre reviews. "I generally stand for specific performers," said Susan Schmidt, 34, a mother of three from Westport, Conn. She most recently leaped up for one of the puppeteers at a performance of "Avenue Q," though, she later admitted, it was difficult to pinpoint which one, "when there are puppets involved."
The gesture can mean a great deal to performers — so much so that they've been known to actively solicit it. Harvey Fierstein, the star of "Hairspray," reports that through more than 550 performances, his show has received a standing ovation every single time. But years ago, during the last days of "Torch Song Trilogy," he found the audience to be a bit less forthcoming. So he learned how to coax them along: with a broad sweep of the arm he described as "Quentin Crisp doing Eva Perón." And sure enough, he said, "they would rise, rise, rise."
Producers have been known to work a few tricks, too, though in their case the motivation is less ego than business. To impress critics who see a show in previews, producers sometimes pack the house with claques and friends of the production who are all the more willing to pour it on. The likelihood of a standing ovation can even become a selling point for a show, as with "Blood Brothers," which ran on Broadway during the early 1990's, and advertised that it always got a standing ovation. There are even ways to write that imperative into the show itself. In "Mamma Mia!" for example, the Abba spectacle at the Winter Garden Theater, the last number is one for which the whole audience tends to get up and dance. "They get you up under false pretenses," Mr. Lahr said. "And you stay up."
To Mr. Lahr, it's all part of the effort to "enchant and infantilize" an audience. "Essentially, when you're talking about a standing ovation," he said, "you're talking about the spellbound."
When did all this standupmanship begin? The Greeks invented theater, of course, and the Romans — with those lions at the Colosseum — probably invented criticism. Much of the audience in Shakespeare's time stood up through the whole play because they didn't have seats in the first place. "Maybe when they really liked something," Mr. Miller said, "they sat down."
Opera fans probably began standing for exceptional performances sometime around the 17th century. But theater historians tend to agree that the standing ovation emerged in its current form on Broadway in the years after World War II. At first, it was reserved as a special honor for classic dramatic performances — Fredric March and Florence Eldridge in "Long Day's Journey Into Night" (1956), Zero Mostel in "Rhinoceros" (1961). Then in the mid-1960's, all that changed.
Ethan Mordden, a scholar of the history of the American musical, cites what he calls the Big Lady Theory to explain the shift. In classic 1950's musicals like "My Fair Lady," Mr. Mordden said, "the music for the bows is so short there's barely time for the ensemble to come running out and the leads to take their bows before the curtain comes down." But a new generation of musicals designed to showcase a diva — like "Hello, Dolly!" (produced in 1964, with Carol Channing) and "Mame" (1966, with Angela Lansbury) — saw, he says, "the advent of the staged, sung curtain call."
"The whole curtain call is built to a climax," he explained. "The ensemble bows and sings. The male leads bow, and supporting women, and everything builds and builds and builds, and then when everyone's attention is focused, the star comes out in her 37th Bob Mackie gown of the evening. By that point, you have no choice but get to your feet."
Audiences in other mediums found their own ways of showing special approval. Fans of the 19th-century ballerina Fanny Elssler were said to have boiled and eaten her slipper. More recently, jazz fans snapped their fingers and nodded, and rock 'n' roll fans lit their lighters (before they quit smoking). Fans in Eastern Europe still clap rhythmically or stomp their feet, while grateful Japanese audiences might applaud for as many as 20 curtain calls, all without leaving their seats.
In the realm of classical music, standing ovations emerged somewhat organically. Zarin Mehta, the New York Philharmonic's executive director, says that some classical pieces — say, Rachmaninoff's Second Piano Concerto or Beethoven's Ninth Symphony — build to a climax that prompts the audience to its feet in a great rush. But it no longer takes that kind of swelling score to achieve that result. "The real, true standing ovations haven't become devalued," said Ara Guzelimian, senior director and artistic adviser of Carnegie Hall. "But we have seen more of the 'I have to get out to Connecticut' type. The half — and half-hearted — standing ovation has become more common."
Still, Dr. Bertram Schaffner, a 91-year-old psychiatrist who has been attending classical music concerts in the city since the 1930's, says he finds it heartening, both as a ticket holder and as a trained professional. "I wouldn't call it mob mentality," he said. "I would say that people nowadays are simply more free about showing how they feel. People enjoy showing their pleasure."
WHATEVER the motivation, the rampant increase in standing ovations has been accompanied by — as with any other form of inflation — a decrease in value. If almost every performance receives one, then it ceases to be a meaningful compliment — and actors who don't get one cease to be able to console themselves.
Have we really reached the point in this crazy mixed-up world where even thunderous applause means nothing unless delivered from a standing position? Even actors — who never met a fan they didn't like — say that in an age when everything is worthy of the highest accolade, it's hard to tell how much an audience actually likes you. "It's a phenomenon that I think we can all do very well without," said Brian Murray, the South African-born actor currently appearing in "Beckett/Albee" Off Broadway. "If they do it automatically, they might as well not bother."
Tovah Feldshuh, who plays Golda Meir in "Golda's Balcony," on Broadway at the Helen Hayes Theater, says that now her favorite reception is the one that suggests the audience is hanging on her every word. "I like when we get what I call the stunned ovation," she said. "The other night, I counted 14 seconds before the first clap."
Her appreciation of that long pause, and her audience's self-restraint, may point the way to a less effusive, more meaningful future of dramatic feedback. But for the time being, though standing ovations may be overused, overexposed and downright cheap, most performers will still happily admit that they feel, well, awfully nice.
"Maybe it's become rather common, but I can't say it feels any worse," Mr. Fierstein said. "It's kind of like sex. Just because the guy down the street got it doesn't mean it doesn't feel good."
Copyright © 2003 The New York Times Company