Arthur Schopenhauer (1788-1860) was one of the greatest philosophers of the 19th century, yet he seems to have had more impact on literature (e.g. Thomas Mann) and on people in general than on academic philosophy. Perhaps that is because, first, he wrote very well, simply and intelligibly (unusual, we might say, for a German philosopher, and unusual now for any philosopher); second, he was the first Western philosopher to have access to translations of philosophical material from India, both Vedic and Buddhist, by which he was profoundly affected, to the great interest of many; and, third, his concerns were with the dilemmas and tragedies, in a religious or existential sense, of real life, not just with abstract philosophical problems. If this is (fair & balanced) disputation, so be it.
[x New Statesman]
The Art of Always Being Right
Arthur Schopenhauer; with an introduction by A C Grayling (2004)
Reviewed by George Walden
Schopenhauer's sardonic little book, laying out 38 rhetorical tricks guaranteed to win you the argument even when you are defeated in logical discussion, is a true text for the times. An exercise in irony and realism, humour and melancholy, this is no antiquarian oddity, but an instruction manual in intellectual duplicity that no aspiring parliamentarian, trainee lawyer, wannabe TV interviewer or newspaper columnist can afford to be without.
The melancholy aspect comes in the main premise of the book: that the point of public argument is not to be right, but to win. Truth cannot be the first casualty in our daily war of words, Schopenhauer suggests, because it was never the bone of contention in the first place. "We must regard objective truth as an accidental circumstance, and look only to the defence of our own position and the refutation of the opponent's . . . Dialectic, then, has as little to do with truth as the fencing master considers who is in the right when a quarrel leads to a duel." Such phrases make us wonder whether his book was no more than a bitter satire, an extension of Machiavellian principles of power play from princes to individuals by a disappointed academic who took 30 years to win an audience for his major work, The World as Will and Idea. Perhaps, but only partly. With his low view of human nature, Schopenhauer is also saying that we are all in the sophistry business together.
The interest of his squib goes beyond his tricks of rhetoric: "persuade the audience, not the opponent", "put his theory into some odious category", "become personal, insulting, rude". Instinctively, we itch to apply it to our times, whether in politics, the infotainment business or our postmodern tendency to place inverted commas, smirkingly, around the very notion of truth. Examples of jaw-dropping sophistry by public figures (my own favourite is Tony Blair defending his quasi-selective choice of school for his son on the grounds that he did not wish to impose political correctness on his children: see Schopenhauer's rule number 26: "turn the tables") are easy enough to find. It is more entertaining to see his theory in the light of our national peculiarities.
The flip side of our "healthy scepticism" can be a disinclination to trouble ourselves with rational discussion at all, and a tediously moderate people can be bored by its own sobriety. So it is that, in debate, we prefer to be stirred by passions, or simply amused. Hence the rampant nostalgia for the old political order, dominated by orators such as Michael Foot or Enoch Powell. Each did real damage to the country, Foot with his patrician self-abasement in the face of trade union power, Powell on race, and both with their culpable fantasies about Russia.
"Well you say that," comes the predictable response - a handy rhetorical trick in itself - "but let's not get into their policies; we could go round that buoy for ever" (see trick number 12: "choose metaphors favourable to your proposition"). "The point is that they were such wonderfully passionate, colourful and entertaining debaters, compared to the managerial drabness of the House of Commons today." (Trick 29 recommends diversion from the point at issue.) The pay-off line follows quickly (draw your conclusions smartly, says trick 20). "If only we had Boris as Tory leader, it would perk the place up no end!" (This is not wholly invention. Tory and Labour columnists have both written in this vein.)
Perhaps because Schopenhauer was so very un-British, his 38 points overlooked our favourite rhetorical trick: coming up with "quirky" or "original" responses to serious questions. (The nearest he gets is trick number 36: "bewilder your opponent".) In Britain, a willed eccentricity, the cheapest form of distinction, works because it is part of our top-down ethos. The game is to dodge the issue in such a way as to show yourself above it - for example, by throwing off dandyish opinions. Take any premise ("Boris Johnson is not a serious contender for prime minister"), invert it, toss it to the herd with a supercilious smile - and the herd will warm to you, because we do so love a maverick, don't we? For similar reasons, "controversialists" (that is, vulgar cynics who argue positions they do not necessarily believe, the better to astound the impressionable masses) are a very British phenomenon.
The anti-intellectualism all this implies is not, however, a uniquely British trait, and is covered in Schopenhauer's list. "If you know that you have no reply to the arguments your opponent advances . . . declare yourself to be an incompetent judge: 'What you say passes my poor powers of comprehension.'" Your opponent stands instantly convicted of pretension, a crime without appeal in democracies, of which Schopenhauer was no admirer. Truth and logic, he comes close to saying, get you nowhere in a mass society. "The only safe rule, therefore, is [to dispute] only with those of your acquaintance of whom you know that they possess sufficient intelligence and self-respect not to advance absurdities."
In a frequently light-hearted book, this is the least amusing message. The suggestion is that the audiences for serious discussion are doomed to shrink - and remember that Schopenhauer never experienced the sophistry of TV images, whose deliberate or, more frequently, casual mendacity a mere 38 points would not suffice to explain. Yet has his lugubrious prediction proved true? Or do we rather get a feeling, not of an absolute decline in standards of public debate, but of missed potential - something even the BBC has apparently begun to recognise? How many times have we listened to a radio or TV debate on art or politics or literature and asked ourselves, even as we are lulled by the undemanding discussion: are these the best people they can come up with? The answer is yes and no. Yes because in media terms they are the best: practised "communicators" with every crowd-pleasing response at the ready. And no because we have all read or heard or known people far more interesting and far more informed about the disciplines in question. Sadly, they tend to be folk who are not up to speed on their 38 points and who think the truth matters, and so, communication-wise, they are deemed useless. Still, they exist.
If your preference is nevertheless for Schopenhauer's tragic vision of a world in thrall to debate that is indifferent to the truth, examples are not lacking, not just in art or politics, but in the allegedly objective and internationalist scientific world. A brief period as minister for science taught me that when it comes to rubbishing a rival's research or inveigling funds for your own, objectivity is out, and foreigners become a joke. Now I hear neo-Darwinian atheists lambasting as primitive and irrational every religion except the most populous and, in its extreme form, the most dangerous. Why are scientists so intellectually dishonest? For the same reason that the Archbishop of Canterbury hides behind procedural sophistry (needless commissions of inquiry and the like, when the need for liberalism is clear) in dealing with homosexuality in the Church: politics, dear boy. Which does rather diminish the right of scientists and churchmen to look down on politics as a scurvy trade.
The palm for rhetorical shamelessness must nevertheless go to US presidents. "There you go again," said Ronald Reagan, annihilating with a grin the very concept of rational debate, and the right loved him for it. "I did not have sexual relations with that woman," Bill Clinton assured us, with his emetic sincerity, and the left - especially women - adore him still. And not even the melancholic German predicted that the world's most powerful democracy would one day be run by a president who cannot be accused of sophistry chiefly because he cannot talk at all. And they say Schopenhauer was a pessimist.
George Walden is the author of The New Elites: Making a Career in the Masses (Penguin). This review first appeared in the New Statesman.
© New Statesman 2004
Wednesday, December 29, 2004
There You Go Again: Never Lose An Argument In 2005!
An Economic Tsunami?
While W rearranges furniture on the deck of the Titanic, we face the economic equivalent of the disaster that recently struck Southeast Asia, India, and the Horn of Africa. Sobering stuff as we slouch into 2005. If this is (fair & balanced) dread, so be it.
[x Washington Post]
The Next Economy
By Robert J. Samuelson
We are undergoing a profound economic transformation that is barely recognized. This quiet upheaval does not originate in some breathtaking technology but rather in the fading power of forces that have shaped American prosperity for decades and, in some cases, since World War II. As their influence diminishes, the economy will depend increasingly on new patterns of spending and investment that are still only dimly apparent. It is unclear whether these will deliver superior increases in living standards and personal security. What is clear is that the old economic order is passing.
By any historical standard, the record of these decades -- despite flaws -- is remarkable. Per capita income (average income per person) is now $40,000, triple the level of 60 years ago. Only a few of the 10 recessions since 1945 have been deep. In the same period, unemployment averaged 5.9 percent. The worst year was 9.7 percent in 1982. There was nothing like the 18 percent of the 1930s. Prosperity has become the norm. Poverty and unemployment are the exceptions.
But the old order is slowly crumbling. Here are four decisive changes:
• The economy is bound to lose the stimulus of rising consumer debt. Household debt -- everything from home mortgages to credit cards -- now totals about $10 trillion, or roughly 115 percent of personal disposable income. In 1945, debt was about 20 percent of disposable income. For six decades, consumer debt and spending have risen faster than income. Home mortgages, auto loans and store credit all became more available. In 1940, the homeownership rate was 44 percent; now it's 69 percent. But debt can't permanently rise faster than income, and we're approaching a turning point. As aging baby boomers repay mortgages and save for retirement, debt burdens may drop. The implication: weaker consumer spending.
• The benefits from defeating double-digit inflation are fading. Remember, in 1979, inflation peaked at 13 percent; now it's 1 to 3 percent, depending on the measure. The steep decline led to big drops in interest rates and big increases in stock prices (as interest rates fell, money shifted to stocks). Stocks are 12 times their 1982 level. Lower interest rates and higher stock prices encouraged borrowing and spending. But these are one-time stimulants. Mortgage rates can't again fall from 15 percent (1982) to today's 5.7 percent. Nor will stocks soon rise twelvefold. The implication: again, weaker consumer spending.
• The welfare state is growing costlier. Since the 1930s, it has expanded rapidly -- for the elderly (Social Security, Medicare), the poor (Medicaid, food stamps) and students (Pell grants). In 2003, federal welfare spending totaled $1.4 trillion. But all these benefits didn't raise taxes significantly, because lower defense spending covered most costs. In 1954, defense accounted for 70 percent of federal spending and "human resources" (aka welfare), 19 percent. By 2003, defense was 19 percent and human resources took 66 percent. Aging baby boomers and higher defense spending now doom this pleasant substitution. Paying for future benefits will require higher taxes, bigger budget deficits or deep cuts in other programs. All could hurt economic growth.
• The global trading system has become less cohesive and more threatening. Until 15 years ago, the major trading partners (the United States, Europe and Japan) were political and military allies. The end of the Cold War and the addition of China, India and the former Soviet Union to the trading system have changed that. India, China and the former Soviet bloc have also effectively doubled the global labor force, from 1.5 billion to 3 billion workers, estimates Harvard economist Richard Freeman. Global markets are more competitive; the Internet -- all modern telecommunications -- means some service jobs can be "outsourced" abroad. China and other Asian countries target the U.S. market with their exports by fixing their exchange rates.
Taken at face value, these are sobering developments. The great workhorse of the U.S. economy -- consumer spending -- will slow. Foreign competition will intensify. Trade agreements, with more countries and fewer alliances, will be harder to reach. And the costs of government will mount.
There are also global implications. The slow-growing European and Japanese economies depend critically on exports. Until now, that demand has come heavily from the United States, which will run an estimated current account deficit of $660 billion in 2004. But if American consumers become less spendthrift -- because debts are high, taxes rise or benefits are cut -- there will be an ominous collision. Diminished demand from Europe, Japan and the United States will meet rising supply from China, India and other developing countries. This would be a formula for downward pressure on prices, wages and profits -- and upward pressure on unemployment and protectionism.
It need not be. China and India are not just export platforms. Billions of people remain to be lifted out of poverty in these countries and in Latin America and Africa. Ideally, their demands -- for raw materials, for technology -- could strengthen world trade and reduce reliance on America's outsize deficits. If so, exports (and manufacturing) could become the U.S. economy's next great growth sector. Already, the dollar has depreciated 15 percent since early 2002; that makes U.S. exports more price-competitive.
What's at issue is the next decade, not the next year. We know that the U.S. economy is resilient and innovative -- and that Americans are generally optimistic. People seek out new opportunities; they adapt to change. These qualities are enduring engines for growth. But they will also increasingly have to contend with new and powerful forces that may hold us back.
Robert J. Samuelson, a contributing editor of Newsweek, has written a column for The Washington Post since 1977. His column generally appears on Wednesdays.
© 2004 The Washington Post Company
The Fraudulent Four
Historians remind me of French foreign minister Charles Maurice de Talleyrand's meditation on the Bourbon kings: "They have learned nothing and forgotten nothing." Stephen E. Ambrose was a thief. Doris Kearns Goodwin is a thief. Michael Bellesiles is a thief. Joseph Ellis, while not convicted of plagiarism (yet), is a liar. David Greenberg styles these historians, "The Fraudulent Four." If this is (fair & balanced) flimflam x 4, so be it.
[x History News Network]
The Lessons of the History Scandals
By David Greenberg
In 2002, I stumbled across an act of plagiarism by the historian Stephen E. Ambrose that had gone undiscovered, or at least unmentioned, in the reams of pages then being devoted to his scholarly transgressions. In the third volume of his Nixon biography, Ambrose wrote, "Two wrongs do not make a right, not even in politics, but they do make a precedent." It was a clever aphorism—uncommonly clever, I now realize, for a man normally given to brown-bag prose. The real author was Richard Nixon's longtime pal and apologist Victor Lasky, who in his 1977 best seller It Didn't Start With Watergate had written, "Granted that two wrongs don't make a right, but in law and politics, two wrongs can make a respectable precedent."
At the time, Ambrose was under fire for numerous similar instances of using other people's words without giving credit. But I saw no point in piling on. Ambrose had been sufficiently exposed—stolen phrases were surfacing in book after book—and he wasn't budging from his defense that as a popular historian, he wasn't bound by scholarly rules. And why should he? In academia, Ambrose had become a joke for his mass production of feel-good war stories before the plagiarism, which only sealed his reputation; outside academia, he remained beloved even after the imbroglio. (I did mention the Lasky-Ambrose incident in my book Nixon's Shadow, but to make a larger point.)
Concurrent with the Ambrose scandal, historian Doris Kearns Goodwin was found to have committed similar (though fewer) acts of plagiarism, albeit unintentionally. (Contrary to popular belief, plagiarism needn't be deliberate to warrant the name.) Also that winter, Emory University began investigating charges that Michael Bellesiles, a historian on its faculty, had invented or grossly distorted data to support the controversial argument, advanced in his prize-winning Arming America: The Origins of a National Gun Culture, that guns weren't prevalent in the antebellum United States. The previous summer, Mount Holyoke historian Joseph Ellis had admitted to lying about his past to students and others, fabricating tales about having served in Vietnam.
Occurring so soon after one another, these flaps struck many commentators as related symptoms of some deeper affliction gripping the historical profession or the academy. Some saw an expression of postmodernism's dangerous relativizing of truth; others discerned a cautionary tale about the perils of writing popular history. Now come two intelligent books about these affairs that implicitly agree that the coincidence of these scandals says something about the state of the profession. Peter Charles Hoffer's Past Imperfect: Facts, Fictions, Fraud—American History From Bancroft and Parkman to Ambrose, Bellesiles, Ellis, and Goodwin and Ron Robin's Scandals and Scoundrels: Seven Cases That Shook the Academy (which omits the Goodwin case but addresses four other flaps involving nonhistorians, such as the "Sokal Hoax" and the fabrications of Nobel Peace Prize-winner Rigoberta Menchú) both try to put these events in historical perspective.
Hoffer frames the scandals as the culmination of long-brewing tensions in the historical profession. Reviewing the history of professional history, he recounts how the New Left scholars of the 1960s overthrew the so-called "consensus history" of their forebears, demolishing myths of a harmonious American past and discrediting the history-as-hero-worship on which generations were weaned. But while the New Left historians won out in academia, they never brought most lay history readers around to their viewpoint. Most of the public not only continues to regard history as a discrete, verifiable body of facts—about presidents, wars, great events, and the like—but also likes its history to portray America as having been born perfect and improving ever since, as one witticism has it. The gulf between these two conceptions of history remains: The public tends to prefer affirmative tales of political and military triumph, while scholars like skeptical, critical accounts, often focused on the slighted stories of women, African-Americans, and other minorities.
Hoffer is right to highlight this gulf between two conceptions of history. But it's not clear how that gulf produced these recent brouhahas. Sometimes Hoffer seems to fault the post-1960s historians who, he says unpersuasively, "did not have the same motivation as their predecessors for shielding established historical writers ... from criticism." At other times he talks of a "conservative backlash" eager to trash these historians. And on still other occasions he seems to endorse the facile explanation that in their eagerness to win fame, readers, and wealth, these historians fatally cut corners.
The four historians Hoffer discusses not only committed very different offenses; they were "popular" in very different ways. Goodwin won celebrity not by churning out best sellers—she has spent years on each of her books—but through her sunny punditry on PBS and NBC News. Until Ellis snagged a Pulitzer Prize with his best-selling Founding Brothers in 2000, his work, although elegantly written and published by trade presses, hardly resembled pop history. Bellesiles, despite the critical acclaim initially afforded to Arming America, never attained superstardom within the profession or substantial recognition outside it. Only Ambrose might be fairly accused of jettisoning standards to sell books, but the discovery, by Forbes's Mark Lewis, that Ambrose's plagiarism habit began way back in 1964 with his Ph.D. dissertation, published by Louisiana State University Press, suggests his motives were far more complex. Besides, scores of historians, inside and outside the academy, succeed every year in writing history that finds general readers without sacrificing scholarly rigor. Hoffer's popular/scholarly dichotomy is too simplistic.
More unfortunate, Hoffer turns censorious toward the end of his book, deploring what he rightly describes as an "auto-da-fé, complete with stake and faggots" perpetrated by opinion-mongers in the media. As a former member of the American Historical Association's Professional Division, Hoffer is understandably peeved that the organization chose to stop its practice of adjudicating charges like those leveled at the Fraudulent Four. But in a book premised on the idea that this quartet of concurrent scandals stemmed from causes deeper than individual character, his solution—rebukes doled out by a professional body—seems naive. It was wise for the AHA to remove itself from the impossible business of resolving these disputes about culpability and instead to try to spread awareness of what good scholarship entails.
In contrast to Hoffer's stern conclusion, Robin's Scandals and Scoundrels is refreshingly free of moralism and alarmism—a must-read for anyone now fuming that Goodwin is back on television or Ellis back on the best-seller list. Though not condoning his subjects' behavior, Robin is more analytical than judgmental, more interested in understanding the meaning of these offenses than in administering another slap to their sorry culprits. "I find," he notes, "that the debates on academic impropriety discussed in this book suggest vibrancy rather than trauma." They demonstrated, he argues, the continuous process of establishing norms for the profession.
There's no reason to believe that acts of academic impropriety are any more common today than they used to be. What changed is the adjudication of wrongdoing, a task that the popular media appropriated from academia. By 2002, the popularity of Ambrose, Bellesiles, Ellis, and Goodwin had placed them under the watchful eye of an increasingly scandal-obsessed and intolerant media. These authors' "popularity" is relevant not because writing for the public somehow encourages shoddiness—it doesn't—but because their prominence allowed reporters and pundits to inflate their acts of wrongdoing into national scandals.
Arbiters in the media rushed in to enforce norms of behavior when they believed that academics were becoming lax. But where scholars tend to resolve disputes through careful, drawn-out deliberation, the media incline toward sensationalism and black-and-white verdicts. Moreover, in the last decade many Americans, including journalists, have adopted a primitive zero-tolerance moralism—a punitive code that encourages the trying of minors as adults, three-strikes-you're-out sentencing, the Borking of Cabinet nominees for minor mistakes, the regular-as-clockwork feeding frenzies in presidential campaigns, and the impeachment of a president for lying about sex. And they have relished the schadenfreude of the downfall of a famous historian, politician, or other celebrity.
For all the media hysteria that standards had fallen, it should be noted that Bellesiles was stripped of his job, Ellis suspended for a year, and Goodwin bounced from the PBS NewsHour and the Pulitzer Prize board. These were all perfectly appropriate punishments. Ambrose, as an author who simply didn't care about his scholarly reputation anymore and who could get paid handsomely for cookie-cutter best sellers, seemed distressingly beyond penalty. But, a lifelong smoker who had testified in court on behalf of big tobacco, he died of lung cancer in October 2002.
sidebar
The point was that Nixon's apologists, in unflaggingly trumpeting the case for Nixon's innocence, contributed to a dynamic that helped Nixon: If they persuaded seemingly neutral arbiters such as Ambrose of their positions, those arbiters would then be invoked by the apologists to claim victory. So, for example, Rabbi Baruch Korff, one of Nixon's more colorful defenders from his Watergate days, wrote his memoir in 1995, citing the line, "Two wrongs do not make a right … but they do make a precedent." Korff was able to credit the aphorism to Nixon's authoritative "biographer Stephen Ambrose," rather than to his old crony Lasky. Korff used Ambrose's standing to back up his own belief that "the justification for defending Nixon is all the stronger given … what we now know other presidents did." Needless to say, Ambrose hadn't convinced Korff of that belief; he had held it all along and was simply using Ambrose to enhance the credibility of his statement.
sidebar
"Consensus history," a term coined by the late Johns Hopkins historian John Higham, is commonly used by professional historians to refer to a view of American history that was dominant in the 1950s. Unlike the Progressive historians, such as Charles and Mary Beard, who preceded them, or the New Left historians who followed them, consensus historians saw in the American past more unity than conflict. Willing to posit distinctive national traits, they accepted notions of American exceptionalism and an American character. Some consensus historians, such as Daniel Boorstin, celebrated this unity, while others, such as Richard Hofstadter, lamented it. But they generally agreed on the possibility of writing master narratives about a unitary American people, focused on familiar highlights such as the American Revolution and the Civil War.
sidebar
Hoffer agrees with a distinction I made in Slate in March 2002 that Goodwin's behavior—particularly her response to the revelations—was more honorable than Ambrose's. He also recognizes that Bellesiles's wrongs were of a different order of dishonesty than any of the others', and that Ellis's were the mildest. Yet his framework of "popular" versus "scholarly" history unwisely forces all of them into the same category of unscrupulous historians on the make and in search of stardom, thereby neglecting distinctions he otherwise duly notes.
David Greenberg is the author of Nixon's Shadow: The History of an Image (2003). He teaches history at Rutgers University and writes the History Lesson column for Slate, where this article first appeared.
This piece first ran in Slate and is reprinted with permission of the author.
Copyright © 2004 David Greenberg