Bruce Bawer has written a brilliant essay exploring our post-WWII popular culture. The 1950s and "The Sixties" were the times of my life. If this is (fair & balanced) admiration, so be it.
[x Wilson Quarterly]
The Other Sixties
by Bruce Bawer
Two decades, the 1950s (1950–59) and “The Sixties” (ca. 1965–74), continue to be the touchstones by which American liberals and conservatives define themselves. To those on the right, the 1950s were the last good time, an era of sanity and maturity, order and discipline, of adults behaving like adults and children knowing their place. To those on the left, the 1950s were a time of fatuous complacency, mindless materialism, and stultifying conformism—not to mention racism, sexism, and other ugly prejudices. By contrast, “The Sixties,” for conservatives, were an explosion of puerile irresponsibility and fashionable rebellion, the wellspring of today’s ubiquitous identity politics, debased high culture, sexual permissiveness, and censorious political correctness. For liberals, the period was a desperately needed corrective that drew attention to America’s injustices and started us down the road toward greater fairness and equality for all.
Of course, we know all this. But what do we know about the early 1960s, the years between those touchstone decades? Well, we know that they saw perhaps the most dangerous incident in the history of American foreign policy, the Cuban Missile Crisis, and perhaps the most stirring moment in the nation’s long domestic racial conflict, Martin Luther King, Jr.’s “I Have a Dream” speech. These events, recounted in numerous books and movies, have become the stuff of American legend, though their social and cultural contexts have too often been given short shrift. Indeed, the period itself has too often been lost in the shuffle, viewed as merely transitional (the lingering twilight of the Eisenhower era, the predawn of the Age of Aquarius), and largely overshadowed by the legend of the man who presided over it, John F. Kennedy. So enthralled, or benumbed, have later generations been by the endlessly repeated anecdotes about Kennedy, his family, his women, and his administration’s crises that they have failed to look closely at the era itself.
Which is not to deny that Kennedy gave the period a focus and a tone. “Let the word go forth from this time and place,” he said in his inaugural address, “that the torch has been passed to a new generation of Americans, born in this century, tempered by war, disciplined by a hard and bitter peace, proud of our ancient heritage.” Those few words, as it happens, did a good job of reflecting not only the thinking of the president and his men, but also the temper of the time that had just begun, a period at once aware of its newness, restless for change, and respectful of its past, its roots, its traditions. In this sense, it differed markedly from the periods that bookended it. Preceded by an era that was to a large extent passively conservative, and followed by a divisive epoch in which a radical-left groundswell provoked a strong conservative reaction, the early 1960s were something else entirely—a time dominated, to an extent almost unimaginable today, by reform-minded, bipartisan, consensus liberalism. The years were that liberalism’s last hurrah.
To read through the bound volumes of the newsmagazines Time and Newsweek, issue by issue, from the late ’50s onward, is to be struck, sometime around the beginning of the 1960s, by the sudden proliferation of the word new. Society was newly open, popular culture newly experimental, religious institutions (in the words of one contemporary observer) “newly irenic.” There was even talk among Vatican II-influenced, reform-minded Catholics of a “New Church.” A new national order was under construction: After three centuries, it appeared that America was at last beginning to confront its racial divisions and inequities and move toward greater unity and fairness. And there was a new world order, or at least a “New Europe,” as headlines of the day frequently put it. Where formerly there had been a continent made up of countries that had warred with one another for centuries, there was suddenly a Common Market that seemed headed toward that miracle of miracles, unified sovereignty.
There was a New English Bible, its language condemned as barbarous by none other than T. S. Eliot (who would die in 1965). And there was a new, disorienting way of mapping out the country: In August 1963, an unbylined writer in The New Yorker’s “Talk of the Town” column confessed that “for the past several weeks, we have been trying to come to terms with the Post Office’s new address-by-number system, called, with somewhat unnerving cajolery, the Zip Code.” (Alas, that “unnerving cajolery” was the language of the future.) There was even something called the “new math,” one of many educational innovations rooted largely in a fixation on besting the Russians.
Newsweek carried a regular full-page feature called “New Products and Processes,” which heralded a Brave New World of, among much else, small record players (from Toshiba, one of several Japanese companies that were beginning to reverse the 1950s equation of “Made in Japan” with cheapness and shoddiness), removable car seats for children, aluminum (not tin) cans, overhead projectors, a $12,000 videotape recorder “primarily for use in offices, factories, and hospitals,” and the IBM 1440 “Flexible Finder,” a marvelous small-business computer that
stores information in interchangeable plastic packs. Each pack weighs about 10 pounds and holds six magnetic memory disks containing a total of 3 billion characters of information. In a matter of seconds, the disk pack is placed on a drive spindle and the computer unit is ready to operate, speedily searching the memory disk for the data needed to perform its assigned chore. . . . The 1440 will rent at $1,500 to $6,000 a month . . . [and] sell for $90,000 to $315,000.
Time’s Man of the Year for 1960 was, for the first time, not one individual but a group of individuals—“15 brilliant Americans, exemplars of the scientists who are remaking man’s world.” A new heaven and a new earth seemed within reach.
On the gender front, things were changing fast, and increasing numbers of women were working in traditionally male jobs. (“Today’s career woman,” noted one disapproving commentator, “is becoming the equal of men.”) In 1962, the closest thing America had to a popular feminist tract was Helen Gurley Brown’s Sex and the Single Girl, which took for granted that every young woman’s dearest wish was “a rich, full life of dating.” In 1963, a very different book about sex and single girls was published. “One gets the impression that this is how Ernest Hemingway would have written had he gone to Vassar,” quipped television talk show host Jack Paar about Mary McCarthy’s novel The Group, which was viewed at the time as a daringly frank depiction of women’s intimate lives. Many considered McCarthy’s book a harbinger of new ways of thinking and writing about the lives of women. A few months earlier, Betty Friedan’s The Feminine Mystique had appeared; it would be followed by other manifestoes from more radical feminists.
A spirit of synthesis and unity reigned on many fronts. If the Common Market promised to erase ancient national divisions in Europe, the leaders of the mainstream Protestant denominations of America spoke ambitiously of uniting their churches within the next few years—a movement heralded in a Time cover story, “The Ecumenical Century.” (The movement, like many other hopeful developments, would peter out and die ignominiously amid the divisions of “The Sixties.”) Meanwhile, Catholic Americans, whose church, as reported in an Atlantic Monthly supplement, was coming “out of the catacombs,” were heeding the urgings of their pope, John XXIII, to embrace Jews and Protestants as their brothers and sisters. One of the first signs of this new thinking came in December 1960, when the pope and the archbishop of Canterbury met at the Vatican, the first such meeting in the history of their two churches. “With increasing frequency,” noted Time in 1962, “Catholic theologians are being asked to speak to Protestant groups, and Protestants to Catholics.” When the Second Vatican Council was convened in 1962, non-Catholic religious leaders were stunned to find themselves not just invited as observers but given access to sensitive documents of the sort the Curia would once have classified top secret.
To many Americans, the country—indeed, the world—seemed, in an astonishing number of respects, on the verge of becoming One. In 1961, James McCord, president of Princeton Theological Seminary, hailed a new “Age of Syncretism” and “the dawn of universal history,” and he wasn’t talking only about religion. Nor was America’s leading Catholic theologian, John Courtney Murray, speaking just of his own church when, on the eve of the Kennedy presidency, he hailed the beginning of a “new era in the United States.” Upon the death of John XXIII, in June 1963, a New Yorker eulogist commented that “few successors of St. Peter have labored as hard as he to achieve the injunction of Christ, ‘May they be one.’”
Such was the spirit of the times, which would not long survive Pope John. It was, in fact, a spirit with which most Americans did not actually concur, though this fact was, at the time, easy to ignore, at least if you were a member of the Northeastern establishment. The decisive defeat of Barry Goldwater in the 1964 presidential election certainly suggested to many observers that conservatism as a force in American politics was dead. Even liberals who realized how conservative the country actually was tended to take for granted that persuasion and education would change that state of affairs over time. Or else they simply assumed that conservatives would continue to keep their mouths shut.
In the 1940s, America won a colossal war against fascism; in the 1950s, it achieved a colossal prosperity. In terms of material wealth, postwar American life was like nothing else on earth—a thing of wonder, the realization of millennia of human hopes and dreams. As America entered the 1960s, there was a widespread sense that the nation had an opportunity, at last, to do something with that prosperity.
In part, this simply meant that Americans had the freedom to relax and enjoy, to loosen up a bit and ease certain restraints and disciplines. (Young Americans of the 1960s, who had known only security and prosperity, tended to have a view of life and the world very different from that of their parents, who had grown up during the Depression and World War II.) But it also meant that Americans at last had the luxury to do some hard thinking, to face up to social wrongs, and to be a bit more generous, perhaps, with the less fortunate among them. After a decade of fixation on their economic success, Americans began to pay serious attention to the indigence in the world’s richest land. “For a long time now,” wrote Dwight Macdonald in 1963, in a New Yorker review of Michael Harrington’s The Other America that was itself almost as long as a book, “almost everybody has assumed that . . . mass poverty no longer exists in this country.” Using statistics drawn largely from Harrington’s book, Macdonald demonstrated that “everybody” was wrong.
Facing up to the reality of poverty was one thing; knowing what to do about it was another. “The problem,” wrote Macdonald in his review, “is obvious: the persistence of mass poverty in a prosperous country. The solution is also obvious: to provide, out of taxes, the kind of subsidies . . . that would raise incomes above the poverty level, so that every citizen could feel he is indeed such.” In the Lyndon Johnson years and afterward, of course, it would become increasingly clear that the solution was not at all obvious. If many Americans were essentially in agreement on what their country’s major social challenges were, they were hardly in agreement on what to do about them. And it was disagreement over the best way to address the challenges that would give rise to the ideological rifts of “The Sixties.” In the early 1960s, however, these divisions lay in the future, and the solutions to many of American society’s most formidable problems did indeed seem obvious.
It was in the early 1960s that many Americans first heard about air and water pollution, about urban blight and suburban sprawl. In the course of a couple of weeks in 1962, Newsweek told its readers about the “population explosion” and its dire consequences (“Too Many Babies?” was the question on the cover), and about the grim message of Rachel Carson’s new book, Silent Spring: “DDT, parathion, and malathion spray have a somber lining.” Moreover, after a long silence, young Americans were beginning to speak up. “Last year they went boom,” wrote Time in 1961 about college students. Teenagers and twentysomethings were at last openly political, picketing Woolworth’s lunch counters to protest segregation, rallying against the House Un-American Activities Committee and ROTC. There was an unusual degree of high school and undergraduate participation even in the fledgling right-wing movement of Barry Goldwater. The memory of the era’s earnest, low-key student politics would fade fast amid the campus riots and sit-ins of “The Sixties.”
Today, the early 1960s seem remote: men wearing ties and neatly pressed suits on all occasions, working women of every age identifying themselves as “career girls,” black people still largely “in their place,” gay people firmly closeted. Yet in these times of ours—when both hard-hitting social and political satire and genuine flag-waving patriotism are simultaneously in style, when Robbie Williams is turning the Frank Sinatra and Sammy Davis, Jr. tunes of the early 1960s into hits all over again, when Hollywood remakes the Rat Pack’s Ocean’s Eleven, when Richard Rodgers’ 1962 musical "No Strings" has returned to the New York stage, and when more and more Americans appear worn out by the ideological wars of recent decades and newly eager for a sensible centrist consensus—the early 1960s appear far more accessible and attractive than either the gray decade that Robert Lowell called “the tranquillized Fifties” or “The Sixties” of LSD, hardhats, Janis Joplin, and Archie Bunker.
To be sure, the attractiveness of the early 1960s is bound up to a considerable extent with the period’s naiveté, its innocence as to the moral and strategic complexities of the projects it was undertaking so eagerly. In the end, the apparent liberal consensus would prove largely illusory, and therefore temporary. What united America during those years, to the extent that it was united, was not an elaborately articulated ideology but a broadly shared set of good intentions. Only later would these intentions be overwhelmed and undermined, as multitudinous prejudices, resentments, and differences in values, beliefs, and priorities came to the fore; only later would the more fractious and extreme elements of society, at both ends of the political spectrum, find their voices and gain a semblance of legitimacy.
Though the naiveté of the early 1960s is not something to which we should wish to return, much about the times remains highly appealing. The period seems in many ways to represent a congenial balance between highbrow and middlebrow, between seriousness and frivolity, and between ideas and values that we now associate with the political Left and Right. Those years were America’s liberal moment, and a pivotal point in American history. They were the gestation period of the postmodern era in which we now live.
One way to get a handle on the period is to look at some of its more representative cultural figures—the men and women who reflected its style, tone, and preoccupations. Among those figures were two talk-show hosts, David Susskind and Jack Paar. American TV in the 1950s had combined vaudeville-style variety (Texaco Star Theater), comfortable sitcoms (Father Knows Best, The Donna Reed Show), and solemn middlebrow drama (Playhouse 90). In “The Sixties,” TV would offer a mishmash of retrograde fare that sought to ignore entirely the new currents in American society (Here’s Lucy) and shows that strove—some more successfully than others—to be “with it” (Laugh-In, All in the Family). In between came an era in which TV, with surprising frequency, reached impressive levels of sophistication. What distinguished this period was the quality of the talk. The twin peaks were Susskind’s Open End and Paar’s Tonight Show. Both struck a knowing balance between seriousness and irreverence that was at once characteristic of the period and unlike anything Americans had seen before.
Open End, which first aired in 1958, would last a long time; under the title The David Susskind Show, it ran until 1987. But it was as Open End, during the early 1960s, that the show had by far its greatest impact. The title referred to the program’s indeterminate running time: Susskind and his guests kept on talking, sometimes for hours, until it was felt that what needed to be said had been said.
Open End was not about wall-to-wall irony, as David Letterman’s show is today, or about Oprah-style self-realization. It was about ideas and about the art of conversation itself. The guests, unlike Oprah’s and Letterman’s, weren’t there because they had something to promote; they were there because they had something to say. Admittedly, the guest lists included the usual high-profile entertainers, but what Open End became known for was substance. Susskind’s interlocutors included authors, artists, scientists, and political leaders; he went one-on-one with Harry S. Truman and Nikita Khrushchev. Susskind was not, by nature, a celebrity-ego massager but a restless intellect, probing, challenging, often obnoxious. His show’s popularity is testimony to the high level of seriousness that a great deal of the viewing public was willing to tolerate, and even embrace, during the early 1960s. It was a time when American mass taste may well have been more sophisticated than it has ever been.
Then there was Jack Paar, the closest thing Susskind had to a late-night equivalent. Paar took over the Tonight Show in 1957 from one funnyman (Steve Allen) and passed it on in 1962 to another (Johnny Carson). But though Paar, too, was amusing, he was an altogether different sort of host—and humorist—from Allen or Carson. Engaged, passionate, unabashedly neurotic and oversensitive, Paar had a wonderful sense of humor and brilliant comic timing. He did his share of skits and jokes, but he was far better known for his epigrammatic quips about political figures and events. Like Susskind, Paar interviewed politicians and made little secret of his own leanings. He broadcast from the Berlin Wall as it was being constructed; he interviewed John Kennedy and Richard Nixon when they were presidential candidates; he did not hide his disdain for the Cuban dictator Fulgencio Batista or his support for Fidel Castro, who was also one of his interview subjects (and who had not yet identified himself as a communist).
Paar represented a distinct, even radical departure from mainstream 1950s entertainment, but he was not a man of “The Sixties.” It was on his show (not Ed Sullivan’s, as legend has it) that the Beatles made their American TV debut, on film, singing “She Loves You” in January 1964—though as Paar has always freely admitted, he showed them not because he liked their music but because he thought they were “a joke,” a silly fad. What he was laughing at, of course, without realizing it, was the era to come, which the Beatles would personify, then and forever, and which would soon relegate the tastes and values of Paar’s heyday to the dustbin of history. It’s remarkable to realize how quickly Paar went from being a pivotal figure of the Zeitgeist to being a relic for whom the post-fame years would always seem somewhat out of joint. (Decades later, Paar described himself as offended and embarrassed by sexual situations on the relatively innocuous TV comedy Mad about You.)
Paar embodied a key aspect of American culture of the early 1960s: its awareness that one could be a thoroughly mature, successful, and socially responsible member of what later in the decade would be derided as “the Establishment” and still be irreverent, funny, and even silly. It was as if Americans shared, more than they ever had before (or have since), an unspoken understanding that thought, wit, and culture were not burdens but were, rather, among the pleasures and privileges afforded a free and affluent people.
Nor was the easy sophistication of early-1960s popular culture limited to talk shows. The 1950s had been a decade of sometimes mindless conformism; “The Sixties” would be an era of rebellion and reaction, much of it also mindless. The early 1960s, at their heart, were neither. They were a time of serious questioning, when political ideas, social conventions, and cultural values underwent vigorous, searching, and cogent examination. Suddenly, America was less afraid of dissent, and throughout mainstream culture, intelligent, respectful disagreement was coming to be seen as a good thing. Even the period’s lighter theatrical fare tended to challenge 1950s-style conformity, asserting the value of fun, frivolity, irreverence. One thinks, for example, of such plays as Jean Kerr’s "Mary, Mary" (1961), a frothy comedy about divorce and taxes (Kerr was the emblematic playwright of the era)—and of such musicals as "Little Me" and "A Funny Thing Happened on the Way to the Forum" (1962). There was a similar spirit of irreverence in Herb Gardner’s "A Thousand Clowns" (1962), about a fellow who liberates himself from his stultifying career as a TV writer, and Neil Simon’s "Barefoot in the Park" (1963), about an adventuresome young bride who chafes at her husband’s sudden sobriety.
The dramatic situations these and other distinctive entertainments of the period presented, on both stage and screen, and the sexual humor they contained, were thought daring at the time; within a couple of years they would be considered embarrassingly passé. Take, for example, such Doris Day movies as Lover Come Back (1961), a sex farce with Rock Hudson, and That Touch of Mink (1962), in which Day keeps frustrating Cary Grant’s attempts to bed her. Or take, for that matter, Julie Andrews in Mary Poppins (1964), which contains a good deal of suggestive and genuinely funny humor that’s intended to go over the heads of small children. Day and Andrews, the two big female stars of the era, were savvy, sexy, sophisticated actresses who knew their way around a double entendre; yet both saw their stock plummet—and their images become twisted and ridiculed—as the early 1960s gave way to “The Sixties.” Andrews followed The Sound of Music (1965), the greatest box-office hit since Gone with the Wind, with more than a decade of flops.
One of the defining public figures of the early 1960s was a pretty young actress who was the ingenue of the age. Her movies included The Chapman Report (1962), inspired by the Kinsey report; the saucy western Cat Ballou (1965); and a series of artsy, libidinous, and awful French movies, culminating in the ridiculous Barbarella (1968), directed by her then husband, Roger Vadim. But her predominant image in America at the time was that of the Doris Day-like “good girl” or pretty young housewife. To watch her now in such light romantic comedies as Tall Story (1960), Barefoot in the Park (1967), and, above all, Sunday in New York (1963), a defining movie of the period—a then chancy, now innocuous story about a 22-year-old woman tired of her virginity, with an oh-so-hip jazz score by Peter Nero (remember him?)—is to be astonished anew at the difference between the Jane Fonda of those days and the “Hanoi Jane” who posed on an anti-aircraft gun in North Vietnam and won an Oscar for playing a hooker in Klute (1971). (Perhaps the breathtaking change of image should come as no surprise when the woman who carried it off also managed subsequently to trade in her aging-radical husband Tom Hayden for plutocrat Ted Turner and transform herself from anticapitalist icon into queen of the workout-video industry.)
Or look at the early-1960s TV shows that were known to Time magazine writers as sitchcoms but that came to be called sitcoms. By comparison with the depiction of domesticity in such 1950s staples as The Donna Reed Show and Ozzie and Harriet, the portrait of family life on The Dick Van Dyke Show, the emblematic sitcom of the early 1960s, seemed staggering in its sheer smartness and casual elegance. Mary Tyler Moore, as Laura Petrie, revolutionized the pop-culture depiction of the American housewife simply by wearing capri pants and not spending all her time in the kitchen. (In a striking reversal of ironclad 1950s practice, Rob Petrie was occasionally shown preparing meals or mixing drinks.) The series at least touched glancingly on—though it did not quite topple—many of the social barriers that 1950s TV hadn’t dared approach: Rob’s colleague Sally Rogers was a single professional woman; his colleague Buddy Sorrell’s bar mitzvah was the centerpiece of one episode, which showed Jewishness not as a phenomenon of the immigrant ghetto (as in the 1950s series The Goldbergs) but as a part of mainstream American culture. And the cast of yet another episode included, if only briefly, a middle-class black couple who embodied none of the inane stereotypes to which black people had been bound theretofore in TV and movies (up to and including Eddie “Rochester” Anderson’s shuffling servant in the 1950s’ Jack Benny Show).
Yet still the show, not the message, remained the thing. Early-1960s TV comedy never approached the explicitly political content of such standard “Sixties” programs (many of them actually of the early 1970s) as All in the Family and Chico and the Man. Which helps to explain why The Dick Van Dyke Show can seem less dated today than does, say, either Ozzie and Harriet, with its period-bound social conventions, or All in the Family, with its stream of up-to-the-minute political references.
And then there were the early-1960s comedians. The joke-telling style established during the vaudeville era had continued to define American standup comedy through the 1950s. In the early 1960s, that changed very quickly. Jack Paar, in a late-1990s interview, remembered the period as a lost “golden age” of “sophistication” and “wit,” when “there were people like Bob Newhart, and Carol Burnett, and Mike and Elaine.”
“Mike and Elaine” were Mike Nichols and Elaine May. As the PBS series American Masters later put it, together they “revolutionized the landscape of American comedy,” changing “our expectations of comedy, and our sense of humor.” They worked with tools that would be the staples of early-1960s comedy: improvisation, low-key wit, and a sharp satirical perspective on Establishment institutions. Their brief and very high-profile joint career peaked when An Evening with Mike Nichols and Elaine May opened on Broadway in 1960. Both would go on to film careers, Nichols mainly as a director, May chiefly as a screenwriter.
Nichols and May weren’t alone in reshaping American comedy during these years. In addition to Bob Newhart, there were Mort Sahl, Shelley Berman, and Woody Allen. (Allen was described in an August 1962 issue of Newsweek as having “the nervous delivery of Mort Sahl and the puny physique of Wally Cox,” but “material closer to that of essayists S. J. Perelman and Robert Benchley.”) All these comedians, venturing far from the familiar territory of broad gags and pratfalls as practiced by Milton Berle, Lucille Ball, and other stars of the 1950s, served up low-key, sophisticated monologues (or, in the case of Nichols and May, improvisational dialogues) that functioned not only as entertainment but as social criticism, and that were funny in highly original ways.
If American pop culture in “The Sixties” would be shaped largely by the Beatles, American pop culture in the early 1960s owed some of its distinctive flavor to another British foursome: the comedy troupe Beyond the Fringe, who arrived on Broadway in 1962. Newsweek hailed Peter Cook, Jonathan Miller, Alan Bennett, and Dudley Moore for their “unrelentingly satirical attitude toward the sacred and the profane. . . . The four Fringemen are as . . . in tune with their times as Mike Nichols and Elaine May.” In 1950s America, middle-class values often seemed to be sacrosanct, while in “The Sixties” they would be dismissed condescendingly by some Americans and defended fiercely by others. In the early 1960s, Americans still respected these values but responded open-mindedly, even enthusiastically, to irreverent humor at their own expense. That balance seems to me just about right.
Admittedly, there was a broad insipid strain to the pop culture of the early 1960s. The highest-rated TV show of the period was, after all, The Beverly Hillbillies. There plainly existed (to borrow a term Richard Nixon would popularize a few years later) a “silent majority” with little regard for sophisticated humor.
The Hillbillies notwithstanding, however, the early 1960s seemed a golden, or at least silver, era of high culture. “Young people,” reported Time in July 1960, “are reading more and better books than ever before.” Two months later, the magazine enthused: “The book business is booming, classical records are selling by the stack, and art galleries are thriving.”
More than anyone else, Leonard Bernstein personified this flourishing high culture. Though identified in the public mind largely with his stage hit West Side Story (1957), Bernstein, who had been appointed musical director of the New York Philharmonic in 1958, became a famous face to middle Americans in the early 1960s when he used his Broadway fame to help promote classical music. His target audience included not only middle-class adults but their children as well, and his “Young People’s Concerts,” broadcast on TV in the early 1960s to extraordinary acclaim, had an impact one could hardly imagine nowadays, let alone duplicate. Watching those programs today, one remains immensely impressed by Bernstein’s first-rate teaching skills, his refusal to talk down to children, and his obvious dedication to the cause of educating young people about music. (His prominence and his widely recognized busyness were reflected in a 1963 New Yorker cartoon in which a woman, watching TV with her husband, asks him: “Do you suppose Leonard Bernstein is trying to cover up some lack?”)
Inspired by an earnest optimism, Bernstein sought to transform the world both culturally and politically, to spread to the multitudes a love of high culture and, along with it, a more liberal sensibility. In this, he was a true man of the early 1960s. Yet as the times changed and the early 1960s shaded into “The Sixties,” Bernstein, like many other earnest liberals, would find himself dazed and confused in the strange new moral territory the country had entered. His reflexive empathy for the downtrodden served him well in the early 1960s, but when he applied it later in the decade to phenomena such as the Black Panthers, he came off as naive and injudicious. That, of course, would be the thrust of Tom Wolfe’s Radical Chic (1970), which described in painful detail Bernstein’s eager courting of the Panthers at a 1970 soirée in his Park Avenue penthouse.
Wolfe’s unforgettable portrait of that evening captures High Sixties limousine liberalism at its most absurd. But Wolfe does not stress sufficiently that Bernstein, by 1969, was simply a man out of his time. He had intelligently and honorably negotiated early 1960s America, but he lost his way in the more complex political landscape of “The Sixties.” The man who in the early 1960s had embodied his nation’s highest cultural and social aspirations failed to respond sensibly to the era’s new challenges. In Wolfe’s book, he comes off as nothing less than a fool. In his eagerness to move with the times, Bernstein neglected to draw responsible distinctions and made himself irrelevant.
Even food changed in the early 1960s. For the most part, the American diet through the 1950s was tame and bland, its most representative dish being meatloaf and mashed potatoes. Then along came Julia Child, who started a culinary revolution with her first book, Mastering the Art of French Cooking (1961), and who domesticated and demystified French cuisine with her easygoing, playful manner. Thanks to her, millions of Americans grew more adventurous in their eating habits. Those new habits were part of a broad pattern of changes in the American way of life, not just in diet but in clothing and décor. Prosperity allowed Americans to travel abroad, and Western Europe, which during the 1950s had still been living in the shadow of World War II, grew increasingly forward-looking. Ablaze with culture, it was newly attractive to newly flush Americans, who visited in record numbers.
In a short time, the United States took on a more cosmopolitan cast. This development, as has often been noted, was influenced by Francophile first lady Jacqueline Kennedy. But it was the spirit of the times that made the difference. In the 1950s, many Americans would have regarded such phenomena as French cuisine and designer dresses as unassimilably alien. In “The Sixties,” the mentality of the New Left, whose Establishment-defying casual wear forever changed American dressing habits, would condemn haute couture, haute cuisine, and anything else haute as irredeemably classist and counterrevolutionary. But in the early 1960s, there was a thaw; coq au vin and Givenchy got a foothold in American culture and lost something of their strangeness. JFK, too, played a part in setting fashion. In The New Yorker for November 30, 1963, the first issue of that magazine to appear after the assassination, the memorial article ended with the observation that “when we think of him, he is without a hat.” Ever since, it has been difficult to picture any of our chief executives with a hat.
The period’s defining work of fiction was Harper Lee’s To Kill a Mockingbird. Published in July 1960, and faithfully adapted as a 1962 movie starring Gregory Peck, the novel told the story of two white Alabama children and their father, a lawyer who quietly and bravely stands up against prejudice, ignorance, and backwardness. Though set in the 1930s, it was an emblematic story of the early 1960s—of America’s own awakening from a kind of childhood innocence into the full moral truth about itself and its past.
To be sure, the novel’s earnest liberalism, so lavishly admired at the time, would come in for some vicious criticism by the end of the decade, and its racial politics, which had been thought enlightened, would be dismissed by some as offensively paternalistic. Yet the novel has endured in the schoolroom (despite occasional ignorant efforts to ban it on account of its politically incorrect period dialogue), where it continues to serve as a model of skillful storytelling and a useful springboard for the discussion of moral values and social issues. The book’s signal quality is its simple decency. Indeed, it does not seem too outrageous an exaggeration to say that simple decency was a hallmark of the early 1960s. Racial questions still seemed relatively simple; the bitter, polarizing ideological divisions that would open up in “The Sixties,” and that persist in American politics to this day, lay in the future. On important issues, the leading politicians in both parties, as well as the most respected Establishment figures, were essentially in agreement, sharing a broad vision of social progress allied with a firm anticommunism. There were few serious differences within the mainstream of American thought as to what the country was essentially about. Even Charles E. Coughlin, the Catholic priest who in the 1930s had been a popular radio anti-Semite, told Newsweek in 1962 that “bigotry is passé.”
This is not to suggest that everything on the civil rights front was going smoothly or predictably. The mood of the time made for the occasional odd turn of events. After To Kill a Mockingbird, with its heroic portrait of a lawyer fighting institutional racism, won Harper Lee a Pulitzer Prize, even the state legislature of Alabama (soon to be Governor George Wallace’s Alabama)—itself the very embodiment of institutional racism—felt moved to pass a resolution offering “homage and special praise to this outstanding Alabamian who has gained such prominence for herself and so much prestige for her native state.” And this in the same month, May 1961, that Freedom Riders were viciously assaulted in several Alabama cities for trying to integrate interstate buses!
In the summer of 1963, Time reported that “week by week, the U.S. civil rights movement burns more deeply in its intensity, shifts into bewildering new directions, expands fiercely in its dimensions.” Yet for all the intensity and puzzlement, most Americans of goodwill seemed to have accepted the idea that they were witnessing, if not taking an active role in, a process of social change that was essentially positive and that would in time bring greater social harmony. Clearly, the rhetoric of Dr. King and others was having an effect. (In September 1961, Time hailed what it called “integration 1961 style: peaceful compliance with the law of the land.”) There was a general understanding and acceptance, as there had not been in the 1950s, that integration was America’s future. Few imagined the difficulties ahead, let alone the urgency and ferocity that would mark political protest later in the decade.
The process of integration that was under way throughout America in the early 1960s was especially conspicuous in show business. “Until a year ago,” reported Newsweek in September 1962, “stores in Negro districts, and magazines like Ebony, were the only American marketplace for Negro mannequins. Now such girls are winning the attention of white model agencies.” Interracial romance and marriage, so recently taboo, were suddenly in the public eye on a regular basis. Pictures of mixed-race celebrity couples, such as Eartha Kitt and her husband, appeared regularly in the newsmagazines. On Broadway, the Rodgers musical "No Strings" centered on a romance between characters played by Richard Kiley and Diahann Carroll, and the fact that the romance’s interracial nature was just there, presented not as a burning political issue but as an inconsequential human detail (which was not mentioned once in the show), had a strong impact on audiences.
The most famous black person in America to be married to a white person—indeed, perhaps the most famous black person in America other than Martin Luther King, Jr.—was Sammy Davis, Jr., who was the husband of Swedish actress May Britt. Along with Frank Sinatra, Dean Martin, Joey Bishop, and Peter Lawford, Davis was a member of the “Rat Pack,” also known as “The Clan.” During the early 1960s they were the coolest thing on the continent, the very definition of hip. And the matter-of-fact inclusion of Davis among them made a powerful statement about integration. As with the unmentioned interracial affair in No Strings, the statement was all the more powerful because neither Davis nor his fellow Rat Packers were inclined to discuss or debate their racial politics. They just lived them, sometimes with real courage. The easygoing way Davis and his friends interacted on and off stage, making jokes about race rather than speeches, left many Americans feeling a lot more comfortable about the new America than they might otherwise have been.
Sinatra and his Clan were perfect symbols of the early 1960s. They were too hip for the ’50s, and too unhip—with their tuxedoes and cocktails, and their un-PC banter about booze and broads—for the dope-smoking, jeans-wearing “Sixties.” But the sheer fun of the Rat Pack looks far more appealing today than the dour New Left and Religious Right moralisms of later decades. “The Sixties” sent the Rat Pack down in flames. The Beatles landed, and in the blink of an eye Sinatra and friends seemed hokey and irrelevant, if not downright offensive. (Only a couple of years after he’d been at the top of the showbiz heap, Sinatra was pleading with radio stations to give him “equal time in Beatleland.”) As for Davis himself, his notorious, career-damaging embrace of Richard Nixon in 1972 reflected the confusions of an entertainer who, not unlike Leonard Bernstein, was very much a man of the early 1960s, a man of good intentions who responded unwisely to the “Sixties” cry of “Which side are you on?” and came off looking foolish.
In religion, liberal reform was the order of the day. Time, naming John XXIII its Man of the Year for 1962, described the Second Vatican Council as “the beginning of a revolution in Christianity.” The revolution, which stressed reconciliation and forgiveness, seemed to be occurring everywhere. While John XXIII was pointing the Catholic Church in a new direction with his encyclical Pacem in Terris, Anglican bishop John A. T. Robinson was turning his own church’s theology upside down with his bestselling book Honest to God, an assault on traditional doctrines. Morris West’s novel The Shoes of the Fisherman (1963) told of a gutsy, liberal-minded pontiff who sells off the Vatican’s treasures to feed the poor; it hardly seemed a fanciful story in those heady days.
Elmer Gantry, Sinclair Lewis’s novel about a shady tent-meeting evangelist, had caused an uproar on its publication in 1927. When Richard Brooks’s movie version was released in 1960, Time observed that “hardly anybody is complaining.” Indeed, many critics considered the movie’s topic, the hypocrisies and fire-and-brimstone excesses of Protestant fundamentalism, utterly irrelevant to 1960s America. The future of Christianity lay with the progressive ecumenism of John XXIII. Few in the mainstream press foresaw any such thing as the Religious Right, even though millions of future members of the movement were all around them, worshiping quietly, playing little or no role in national politics, and waiting only for the advance of civil rights and the implementation of Supreme Court decisions against prayer in public schools to rise up and make their power felt.
And yet, and yet. Even as all the good liberal ideas were being spread about in the early 1960s, and progressive reforms being planned and implemented, America was in the midst of a seemingly intractable nuclear standoff with the Soviet Union. To be sure, Joseph Stalin was dead, and the current Soviet premier, Nikita Khrushchev, had openly condemned some of Stalin’s more bloodthirsty acts. It appeared possible that the Soviet Union might actually reform itself to some degree. Nonetheless, the early 1960s proved to be the most dangerous period of the whole Cold War. Russia was testing the “new” America, and America was testing the “new” Russia. The result was the Cuban Missile Crisis. For several days, the two countries hovered on the brink of nuclear annihilation. And then America went back to normal. Or pretended to. (And what, in such circumstances, was “normal,” anyway?)
The civil defense craze was at its height, though at the time most Americans seem not to have regarded it as a craze at all but as a matter of commonsense preparation. In 1961, President Kennedy said that “prudent” families should have their own bomb shelters. In August of that year, Time reported that “more and more families made preparations last week to go underground.” Federal agencies issued pamphlets explaining how to build home fallout shelters, and private firms such as the Norton Atomic Shelter Corporation of Highland Park, Illinois, did a brisk business. Wham-O, the makers of the Hula Hoop and the Frisbee, put a $119 do-it-yourself shelter kit on the market. A famous map in an October 1961 issue of Time, whose cover story explained that “civil defense must be part of the normal way of life,” illustrated the potential effect of a single atomic bomb dropped on Manhattan. Concentric circles marked the areas within which various percentages of the population would be killed—instantly by the detonation or slowly by fallout. (When I saw the map recently, I recognized it at once from my baby-boom New York childhood.)
Americans lived with the knowledge that at any moment a nuclear attack might eradicate the country as they knew it and compel them, if they were still alive, to retreat with their families to a basement hideaway. Officially, the nation was at peace and living well; at the same time, it was enduring a daily trauma of colossal proportions. The largely suppressed awareness that a strange and disturbing reality lay concealed beneath society’s genial and placid surface is at the thematic heart of such deeply weird movies of the era as The Manchurian Candidate (1962), Lolita (1962), and The Birds (1963), and of the creepy TV comedies The Munsters and The Addams Family. That same awareness animates the period’s most distinctive TV series, The Twilight Zone.
Lasting for five seasons (1959–64), The Twilight Zone tapped into all those unvoiced fears and insecurities that are presumably hard-wired into the human psyche, which explains why, all these decades later, the series’ best episodes, in reruns, continue to disturb and haunt. The program spoke with particular urgency to the early 1960s Zeitgeist, especially the preoccupation with atomic war. Reading through a list of Twilight Zone storylines, one is struck by the number of times the show explicitly addressed worries about the nuclear threat. In one episode, with the nation on the brink of atomic attack, two men plan to steal a spaceship and escape the planet; in another, a group of suburban neighbors, fearing an invasion from outer space, fight over access to a bomb shelter. Several Twilight Zone episodes took place in the aftermath of nuclear war. In perhaps the most famous of them, a misanthropic bookworm is pleased to be the lone survivor of such a war because he now has all the time in the world to read; but when he sits down on the steps of the public library with a pile of books, his reading glasses fall off and break.
More often, the series, which was created, produced, introduced, and often written by Rod Serling, approached the period’s apprehensions in a more elliptical fashion. A traveler arrives in a town and wonders where the people are. A man awakens to discover that nobody knows him and that all traces of his existence have vanished. A defendant being sentenced to death gives a passionate, urgent courtroom declamation in which he insists that the courtroom and all the people in it are not real. An airline passenger sees a monster walking on the plane’s wing. Five strangers find themselves mysteriously confined in a huge cylinder. Aliens land on Earth, and the book they’ve brought along, To Serve Man, turns out to be not a humanitarian manual but a cookbook. The anxieties reflected in these storylines are relatively unambiguous. Perhaps everything is not as we think it is. Perhaps we are not who we think we are. Perhaps we are trapped in something from which there is no escape. Perhaps the fine, orderly society we think we are living in is only an illusion, concealing horrors more immense and threatening than anything we can imagine. Such was the undercurrent of early-1960s life as captured by The Twilight Zone.
It’s haunting to read chronologically through the confident newspapers and newsmagazines of the early 1960s while knowing the end of the story. The clock was winding down, and the America that people expected to continue along much the same path for years to come would soon be gone forever. Yet no one realized. “One knew in one’s bones,” observed the anonymous “Talk of the Town” columnist in The New Yorker’s issue of May 18, 1963, “that 1936 was prewar. . . . In 1963, we are surely . . . in the post-postwar period. It does not, though, have the feel of prewar days that 1936 had.”
But war was already under way. Though the conflict in Indochina was by 1963 a present reality, no one foresaw the consuming, destructive, all-transforming struggle it would become. No one foresaw the Berkeley Free Speech Movement, the Paris student revolt of May 1968, the sit-ins, the riots, the Summer of Love, Woodstock. Those events would take place in, and shape, another world.
Nor did anyone foresee the Kennedy assassination—the event that, for everyone alive at the time, was decisively transitional. In retrospect, to be sure, the transition was presaged by several other developments in 1963: the death of John XXIII on June 2, the murder of Medgar Evers on June 12, the March on Washington for civil rights on August 28. Yet November 22, 1963, was the watershed. By December, Time was noting “a mounting tide of conservatism” in politics and religion; in February 1964 the Beatles arrived in New York; 1965 would see seizures of campus buildings by college students and riots in the Los Angeles neighborhood of Watts. “Sixties” music, “Sixties” politics, “Sixties” culture took hold. And as they did so, the American consensus (or the illusion thereof) unwound, and centrist liberalism faded away, its adherents scattering to both left and right, becoming part of the nascent New Left, or of the movement that would come to be called neoconservatism, or, in some cases, just hovering between, uncertain, rudderless, alienated by the rhetoric on both sides. Americans who had marched together at Selma would be at each other’s throats, fighting over busing, food stamps, crime, affirmative action, “moral equivalence,” political correctness, prayer in the schools, abortion, homosexuality.
Though new issues occupy the front burner, that polarization endures today, and the concept of civic obligation—so central to the early 1960s—has long since been supplanted by a reflexive cynicism and a tendency to judge all public discourse by its entertainment value. Who, in the early 1960s, would have imagined that 40 years later the best-selling books on public affairs would be not earnest tracts on poverty and the environment but crude partisan rants by the likes of Michael Moore, Ann Coulter, Al Franken, and Michael Savage? Likewise, the respectably middlebrow common culture of the early 1960s is only a memory, as is the pipe dream of an America enchanted by serious literature and classical music; instead we have American mass culture, a worldwide economic powerhouse that transforms almost everything it touches. And though that mass culture is, admittedly, large and diverse—and fragmented—enough to include many bright spots, it also has staggering depths of vulgarity, is aimed (largely) at 12-year-olds, and has little regard for intelligence, seriousness, or wit. The early 1960s’ naiveté may be gone, but philistinism and ignorance thrive unashamed. In a time when many Americans appear far more eager to be coarsened than to be edified, the early 1960s look very attractive indeed.
But what’s past is past. By its very nature, that decent, earnest, innocent interlude could not last more than a moment. And though it was clear by nightfall on November 22 that an era had ended, the awareness that a new period was genuinely under way dawned, no doubt, on a different day for everyone. For one person, it may have been the day he first saw a teenage boy with shoulder-length hair; for another, the day she first smelled a strange, sickly sweet smoke coming from the back of the school bus. My own memory yields a cluster of images that must date back to the spring of 1967, when I was 10 years old. It was a warm, sunny weekend afternoon, and I was walking with my parents through Tompkins Square Park in the neighborhood of Manhattan that had long been called the Lower East Side but that would soon be known as the East Village. We had driven in from Queens to see my grandmother, a Polish immigrant who lived in the neighborhood. But my parents were curious to get a look at the flower children, whom we had heard about and seen on the news. So instead of returning to our car after our visit we walked over to the park, in which I had never before set foot. And indeed there they were, in real life, all around us, reclining on the grass—young people dressed in T-shirts and bell-bottom jeans, one or two of them playing guitars, their manner strangely casual, loose, relaxed in a way I had never seen before. And, yes, with flowers in their hair.
I didn’t know what to make of them. But their image lodged firmly in my mind, and I knew that day that the world had changed.
Bruce Bawer, an American writer who lives in Norway, is the author of A Place at the Table: The Gay Individual in American Society (1993) and Stealing Jesus: How Fundamentalism Betrays Christianity (1997). His essays and reviews have appeared in The Hudson Review, The New York Times Magazine, The New Republic, Partisan Review, and many other publications.
Copyright © Spring 2004 Wilson Quarterly
Monday, October 25, 2004
Plus ça change, plus c'est la même chose
Thick As Thieves
I learned two things from this article in the Boston Globe. The daily fishwrap in Boston is owned by the New York Times Company. No wonder the Red Sox have an inferiority complex with the Yankees. Doris Kearns Goodwin, admitted plagiarist, is still treated like royalty by Tim Russert on "Meet The Press" and Don Imus on "Imus in the Morning." Imus, in particular, spares no vituperation in describing those people he disdains: the Dickster, the Slickster, and O'Reilly. Imus calls them weasels, war criminals (in the Dickster's case), or scum. However, Imus (like Russert) frequently invites Goodwin to be a guest on a broadcast and passes her off as some sort of wit. Nitwit is more like it. Goodwin stole words that were written by another and passed them off as her own. Goodwin deserves the sort of celebrity earned by Martha Stewart. As my maternal grandmother would put it: "Doris Kearns Goodwin has more nerve than a government mule."
If this is (fair & balanced) disdain, so be it.
[x Boston Globe]
Hollow History
A former plagiarism adviser for the American Historical Association says his profession is in deep trouble. But some colleagues say his case doesn't add up.
By Matthew Price
With fat biographies of sundry Founding Fathers appearing every other month and bookstore tables still piled high with odes to the Greatest Generation, the public's appetite for the American past appears as healthy as ever. But according to University of Georgia historian Peter Charles Hoffer, we're being sold a bill of goods.
In his new book, "Past Imperfect: Facts, Fictions, Fraud -- American History from Bancroft and Parkman to Ambrose, Bellesiles, Ellis, and Goodwin" (PublicAffairs), Hoffer contends that his profession "has fallen into disarray" and aims a polemical blast at his fellow historians for condoning sloppy scholarship and an anything-goes ethical climate.
A specialist in Colonial history and American jurisprudence, Hoffer is a respected scholar whose previous work has generally earned the esteem of his peers. Now, setting himself up as judge, jury, and executioner, Hoffer puts historians in the dock -- and throws the book at them.
"American history," he writes, "is two-faced" -- split between celebratory popularizers who often value rousing narrative over scholarly rigor and academic specialists whose jargon-riddled, often dour monographs ignore the ordinary reader. Meanwhile, Hoffer accuses the American Historical Association (AHA), where he has served as an adviser on plagiarism and a member of its professional standards division, of abdicating its responsibility to enforce basic scholarly principles in both realms.
Hoffer revisits the now-familiar cases of a quartet of historians brought low by scandal in 2002: former Emory University professor Michael Bellesiles, who was accused of falsifying data in "Arming America," his controversial 2000 study of 18th- and 19th-century gun culture; Stephen Ambrose and Doris Kearns Goodwin, who were both found to have used material from other scholars without full attribution; and Mount Holyoke's Joseph Ellis, who was rebuked for spinning tales of his nonexistent Vietnam combat record in classes and newspaper articles. According to Hoffer, these were not just isolated incidents but symptoms of a wider problem -- one that goes far beyond the headlines to the very way history is written and consumed in America.
Hoffer's case is impassioned, but the final verdict will belong to his peers. Is the entire historical profession in America, as Hoffer wrote in a recent e-mail, "sailing close to the edge"? Or, as some of his colleagues are already suggesting, is Hoffer himself guilty of exaggeration and distortion?
According to Hoffer, the rot can be traced to the roots of American historiography. In the 19th century, America's founding historians thought they were merely assembling facts that were out there waiting to be found. Here were the origins of "consensus history," which stressed unity over division and trumpeted the ideals of American democracy and the march of Manifest Destiny. Driven by nationalistic impulses, scholarly pioneers like Francis Parkman and George Bancroft (both from Massachusetts, and both namesakes of prestigious history prizes) produced dramatic narratives to inspire their countrymen, such as Parkman's "The Oregon Trail" (1849) and Bancroft's 10-volume "A History of the United States" (1834-74).
America's classic historians were powerful stylists, Hoffer writes, but their history was "inherently fallacious" and grievously incomplete. These celebratory narratives "falsified our history because [they] left out or dismissed the experience of more than half of America's population: Indians, women, servants, slaves, and immigrants." In addition to distorting the record, Hoffer asserts, Parkman and his fellows were also pioneers in plagiarism, freely copying whole passages from other historians without bothering to use quotation marks.
With the rise of the modern university system in the late 19th century, the culture of gentlemanly amateurs gave way to university-trained professionals who swore allegiance to science. But scholarship improved little, Hoffer contends. Instead, it simply dressed up the old bogus myths of the past in pseudo-scientific jargon, relying on fabrication and, once again, "plagiarism," as historians repeated, "without citation and without criticism, the old self-sustaining truisms."
The old model didn't fully give way until the 1960s. Animated by Marxism and inspired by the example of British labor historians like E.P. Thompson, the "new historians" (as Hoffer dubs them) began writing history from below, focusing on the neglected story of women, minorities, and the working class, even as the profession itself began diversifying. The study of slavery and slave culture in particular flourished in the 1970s, as scholars like Eugene D. Genovese, in his landmark 1974 work "Roll, Jordan, Roll," opened up new vistas on African-American history. At the same time, the AHA undertook new initiatives to elevate the profession, creating a Professional Division in 1974 to audit and set standards of historical scholarship.
But Hoffer sees serious problems here, too. Although the new historians helped create a culture of scrupulous attribution and methodological sophistication, they also retreated behind a wall of footnotes and obscure jargon -- and brought a relentlessly gloomy perspective to the teaching of American history. "Determined to withhold . . . proofs that American history could inspire and delight," Hoffer writes, the new historians lost touch with a general audience, who wanted more than a dim recitation of American failure.
This historiographical sea change, Hoffer argues, has led to deep fractures within the historical community. Today's academics -- the heirs of the new history -- dismiss historians who write with a popular touch and disdain readers who prefer their history chock-full of heroic derring-do. Meanwhile, superstar historians (abetted by trade publishers with lax scholarly standards) churn out cheery consensus history by another name, often built on outmoded (and sometimes deeply compromised) scholarship.
It is this chasm between history's academic and popular realms, Hoffer argues, that gave rise to the Ambrose, Ellis, Goodwin, and Bellesiles sagas. Hoffer is particularly harsh on Bellesiles, who resigned from his job at Emory and was stripped of the Bancroft Prize in the wake of the controversy over "Arming America."
To his defenders, the former Emory historian was the victim of a conservative plot, spearheaded by the National Rifle Association, to discredit Bellesiles' conclusion that, contrary to the image of the musket-wielding patriot, few early Americans owned functional guns. But in Hoffer's telling, Bellesiles engaged in deliberate "falsification" of his data. Furthermore, Hoffer asserts, Bellesiles published his book with the trade publisher Knopf (which eventually withdrew the book from circulation) rather than a scholarly press "in order to claim . . . immunity from close professional scrutiny." (While an investigative panel formed by the AHA found no outright falsification, it condemned Bellesiles' evasiveness about his source records, many of which could not be traced.)
As for Goodwin and Ambrose, who are also published by trade presses, Hoffer brushes aside their claims that the instances of missing footnotes or insufficient citations were just unintentional and isolated lapses in otherwise sound work. Whatever the intention, Hoffer writes, the end result is the same: "plagiarism," which under AHA standards, he notes, does not require actual intent to deceive. (He brings greater sympathy to the case of Joseph Ellis, whose scholarship itself was not questioned, suggesting that the same imaginative powers that led him to lie about his life story may have helped him write more subtle and nuanced books.)
But Hoffer reserves his bitterest jibes for the AHA, which last year gave up adjudicating cases of scholarly misconduct, citing the limited impact its findings had on the profession. To Hoffer, this is a serious abdication of responsibility. "We must start acting like professionals, instead of making believe we're professionals," he said in a recent phone interview. "One of the hallmarks of professionalism is to discipline erring members."
Hoffer himself, however, is facing the criticism of his peers, many of whom express skepticism about his dire picture of the profession.
Alan Taylor, a professor of history at the University of California, Davis, and author of the Pulitzer Prize-winning "William Cooper's Town" (Knopf), applauds Hoffer's "courage" for his criticisms of the AHA, whose refusal to formally judge plagiarism charges (and other misdeeds) he calls "cowardly." But he scoffs at the suggestion that the entire profession is facing a grave ethical predicament.
"I think Hoffer overly dramatizes it. The sun rises and several thousand historians go to work and produce scholarship that shows integrity," Taylor said in a recent interview. "I don't know anybody who goes around the graduate program at UC-Davis and says 'Michael Bellesiles got taken down so we're in a crisis.'"
Taylor also takes issue with Hoffer's expansive definition of plagiarism. To accuse Bancroft and Parkman of "plagiarism," Taylor argues, is simply ahistorical. "Our definition of plagiarism did not apply in the 19th century. It was the custom only to cite from the original document you quoted, and not the secondary source it appeared within. It wasn't just Parkman doing it -- everybody in that generation did it." As for the definition Hoffer applies to today's scholars, Taylor says, it sweeps up some instances of legitimate paraphrase and is "sometimes so hard and fast that we'd all be guilty of it."
Eric Foner, professor of history at Columbia University and former AHA president, says historians need to do a better job of monitoring professional misconduct, but he cautions that there are practical limits to what the AHA can do about it. "We don't have the power of sanction," he explains. "We can't take away a historian's license to practice." And misdeeds, he points out, hardly go unpunished. "Bad publicity is a pretty big sanction, especially if you're a popular historian."
But more importantly, these historians take issue with Hoffer's stark depiction of today's marketplace for history. Even within the popular realm, argues Taylor, there is a diversity of scholarship. "Certainly, a lot of the public prefers prepackaged patriotism," he says. "But take Howard Zinn -- is he out of touch with the public? His book ["A People's History of the United States"] is an all-time bestseller."
Besides, some say, academic training does not necessarily lead to better work. In the phone interview, Hoffer took aim at historian David McCullough (who does not hold a Ph.D.), noting that his Pulitzer Prize-winning 2001 biography of John Adams, while enjoyable, doesn't cite any sources that are less than 25 years old -- a sign, in Hoffer's view, that "scholarship doesn't matter." But Gordon S. Wood, professor of history at Brown and author, most recently, of "The Americanization of Ben Franklin" (Penguin), dismisses this complaint.
"I don't think his scholarship was deficient," says Wood. "If you're interested in Adams' political theory, forget it, it's not here. It's really a very personal story of Adams and [his wife] Abigail. As long as you accept that, it's done superbly."
In the end, the debate may be less about footnotes and facts than about just how popular, in every sense of the word, history should be allowed to be. Laurel Thatcher Ulrich, a professor at Harvard and author of the Pulitzer Prize-winning (and commercially successful) "A Midwife's Tale" (1991), shares some of Hoffer's reservations about shoddy history on the bestseller list. Still, she urges her peers to take a less proprietary attitude toward the past.
"We need to have a little bit of humility to recognize people can do what they want to with the past," says Ulrich. "Historians do not own history."
Matthew Price is a regular contributor to the Globe.
© Copyright 2004 The New York Times Company