Friday, June 12, 2015

Today, The Saga Of Brian Williams: Tell Me A Story & You'll Tell Me A Lie

In the following essay, Professor James McWilliams offers a meditation on memory triggered by the denouement of Brian Williams earlier this year. Sniff, sniff. Do you smell the odor of burning trousers? If this is another (fair & balanced) epistemological expedition, so be it.

[x TAS]
The Examined Lie
By James McWilliams

[Tag cloud of the following essay, created at TagCrowd.com]

On March 24, 2003, Chief Warrant Officer Randy Summerlin, 31, looked out the window of his Chinook helicopter and became concerned. Several men were running across the Iraqi desert. They were whirling what appeared to be white towels over their heads to signal the impending approach of Summerlin’s three-copter convoy. Farther ahead, two figures emerged from a Nissan pickup. One held an AK-47 rifle, the other a rocket launcher.

What came next happened in a flash. “I felt a shudder in the aircraft and a big boom,” Summerlin told Stars and Stripes. A rocket-propelled grenade ripped through a cargo container the helicopter was transporting on a sling. Another tore a melon-sized hole into the aircraft’s tail housing. Two rounds from the AK-47 pierced the cabin, one hitting an electrical panel and the other nicking a soldier’s cheek. The Chinook retreated, ran into a sandstorm, and made an emergency landing many miles away, in a safer bowl of dust.

An hour or so later, a helicopter deterred by another sandstorm arrived and offered assistance. It was a run-of-the-mill detour, except for one detail: NBC News anchor Brian Williams was on board. The significance of Williams’s presence at this location wouldn’t become apparent until he publicly recalled the incident as “a terrible moment a dozen years back during the invasion of Iraq when the helicopter we were traveling in was forced down after being hit by an RPG.” He told versions of this story several times—all in public venues and all placing himself nearby or inside the downed aircraft—until January 2015, when his narrative, as well as his reputation, collapsed under scrutiny.

The ensuing public condemnation, wavering between moral outrage and mockery, was overwhelming. Williams responded by saying that he had “misremembered” the incident. The general public—at least most of it—would have none of that. He lied. Flat out. Bald faced. The weight of social media reinforced Williams’s new reputation as a scoundrel. NBC placed him on unpaid leave for six months (that’s a $5 million mistake, for those counting). Mission veterans put the final nails of judgment in Williams’s coffin. “I don’t know how he could have mistaken that,” one said.

“I don’t think it was a mistake.”

The most captivating public controversies are the ones where the response reveals more than the transgression. The “pants-on-fire” reaction by the public to Williams’s fabrication, the almost gleeful vehemence expressed on Facebook pages and across the Twittersphere, certainly confirms the seductive pleasure of catching someone red-handed. But this reaction also obscures the underlying messiness of the Big Lie. The gavel-like finality with which Williams was judged absolves us of pondering the deeper questions about how a situation like this one arises. What is the machinery of memory? How does memory concoct the stories we tell about ourselves? The failure to address these questions is unfortunate. The road to Truth might be paved with righteousness, but the precarious relationship all of us have with the past also lends false assurance to the stories that we consider to be objectively true. Condemning Williams and leaving it at that is an all-too-easy response to a much more interesting phenomenon: unintentionally misrepresenting the truth.

Autobiographical (or as psychologists call it, episodic) memory is necessarily flawed. The colloquialisms used to describe it—“etched into my brain,” “seared into my memory,” “if memory serves,” “never forget”—might emphasize its reliability. But these catchphrases capture an outmoded understanding of memory. It’s memory as an ageless photograph instead of memory as a time-sensitive dive into murky psychic territory. Psychologists who study the mysteries of memory speak with a tellingly different lexicon. Transience, misattribution, binding failure, and positive illusions—terms that point to the messiness of recollection—present memory as it really is: a necessarily flawed reconstruction of past experience rather than a carbon copy retrieved from a static cognitive archive.

If retrieving memory is a process—and recounting it a performance—then there are numerous ways its accuracy can derail. Daniel Schacter, a professor of psychology at Harvard, has spent his career researching those ways. In The Seven Sins of Memory (2001), he notes how “binding failures,” which happen when memory latches onto an inaccurate detail and deems it true, create “confusions between events we actually experience and those we only think about or imagine.” Our innate suggestibility tempts us to weave extraneous details from subsequent events—conversing with friends, absorbing miscellaneous media bytes, reading a novel—into the fabric of our original recollection. The gist remains (you know you landed in a helicopter in a desert amid a frisson of danger) but, as Schacter and others explain, the specifics can blur into impressions that in some cases disappear altogether. It’s not exactly a comforting thought, but every time we return to the incident, we take a different route to reach it and, in turn, come home with a slightly—or not so slightly—different story. The mind never remembers the same way twice.

Considerable research into the neurobiology of memory retrieval supports the idea that our recollections are inherently shaky. According to a literature review in the journal Nature Reviews Neuroscience, the molecular mechanisms underlying what scientists call memory reconsolidation—basically, the recovery of a memory that has already been encoded in the brain—highlight the presence of a “labile period” during which “the memory can be modified.” The initial consolidation of a memory depends on protein synthesis. When protein synthesis inhibitors are introduced into the brain after retrieval of the original consolidated memory, the updated memory, which has passed through the labile phase, takes a different cellular pathway. The result is an alteration of the original memory.

However roughly, the genealogy of Williams’s story suggests the behavioral and biological foundations of misremembering. In March 2003, Williams, in an NBC segment that aired two days after the incident, reported, “The Chinook ahead of us was almost blown out of the sky.” In 2005, speaking with his colleague Tim Russert, he said, “The helicopter in front of us was hit,” adding that “a guy got up, fired an RPG … and it beautifully pierced the tail rotor of the Chinook in front of us,” an observation clearly suggesting he saw the incident. Two years later, in March 2007, Williams went back to Iraq, and the Associated Press, never citing a source, wrote that Williams was “traveling with retired U.S. Army General Wayne Downing, who was with him on a previous visit, when Williams’s helicopter was forced down by insurgent fire.” In July 2007, Williams, in a televised report, referred to a moment “when the Chinook helicopters we were traveling in at the start of the Iraq war were fired on.” In a May 2008 blog post, he explicitly associated himself with the attack, writing, “The Chinook helicopter flying in front of ours … took an RPG to the rear rotor, as all four of our low-flying Chinooks took fire.” By 2013, on the "Late Show with David Letterman," he was recalling how “two of our four helicopters were hit by ground fire, including the one I was in.” In January 2015, he reported in a "Nightly News" story that “the helicopter we were traveling in was forced down after being hit by an RPG.”

Elizabeth Loftus, a cognitive psychologist at UC–Irvine, attributes this phenomenon—the changing nature of a particular memory—to “the misinformation effect.” She told me she has been lecturing about its impact on the Williams episode ever since the news broke. What particularly struck her was “how eerily familiar it was” to a 2008 story told by Hillary Clinton, who falsely remembered enduring sniper fire as she and Chelsea Clinton ran across the tarmac after their plane landed in Bosnia 12 years earlier. Loftus hypothesized a scenario whereby Williams, like Clinton, unknowingly embellished his experience. “In 2003 he reports it accurately,” she said. “Two years later he sees another helicopter get attacked”—perhaps on a news report. “Two years after that his helicopter was attacked.” Thus, with every telling, the original version of the story slowly evolves into something different. Our adherence to a black-and-white sense of right and wrong might lead us to think that Williams’s embellishments were intentionally designed to mislead us. And perhaps they were. We might even decide that he is a self-aggrandizing jerk. And perhaps he is. But Loftus offers a more charitable interpretation and concludes, “Why be so harsh when it happens so often?”

Why indeed? I had my own reasons for being interested in the Williams ordeal. The news broke after I had just experienced a case of misremembering. In an autobiographical essay for the Los Angeles Review of Books, I wrote about meeting R.E.M. lead singer Michael Stipe in 1993 at a Washington, D.C., inaugural ball. Our face-to-face conversation was an unforgettable moment (well, for me anyway)—and thankfully that much was correct. But I also wrote that after we spoke, Stipe rushed to the stage and sang a duet with Bono, the lead singer of U2. That detail was incorrect. A reader gently reminded me that though Stipe did indeed perform at the party, it was with two other members of U2, not Bono. I felt foolish. A little scared, even, about my mistake. So I took some solace when Loftus—mentioning a mix-up in a story she’d told a friend about riding in a generic taxi versus a swank private limo—admitted that this sort of innocent confusion occasionally happens to her as well. It happens to everybody. When Williams was dealing with the fallout from his misremembrance, Bill O’Reilly of Fox News was called out by Mother Jones magazine for fabricating a story about, in O’Reilly’s words, “having survived a combat situation in Argentina during the Falklands War.” Further assuaging my concern was Hillary Clinton’s perfect response to her own misremembering incident. “I had a different memory,” Clinton stated. “It proves I’m human, which for some people is a revelation.”

We’re human. We misrepresent ourselves. Often without meaning to.

If the prospect of humans routinely communicating through inaccurate, if innocent, recollections sounds like a sociobiological dystopia, it shouldn’t. In his book Why We Lie (2007), the philosopher David Livingstone Smith contends that lying is not only normal but even necessary for maintaining psychological equilibrium and social stability. He hypothesizes that “far from being a sign of emotional disturbance,” self-deception exists “at the core of our humanity.” The evolutionary impulse to deceive, Smith contends, taps an ancient disposition—“the Machiavellian unconscious”—that enables us to mislead ourselves so that we can strategically mislead others. “The task of getting on in life,” he told me, “requires self-deception because too much honesty is antisocial,” not to mention detrimental to survival. In this respect, flawed memory isn’t an indication of flawed character. Instead, as Harvard psychologist Schacter puts it, it’s the “byproduct of otherwise desirable and adaptive features of the human mind,” the kind of adaptations that enable humans to communicate through implicitly agreed upon, if often factually inaccurate, narratives. The vices of memory, Schacter writes, “can also be its virtues.”

Intriguing as the idea of the virtuous fib is, traditional Western morality has little tolerance for it. There’s a tendency, buttressed by a long history of ideas, to associate being wrong with evil and being right with righteousness. This dichotomy—which requires all truth to be right and all lies to be wrong—works well for the courtroom and the house of worship. But it fails to honor the true nature of error. As Kathryn Schulz observes in her book Being Wrong: Adventures in the Margin of Error (2010), “the whole reason it’s possible to be wrong is that, while it’s happening, you are oblivious to it.” (Unless, of course, you are deliberately lying.) You assume you are right and, under that assumption, blunder blissfully through the world like a bull in a china shop, sending truths big and small crashing to the ground. We are quick to condemn the mess that ensues. But if we stop at the condemnation, we miss something critical: the possibility of blundering into something important.

Thinking you’re right when you’re wrong might make a person appear foolish, and it might greatly annoy those who see matters in more concrete terms, but a certain amount of blundering is necessary and even desirable. If we’re ever going to hypothesize, explore, and engage experimentally with the mysteries around us, we must cultivate a habit of mind that, to some extent, fearlessly refuses to accept reality as it is. As Schulz writes, “seeing the world as it is not” is “the essence of imagination, invention, and hope.” Every major breakthrough in the history of human thought has required thinkers to risk rejecting the existing understanding of truth, grapple with doubt, and make mistakes—many of them humiliating—before finding answers. Einstein described this process as “feeling for the order lying behind the appearance.” Error, which is a prerequisite for all progress, is proof that we are wrong. But more important, it’s also proof that the spirit remains playful, that it is feeling for something it cannot see, wanting more than meets the eye.

Innocent lies happen because memory, which is inseparable from error, connects humans through the stories we concoct. Playful spirits—Pierre Teilhard de Chardin, Stephen Jay Gould, Alan Turing, Herman Melville—tend to take the responsibility of storytelling more seriously than others. They’re the ones likelier to go out on narrative limbs. Brian Williams’s mistake, for all the discussion it has generated, confirmed something essential about his character: he’s a storyteller. We’ll likely never know the extent to which his lie was intended to amplify his self-worth or, more charitably, connect him to an audience for whom he genuinely cared. But as a newsman working in a media climate that, sadly, expects anchors to entertain as they inform, Williams was obligated to be not only a journalist but also a personality. For as much mockery as he has endured, the fact remains that, in grabbing us by the lapels and saying you gotta hear this, Williams was inviting us to be moved, to witness his enthusiasm, his narrative verve, and as it would turn out, his blundering.

In his 1936 essay “The Storyteller,” the German philosopher and critic Walter Benjamin identifies “the trading seaman” story as a genre of storytelling that requires those who travel abroad to return home with “the lore of faraway places.” Such seamen are expected to be generous when sharing that lore, to hold the village in thrall, to capture the imagination of the villagers. It’s a social duty of sorts, and in an age of rapid mechanization, Benjamin lamented its decline. In this light, perhaps Williams’s fabrication was an unintended consequence of his fealty to the diminishing role of the trading seaman. Perhaps, for a born storyteller who spends much of his professional life sitting behind a desk reading copy into a camera, the prospect of being liberated and coming home empty-handed was unthinkable. Or as David Livingstone Smith might put it, perhaps he saw such a response as being “antisocial,” sort of like sailing into the Heart of Darkness, never finding Kurtz, and coming home to regale your shipmates with esoteric stories about the flora. Could Williams, on some unconscious level, have felt duty-bound to lie?

Given that storytelling is a collaborative endeavor between the teller and the listener, this question seems as important as any question about Williams’s factual accuracy. The storyteller, after all, needs an audience like a fish needs water. This reality further confirms the impulse to burnish carefully selected pieces of memory. British psychologist Martin Conway notes that “two forces go head-to-head in memory.” There’s the desire to get the story right, to accurately represent the impressions that pin the memory in place. Then there’s the desire to loosen the details from their foundation and arrange them in an elegant arc, to present a story that holds together, sticks, and captures us with its power. For the storyteller at least, the latter impulse not only overtakes the former, it also feeds off itself, ensuring that, as Williams’s favorable ratings increased, so did the pressure to make an old story more compelling than the previous version.

Telling a good story is hard to do. It is certainly harder than reciting a list of facts. It requires more mental flexibility, more creative energy, more wit, more empathy, optimism, vulnerability, grandiosity, and adherence to the spirit—not the letter—of the Truth. The effective storyteller must, among other skills, imagine an audience, see himself from the outside in, and hear his own story with his own ears to evaluate its impact. This somewhat terrifying exercise understandably compels him to shape a self-serving narrative—he is constructing a public identity while shining a klieg light on that process. If there’s room to dodge, most storytellers will, understandably, do so in a direction that flatters.

The psychologist Charles Fernyhough, author of Pieces of Light: How the New Science of Memory Illuminates the Stories We Tell About Our Pasts (2013), identifies this habit by describing a study of disputed memories in twins. In it, researchers found that “if the person at the center of the memory did something admirable, or had something bad happen to them … then [the memory] tended to be claimed for the self.” By contrast, “if the star of the memory was shown in a bad light, it tended to be passed off onto the other.” This seems fair. If we take the risk of presenting a story to others, we should at least reserve the option of rewarding ourselves with a little self-serving distortion, a small dopamine-boosting lie that enhances our self-image.

If there’s an elephant in the room in the Williams incident, it’s the context in which he misremembered: combat. War—a mirror-house of memory distortion—exacerbates the already pervasive bias fostered by everyday misattribution. Writing to his friend Charles Poore in 1953, Ernest Hemingway warned him, “Remember Charlie in the first war all I did mostly was hear guys talk; especially in hospital and convalescing. Their experiences get to be more vivid than your own. You invent from your own and from all of theirs.” Hemingway brilliantly developed this theme of wartime misattribution in A Farewell to Arms (1929, 1957, 2012). In it, a unique rhetoric builds around combat, one that articulates the aesthetics of wartime trauma for the purposes of describing the combat experience for noncombatants, rather than expressing the prosaic truth of actual wartime experience—which, as soldiers know, can be maddeningly dull.

This charged rhetoric evolves alongside the book’s expatriate protagonist, Frederic Henry. Frederic’s combat experience is generally unremarkable. But when asked how many Austrians he has killed as a lieutenant in the Italian ambulance corps, he recalls, “I had not killed any, but I was anxious to please—and I said I had killed plenty.” Later in the novel, Frederic receives medals for suffering an injury while supposedly rescuing wounded soldiers from the battlefield. He wears the medals inside his coat even though, as he confides to a friend, he doesn’t deserve them. “I was blown up,” he confesses, “while we were eating cheese.” Hemingway—who at one point has Frederic’s lover Catherine tell him, “Keep right on lying to me. That’s what I want you to do”—may have been channeling his own case of wartime embellishment. Though the public loves the story that Hemingway was wounded as an ambulance driver, he was actually hit with shrapnel while handing out chocolates to soldiers as a Red Cross volunteer. He was on canteen duty.

Clinical research confirms that the fog of war can enhance the fog of memory. Soldiers’ recollections of combat-related incidents are unusually unstable and exaggerated. When Gulf War veterans filled out questionnaires on two separate occasions about their specific wartime experiences, almost 90 percent of them changed at least one of their responses. Seventy percent recalled an event during the second interview that they’d failed to note during the first. Memory variation was similarly evident among noncombatants who were in the vicinity of war. On a 16-item questionnaire, 88 percent of Gulf War peacekeepers changed an answer between their first and second responses. War, it seems, is not only hell. It’s hell on memory.

Oral historians get this better than others do. J. Todd Moye, professor of history at the University of North Texas and the author of Freedom Flyers: The Tuskegee Airmen of World War II (2010), is intimately familiar with such discrepancies. He helped conduct more than 800 oral histories on the Tuskegee airmen—African-American pilots who fought in World War II. In a recent conversation, Moye explained that inconsistent memories were endemic to individual and collective combat accounts. When interviewing people associated with the Tuskegee airmen (but not necessarily the combat veterans themselves), Moye and his colleague Bill Mansfield had to become experts at distinguishing between the “wannabees” and the “sortabees.” Both groups habitually deployed the royal “we”—as in “we flew here, we flew there”—to mask the reality that the eponymous “we” never saw a split second of combat. “We” may have never even been in an airplane. The sortabees were at least affiliated with the airmen—as airplane mechanics and such—but the wannabees had nothing to do with them at all. Moye sighed. “There’s something about the war narrative,” he said, “that deals with machismo and social prestige and leads to a lot of made-up stories.”

Obviously it’s possible that Brian Williams intentionally fabricated his wartime story. Maybe he is, as so many of his critics have said, a pathological liar with a self-inflating mission. The 11 other incidents of exaggeration that NBC is investigating certainly support such an assessment. Still, it seems unlikely that a person of Williams’s intelligence would consciously tell such an exposed lie and think he’d walk away unscathed. Social media, for all its lunacy, keeps us honest, and nobody knows this better than those who feed information into that leviathan and experience the fact-checking backlash on a regular basis. So, given what we now know about the instability of memory, as well as the crucible of war, there’s ample space to join psychologist Elizabeth Loftus in offering a more charitable assessment of Williams’s misremembering episode. However provisionally or tentatively, can we forgive Brian Williams for his sin?

Making that charitable move pulls observers of Williams’s drama into the incident in a more personal way. It requires us to recognize a closely related phenomenon, one that’s just as potentially disruptive as the idea that individual memories are innocently distorted. Specifically, we must also acknowledge the fraught nature of collective memory. Collective memory—call it history—is something we all participate in. And it’s subject to the same biases and misattributions (and willful misinterpretations)—consider Holocaust denial, support for the Lost Cause of the Confederacy, and Armenian massacre revisionism—that distort an individual’s episodic memory. Blind error, as is the case with episodic memory, is, for better or worse, equally essential to preserving our role as members of relationships, families, clans, communities, and nations. Whereas autobiographical memory is prone to inaccuracy, collective historical memory—as any honest historian will tell you—is practically governed by it. We routinely warp memories and mislead each other as individual agents guided by “truthiness.” And the histories we collectively create and identify with are often similarly abstracted from the truth, driven by motives that may lurk in an unconscious netherland, serving larger, and sometimes darker, motivations.

For better or worse, our dependence on collective memory defines much of our lives. Without shared memory there would be no love, friendship, or loyalty. There would be no hatred, jealousy, or revenge. Because shared memories bind us through carefully crafted stories that honor identity cohesion more than empirical accuracy, they are the narrative glue holding together marriages, families, nation-states, and softball teams. By integrating the self into a group through stories, shared memories infuse tradition with meaning, demand our presence at more funerals than births, and as Nietzsche observed, hold us to future commitments that keep us loyal to the past. Still, despite the gravity of these tasks, collective memories are never as stable as we might hope them to be. They’re always contested, usually provisional, and often dead wrong.

But this chronic inaccuracy isn’t necessarily a cause for existential despair. False collective memory (again, as with autobiographical memory) may be an unintended consequence of an otherwise socially beneficial attempt to weave our personal narratives into the most generously construed stories we can construct—the ones that ultimately empower us to leave behind a history for future generations to make their own.

False collective memories instigate outrage every bit as intense as the anti-Williams backlash. The perennial battles over public commemorations—courthouse displays of the Ten Commandments, a billboard of the Ku Klux Klan founder put up in Selma on the 50th anniversary of the famous Civil Rights march, or a monument in Pittsburg, Texas, praising Baptist minister Burrell Cannon as the first person to build a plane that went airborne—are cases in point. While memorials often evoke the past in moving ways—think of Maya Lin’s Vietnam Veterans Memorial—they can just as easily score cheap ideological points. From my home in Austin, Texas, I can walk a few blocks to the statehouse and read a dozen historical markers that honor the Confederacy’s commitment to states’ rights while practically obliterating the reality of slavery. The French cultural historian Pierre Nora refers to this form of historical reconstruction as “commemorative bulimia.” Under the influence of disingenuous commemoration, history drowns in nostalgia (or worse) instead of constructing a just and usable past. Shallow commemoration blots out history and fills the space with intentional and self-serving distortions.

Fortunately, efforts to rearrange the past aren’t always so cynical. In her book The Past Within Us: Media, Memory, History (2005), Tessa Morris-Suzuki identifies a more honest quest for “historical truthfulness.” Acknowledging that there’s no such thing as a single “historical truth”—and well aware that, quoting writer Nishio Kanji, history is “an uncertain accumulation of human wisdom formed from the fluid substance of language”—Morris-Suzuki explores the “imagination and empathy” involved in recovering and crafting history’s most alluring narratives and, in so doing, uncovers a more hopeful form of storytelling. The goal in this version of historical reconstruction isn’t to ignore history’s ugly truths but to exhume them, grapple with them, and develop a more comprehensive understanding of human behavior over time.

Like the errant seaman, the curious seeker of historical truth, driven by virtue and a quest for justice, “invites us to enter into an empathetic relationship with the people of the past: to imagine their experiences and feelings, mourn their suffering and deaths and celebrate their triumphs,” Morris-Suzuki writes. This empathy applies to history’s victims as well as its perpetrators. What better way to understand the full moral perspective of a historical reality such as slavery than to try to see it through the eyes of both the slave and the master? Through this shared endeavor, she explains, “We create our sense of belonging to certain groups of people.” This move toward imaginative inclusion is cultivated through historical narratives that are necessarily inaccurate. The historian, when it comes to the past’s most intimate details, has no choice but to be almost always wrong. Just as we cannot reconstruct exact personal memories, we cannot construct exact historical memories. We must fill in the details with educated speculations. But that’s okay, because if factual errors result from the impulse to construct a sincere and just historical narrative—a narrative that inspires true insight, and maybe even the better angels of our nature—then the payoff can take us further than factual accuracy ever could (even if it were achievable). Wisdom thus derives not only from formal history but also from more openly imaginative reconstructions of the past. The works of Shakespeare and John Dos Passos foster just as much insight as “objective” histories from credentialed experts.

In Prosthetic Memory: The Transformation of Remembrance in the Age of Mass Culture (2004), Alison Landsberg, a historian at George Mason University, suggests how virtuous opportunities might arise in the realm of collective memory. Her thesis relies on what we might well dismiss as a slippery maneuver—claiming someone else’s history as your own. Landsberg begins with the premise that the modern era “makes possible and necessary a new form of public cultural memory.” She promotes a manner of recollection that, though she knows it can backfire, allows “taking on memories of events through which one did not live.” In the information age, these memories of a past we didn’t directly experience are deposited into our own memory bank to help determine our place in sociohistorical space—hence the idea of a “prosthetic” memory. They are, as Landsberg writes, “transportable,” and as such, they encourage experiences “through which a person sutures himself or herself into a larger history.” If that seems too theoretical, consider this: when children from a variety of racial backgrounds are exposed to a documentary such as "Eyes on the Prize," about the history of the Civil Rights Movement, or when Americans form impressions about the Civil War based on Ken Burns’s famous series, shared prosthetic memories form.

It’s a subversive historical notion. Landsberg knows this. She writes about how prosthetic memories “challenge more traditional forms of memory that are premised on claims of authenticity, ‘heritage,’ and ownership.” But her ambition doesn’t end at disruption. In a phone conversation, Landsberg, while expressing a keen understanding of how history can be intentionally distorted, highlighted “the utopian thread” in her book. She explained how prosthetic memories “allow us to see through other eyes—to feel empathy.” Claiming collective memories that aren’t ours—something, as we’ve seen, that’s also common in autobiographical memory—can lead to desirable results. Landsberg, who is especially interested in the role of media (rather than, say, literature) in shaping historical memory, stresses “the possibilities that are opened up by prosthetic memory to forge connections.” The “ethical dimension” of prosthetic memory, she told me, enables us to take narratives not our own and rewrite them from within, revising them so that “people who share little in the way of cultural or ethnic background might come to share certain memories.” In the best of scenarios, a collective memory can take us beyond ourselves and into a relationship with others, bound by a shared story—a story that’s created to serve higher purposes than mere truth.

Historians spend careers making grist for the narratives we collectively construct. But as Landsberg’s work indicates, historians aren’t necessarily the ones crafting the big stories. The media do that. Brian Williams does that. As the Williams controversy intensifies and major media outlets offer shallow interpretations of the incident—saying, for instance, that it “is not about misremembering or lying; it’s about millions in ad revenue and the sanctity of network news”—it’s worth stepping back and recognizing that the carnival of deceit is everywhere—and it includes us.

Truth matters. Journalists should be expected to report the truth as accurately and reliably as they can. But how we seek the truth also matters. Contemporary life bombards us with the Big Lie. The conventional wisdom is that we should fight those lies with the truth. If memory serves, then truth prevails. But memory is a shaky thing, and as a result, we can fight lies only with our own versions of the truth. The real trouble with Williams’s fabrication isn’t that it was a fabrication, or that it may have been unintentional. Rather, it allows us to keep pretending that moral behavior is black and white; that the integrity of our aspirations is somehow protected when narration yields to objectivity; that a lie is just a lie, and not an act to be examined for what it says about ourselves, about the stories we tell each other, and finally, about the stories we want to hear. Ω

[James McWilliams is the Ingram Professor of History at Texas State University. His most recent book is The Modern Savage: Our Unthinking Decision to Eat Animals (2015). See the other five books by James McWilliams here. He received a BA (philosophy) from Georgetown University, as well as an MA (American studies) from The University of Texas-Austin and a PhD (history) from The Johns Hopkins University.]

Copyright © 2015 The American Scholar/Phi Beta Kappa



This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2015 Sapper's (Fair & Balanced) Rants & Raves