Of all of the things that I am (and am not), I cannot escape my fate as an only child. If this is (fair & balanced) self-pity, so be it.
[x New York Magazine]
The Onlies: Only children are just like most New York kids (sophisticated, precocious, sometimes a little lonely), only more so.
By Vanessa Grigoriadis
If you saw Ondine, who’s a fifth-grader at Grace Church School, and her mom, Elizabeth Cohen, walking down the street, you’d probably guess that Ondine is an only child. They have the same stride and the same carriage, and even though there’s more than a foot of difference between their heights, they chatter like best friends. On a recent Saturday afternoon, as she goes with her mom to pick up some dry cleaning on Prince Street, Ondine is dressed in a hipster’s outfit of camouflage pants, bomber jacket, and new Skechers boots with an adult-size heel. Her cell phone is in her mom’s purse. Her fluffy hair is accented with blonde highlights, quite unlike when she had it cornrowed recently for a friend’s Survivor-themed party. The next day, she was invited to the Plaza for tea. “She looked like she had dreadlocks,” says Cohen.
“At the Survivor party,” says Ondine, “they had a monkey on a coconut, and you had to throw these hoops and get them on the monkey—”
“It was in between seasons, so she went to tea in this long black dress,” says Cohen. “She looked almost Amish—”
“I did not look Amish,” says Ondine, grimacing. “I have never been so insulted!”
Ondine, who is still not allowed to go outside alone—“Because of serial killers lurking everywhere,” she says, rolling her eyes—has a strong opinion about everything, and it’s usually a good one. She doesn’t like Hilary Duff, but she does like Coldplay and the White Stripes. Her favorite restaurants are Il Cantinori and Lure Fishbar: “I like the architecture there—I mean, the interior,” she says. “It’s like a boat. I like it better than Balthazar—they make everything such a big deal. You buy a little salad and it’s really expensive and they say it has this sppppecial Itttttalian drrrressing.”
Ondine is living a full Manhattan life. Why would she even want a brother or sister? After all, she has a cat. “I have to bug my mother to have playdates all the time,” she says, shooting her mom a sly look. “I’m loooonely,” she says.
If New York children are sophisticated, precocious, the city’s only children are even more so. How could they not be? Often, as in the case of Ondine, they live like little adults, eating the same food, having the same conversations. The normal red-state/blue-state division in a family—parents versus children—does not apply. A family with a single child is all for one and one for all. The children take on some of the characteristics of adults, and the adults take on some of the characteristics of children—though the child tends to be the focus of everyone’s attention, including his own. Growing up on the Upper West Side, I didn’t want a sibling—I wanted a twin. An identical twin, just like Sweet Valley High’s Jessica and Elizabeth Wakefield, my generation’s version of the Olsens. (Ashley and Mary-Kate Olsen are, in fact, fraternal twins, though it takes a real fan to tell them apart consistently.)
Those of us who have been only children cannot imagine life another way; it’s an unknown unknown, as Donald Rumsfeld might put it. Families seem predestined when you are a part of them, as John Updike, an only child who grew up with his parents and mother’s parents near Reading, Pennsylvania (terminus of the famous Monopoly railroad), writes: “I was an only child. A great many only children were born in 1932. I make no apologies. I do not remember ever feeling the space for a competitor within the house. The five of us already there locked into a star that would have shattered like crystal at the admission of a sixth.”
There were indeed many only children born in Updike’s time; from 1920 to 1940, the percentage of only-child families rose to 30 percent, primarily because of the economic hardships of the Depression. Afterward, during the baby boom, the number fell to 15 percent. Today, according to the 2003 Current Population Survey, single-child families outnumber two-child families (20 percent versus 18 percent), and social scientists tentatively predict that the number of onlies will keep growing, bringing the national average number of children per family down below 2.1. In Manhattan, more than 30 percent of New York City women over 40 have only one child, and over 30 percent of all families are single-child families, according to data compiled by Rutgers University.
There are many reasons to have one child—population-control arguments and lifestyle arguments as well as a general desire to be more cosmopolitan and European (where the average family size is estimated at 1.4 children)—but what are most often mentioned are late pregnancy and the cost. More American women than at any point in history are conceiving after the age of 35. It is more expensive to raise a child to age 18 than ever before—according to the Department of Agriculture, the national average expenditure for parents making over $70,000 is $323,975 ($47,467 is for food).
New York is even worse. The average apartment price in Manhattan is over $1 million. This year, it costs $25,000 to go to high school at Dalton, $13,000 more than it did when I graduated thirteen years ago. Astronomical expenses like these, and the focus on career, and the fact that for many, the postcollegiate support system is extended well into the thirties, make New York City the national capital of only children.
The classic American idea about only children, nurtured in suburbs where two children could seem too few, is that they’re oddballs—coddled, spoiled, lonely. Raised without the camaraderie and competition of sibling society, they’re simultaneously stunted and overdeveloped—a repository of all their parents’ baggage (hello, Chelsea Clinton). When the Chinese government mandated in 1979 that each family could have only one child (a directive that would lead to 70 million only-child births over the following two decades), President Reagan gave special consideration on immigration applications to Chinese objectors to the one-child policy. The negative stereotype of only children persists today: According to a 2004 Gallup poll, only 3 percent of Americans think a single-child family is the ideal family size.
“I don’t remember what I did to cause or provoke it, but one day as a punishment the teacher told me to stand up in front of the whole class and tell them what it was like being an only child,” Betsey Niederman, an actress who is herself the mother of a 7-year-old only, Matthew, says of her childhood in New Canaan, Connecticut, in the fifties. “I will never forget those glaring eyes—it was like The Scarlet Letter. I started to talk, but my eyes misted over and I ran out of the classroom. I walked along the Merritt Parkway until an elderly couple picked me up and took me home. I told my mom that I never wanted to go back there, and she let me stay home before I switched schools. She didn’t want me to go to school anyway. She was attached to me.”
Although large-scale empirical studies, in vogue in the late seventies, found that only children are no brattier or lonelier than other children and that the most important factor—thank you, Dr. Freud—is the quality of the parenting, the stereotypes have recently found some—smallish—validation in academic research. In a paper published this year in the Journal of Marriage and Family, Ohio State University researchers analyzed a Department of Education study of 20,000 kindergartners and found that while only children performed at the same level as firstborns in school, they tended to be less socially skilled than their peers—more temperamental, less sensitive to the feelings of others.
Only children in pop culture are not exactly well-adjusted. Chandler is the only Friend without siblings, the “weird,” neurotic friend who also has a third nipple and a drag-queen father. Noah Wyle’s Dr. Carter is the chilly, entitled only child on ER. None of the characters on Seinfeld have siblings, nor do those on Curb Your Enthusiasm, except the sap, Cheryl. The only child is a superhero—as in Harry Potter or on Buffy, where the only sibling in sight was the magical creation of a gang of evil monks—or a brat. Here is Eloise on room service: “I always say, ‘Hello this is me Eloise and would you kindly send one roast-beef bone, one raisin and seven spoons to the top floor and charge it please, thank you very much.’ Then I hang up and look at the ceiling for a while and think of a way to get a present.”
A thoroughly unscientific culling of famous only children can suggest a certain kind of character, one who’s comfortable (sometimes too comfortable) creating his own weather, who is at home (sometimes too at home) with his or her own contradictions—and occasionally something of a megalomaniac: Alan Greenspan, Frank Sinatra, Tiger Woods, FDR, Rudolph Giuliani, Roy Cohn, Laura Bush, and the three Apollo 8 astronauts. Elvis, Priscilla, and Lisa Marie Presley were all only children—Elvis’s twin brother was stillborn, and after his father was sent to jail for forging an $8 check, Elvis and his mom started sleeping in the same bed, which would continue until he was an adult. Elvis swiped his first uppers from Gladys, and when she died in 1958, some think he never recovered. William Randolph Hearst, the inspiration for Charles Foster Kane, was doted on by both his possessive mother and his uneducated, millionaire father, who granted his every wish except when he asked on a trip to London to move into Windsor Castle (he would later spend 30 years constructing his own palace, Hearst Castle). Then there’s Hans Christian Andersen: Born to a 23-year-old shoemaker and 30-year-old washerwoman, the highly religious children’s author liked to say that his life was a wonderful fairy tale. He began his first autobiography, “I feel that an invisible and loving hand directs the whole of [my life]; that it was not blind chance which helped me on my way, but that an invisible and fatherly heart has beat for me.”
That you could have parents whose hearts beat for you and only you is a comfort. For one thing, there are no set limits on what a parent will give an only child, no pressure from other siblings to split things up. It’s not spoiling, it’s just . . . life. You are the one who gets to decide what playground to go to. You are the one who gets all the money in the end. You are the one who, if you happen to be Leyla Marchetto, daughter of Da Silvano’s proprietor, gets to travel the world with your dad, almost like a lover, visiting Nice, Florence, Kenya, Paris, and St. Barts before 18, and who is assured when you move to L.A. at 25, partially to distance yourself from that father, that his compatriots, like Jack Nicholson, will be looking out for you (this is assuming that Jack Nicholson’s looking out for you is a good idea). Family becomes not so much a pedagogy as a democracy, not even a family, really, or at least it feels that way to the kid. There is the perception, if not always the reality, of equality: When my dad would take second helpings at dinner, I always insisted on the same—“But I am bigger than you,” he tried to explain. It’s almost too much, how intensely everyone relates to each other, especially in a standard two-bedroom apartment. “When I tried to transfer to boarding school in eleventh grade because I disliked my school so much, my dad sent me to a shrink to make sure that I wasn’t trying to run away from him,” says Joanna Bernstein, 31. “That’s what being an only child is.”
For only children, that blessed sense of entitlement, where you’re always listened to and taken seriously—if not to Paris or Kenya—is at war with a sense of being smothered. “I feel that the attention was wonderful, but the imperative to bloom—to be happy, really—was not,” says Deborah Siegel, project director of a women’s research center and co-editor with Daphne Uviller of a forthcoming book of essays on only childhood, Party of One. “Even now, my mom talks about how The Runaway Bunny was my favorite book, which it wasn’t necessarily. It’s about this bunny that runs away to join the circus: The mother says, ‘If you go away, I’ll become the circus master,’ and then the bunny says, ‘I might go be a sailboat,’ and the mother says, ‘I’ll be the wind.’ ”
She might have preferred, say, Tolstoy. Only children tend to develop precocious interests—Ondine started Dante’s Inferno at 7, though she didn’t make it to the second page. Matthew Niederman regaled me with a detailed explanation of hyperspace, the relative advantages of Richard Meier’s new buildings, and why we should pull out of the Iraq war but not before scouring the hole where Hussein was found, because that’s where the WMDs are. Nevertheless, “I’m looonely,” he said, striding around the toy-strewn living room. “My dad is on his computer 8,000 hours a day. I have all these cool realistic toys, but they’re fake. Game Boys hypnotize your brain, but I could have a Game Boy Battleship with two players and maybe I could have someone to play it with me.” He does have the company of his girlfriend, though. “She likes my jokes and thinks I’m funny,” he says with a matter-of-fact shrug. “I think she’s kind and loving.”
Then Matthew’s mom and dad told him they were going out to dinner.
“Noooooo!” he shrieked, jumping in his father’s lap. “I want chicken! I want Daddy’s chicken!”
Rather than having siblings always in your business, you have parents, who tend to be much more formidable adversaries and who can choose not to cook chicken. The issue tends to arise when kids don’t share parents’ interests, or vice versa; usually around the time kids start wanting to play board games, the only child finds himself at odds. From about third grade on, I’d come back from violin, piano, ballet, modern dance, or fencing lessons around six and then—what? Dinner and homework, which might include a science project that was a whole-family project, or perhaps some math-problem sets, which could mean a fight with my father (“You’re getting it wrong to spite me” was his explanation for my lack of aptitude). With a sibling, you would remember what you did. That was the time Karen and I played tennis by the light of the moon, and that was the time Sam froze all my underwear in the refrigerator. Instead, it was always us three roomies, self-motivating to do whatever it was we had to do, except that the two of them got to tell me I didn’t practice my violin long enough and I should go to sleep already. The condition of being an only child gives one a lot to think about—and plenty of time to think.
Greta is a popular girl in her second-grade class, especially with some of the boys. As a philosophical thinker, she’s surprisingly advanced. “I don’t think we’re real,” she says, smiling. “I think we’re all the imagination of God.” From time to time, she stands in front of the mirror and whispers, “Oh, God, I’m changing so much!” She looks less like one or the other parent than an exact combination of the two, as if she were digitally composed. Matt Keating and Emily Spray have thought about having another kid, but Greta was a tough baby, so they didn’t feel that they could handle another, plus there are careers to consider—Matt’s as a musician, Emily’s as a designer. “At the time your financial responsibilities triple, the time you have to put into your career is cut in half,” says Keating, echoing the concerns of many a New York parent. “It would be hard to do again.”
So it’s just the three of them—plus Greta’s imaginary friends, Choga and Honchi, which are only Greta in the mirror, though they’ve been less in evidence recently; Choga, disturbingly enough, recently died in one of her dreams. They live in a beautiful apartment in a doorman building—granted, they do share the bedroom, their areas divided by a canvas scrim—and Greta’s room is a child’s dream, packed with all the best books and toys and stuffed animals. The coffee table in the apartment is where she plays her games, like Store, or Artist, or School, which was the main event one Saturday morning in October (there is some TV in the house—not much—and a computer, but no video games). “Does anyone know what 400 plus 400 is?” she asks, at her chalkboard.
She scrambles to the floor to sit alongside her students—Noae, Alic, Peae, Picyo—and waves a hand over her head.
“I know! I know!”
It used to be that Greta wouldn’t get up in the morning unless her mom did—“I don’t want to be alone,” she’d say—but six months ago, she begged for a cat. (It’s hard to deny an only child a pet.) Now she wakes up all on her own and goes to the living room with Timothy to play. The cat, predictably, has become a comic sibling. “It feels kind of pathetic to compare a cat to a human, but having another living thing has changed things in the house so much,” says Spray. “It’s made me wonder if we missed an opportunity not having another kid.” Greta calls Timothy “Little Brother.” She holds him like an infant and dances around the room—then pushes him off her shoulder and he crashes to the floor. That’s something you couldn’t do with a real little brother.
But with a real little brother—though most only children say they’d rather have an older sibling, because they want someone to tease them—there would be all sorts of other problems, like the almost tribal dance of vying for parents’ attention. The family would petrify into a whole different Brady Bunch tableau: a domineering and successful firstborn (Marcia, Marcia, Marcia); a nonconfrontational and confused middle; a flaky yet power-hungry last.
This is the popular mythology of birth order, which has some truth to it—lasts and middles have indeed been shown to be less successful, as a group, than firsts. Birth-order theory has been around for over a century, and most of the stereotypes have been put to rest. A lot of these studies used only children as a control group. It wasn’t until the seventies that they became a subject of study in themselves: With women bearing fewer children and the rise of feminism, there was a notion that only kids might constitute a potential public-health problem, and suddenly there was money for meticulous empirical sociological studies on onlies (i.e., the vocal patterns of onlies versus firstborns at 3 months), many led by Toni Falbo of the University of Texas at Austin. Falbo and her colleagues were surprised to find that there were no findings. Not only was there nothing wrong with being an only child, but, Falbo found in studies in both the U.S. and China, only children’s personalities were in almost every way comparable to firstborns’. They were no more selfish, socially awkward, grandiose, or needy—though firstborns were generally considered more attractive.
Grant money moved on to more pressing public-health issues, and a cottage publishing industry of pop pro-only-child apologias sprang up, written either by parents of an only or by onlies themselves—Ellie McGrath’s fine My One and Only, Susan Newman’s Parenting an Only Child, a niche publication called Only Child Magazine, and Bill McKibben’s Maybe One, an Easter Island defense of onlies (i.e., with a world population of 12 billion by 2050, only one is only ethical). However, while Falbo and her colleagues found no statistically significant differences between only children and firstborns, there were quite a few hypotheses for further testing. It was found, for example, that only children had a smaller circle of friends and adults with whom they socialized than firstborns, and that they perhaps continue this trend later in life; that they had a peculiarly mixed self-esteem pattern, whereby they thought of themselves more often than kids with siblings, but were less likely to compare themselves positively to others, as firstborns did with their little brothers and sisters; and that onlies perceived their parents more affectionately than did other kids, perhaps because they received more consistent and moment-to-moment reinforcement from those parents. Studies are ongoing—Falbo herself is beginning work on a ten-year study of adult only children, using one of the sociologist’s favorite databanks, the people of the state of Wisconsin—though the field is a little stagnant at the moment. “In order to answer the most interesting questions, we would have to take kids and randomly assign them to different families,” says Douglas Downey, a sociologist at Ohio State, “but we can’t do that, obviously, and we’ll never be able to do that.”
What studies did establish was that, for all the Nicholas Scoppetta success stories (an orphan, he was raised in a shelter), the most important factor was the socialization provided by the mother—the only child is “at the utter mercy of his education,” wrote Alfred Adler. With only children, the bond of the triad itself and that between parent and child were found to be especially intense, particularly in the case of mother and daughter. I heard, in my interviews, many moms who called their daughters “my beautiful little friend,” “my best friend,” or even “my little sister.” While only children and their moms were found to be more flexible in their understanding of typical sex roles, perhaps because the child was forced to satisfy both parents’ desire for self-replacement, the mother-daughter relationship was especially fraught, subject to infinite analysis.
Last summer, Karen Stabiner’s 14-year-old daughter, Sarah, went to Central Park with the class from her summer film program, while Karen went to visit a friend on the West Side. Her friend wanted to walk across the park. “I said, ‘I can’t do that. What if I bump into Sarah? She would be so embarrassed.’ My friend said, ‘It’s Central Park. What’s the chance you’ll bump into your daughter?’ Sure enough, we got to an underpass by 72nd Street and there she was. It was her day to shoot, so she was telling everyone what to do—she wanted the camera guy over there and the actor over there. I said, ‘I can’t say hello to her, she’ll be humiliated’—because she’s my only, I’m very sensitive to whether I am being a good parent in terms of letting her go. Finally, I said, ‘Okay, if she sees us and we make eye contact, I’ll say hello, but if she doesn’t see us, I’m not going to intrude on this moment.’ I held an umbrella to the side so she wouldn’t see me, and when we made it through the underpass, I had this really visceral rush of relief that I had done a good thing, a difficult thing. Later, I ended up telling Sarah, and not only didn’t she believe me, but she was upset that we hadn’t said hello. So you can’t win for trying.”
My parents claim they had only one because they “hit the jackpot,” but my mom also says my father would have had another except she didn’t see the need. My mother and I, who bear a considerable resemblance, are close. Very close. We share a similar sense of drama, and we tend to see my childhood as a series of nightmarish affronts to our sacred Kristevian bond. Like the whole situation with the “white girl kitten” I requested for my seventh birthday, an incident I asked my mom to clarify recently over e-mail. She wrote, “Neither Dad nor I had had pets, so we were not so enthusiastic about a cat, but one day you said, firmly and seriously, ‘I don’t have any sisters or brothers. I need a kitten.’ One of Dad’s Ph.D. students had a cat with a litter, and on your birthday he brought the cat (Pebbles) in a birdcage. She was very high-strung and wasn’t the greatest pet, but next summer we found a beautiful golden kitten in our garage in East Hampton and she became your adoring patient playmate.”
How cute! And I did love that second cat, my own “little brother,” Squeak. But there’s one part Mom has left out. Pebbles, being a nasty cat by nature and perhaps not all that fond of getting dressed up in baby clothes, including bonnets, decided late one night when my mom and I were reading on my bed to attack. My mother pushed me in front as we dashed down the hallway with Pebbles’s teeth attached to her calf (she still has scars). We took refuge in the bathroom as Pebbles threw her little body against the door in repeated desperate attempts to come in and kill us. Finally, all was quiet, and as I waited in the bathroom, my mom and a neighbor coaxed her into the cat carrier and then, just in case she possessed some previously unseen Houdini-esque cat capabilities, put the carrier out on our twentieth-floor terrace and locked the door. The next morning, Pebbles went to Uncle Jimmy the Vet, who said that she had experienced a nervous breakdown, something one in every sixteen cats may have in their lifetimes. Furthermore, she was not a city cat. She was a country cat. There was a nice nurse who had a farm upstate, and she liked Pebbles very much. “Bye, Pebbles,” I said, as she glowered at me through the mesh of a kennel cubicle.
It wasn’t until I was 15 and became friends with a fast girl from Fieldston who smoked Camel Lights, wore ripped jeans exclusively, and somehow had a driver’s license before everyone else that I realized what had happened. “Your mom killed your cat,” she drawled, flicking a cigarette out the sunroof of her Audi.
My mother was making dinner when I arrived home with this shattering information. “I wish you wouldn’t be friends with that girl,” she said. So much for my little sister.
In an only-child family, every member—except, apparently, the cat—is indispensable. The giant investment of time and love in a child can create outsize worries about mortality. A child—any child—dying is unthinkable. But many of the parents of only children I spoke to have spent a lot of time thinking about what would happen when, eventually, they die—possibly a projection of their real fears. It doesn’t readily occur to them that that child would most probably have a life by that point, a spouse of his own, children of his own. These onlies were always going to be children; if the parents couldn’t be there to take care of them, it was unclear who would.
Of course, all the science, and the anecdotal evidence, too, points to the fact that only children are perfectly capable of taking care of themselves, if in occasionally unusual ways. Only children often work out their sibling issues and their need for attention with their friends. The most social person I know is an only child. As a kid, she would sit at the kitchen counter with the A–Z class list and call each person, emotionlessly and alphabetically, until someone agreed to play with her. She does a more subtle version of this today.
Only children spend a fair amount of time mulling over their aloneness. “My biggest concern is that I’ve befriended so many weirdos on my mission alone that when it’s time to start traveling in a pair it won’t work and something will happen to my family and I’ll be the only one to deal with it,” says Laura Flam, 26, a student at the Fashion Institute of Technology.
And, as any psychoanalyst will tell you, family doesn’t have to be fate. Says Daphne Uviller, 32, an only child and expectant mother who lives on the parlor floor of her parents’ West Village townhouse (if you had the option, you’d still be living there, too): “I remember telling my husband, ‘You don’t understand—I was an only child. I need to hear that I’m beautiful and smart and that you love me every single day.’ He was like, No. I didn’t freak out. I learned how to be in a different kind of relationship.”
Ondine spends a lot of time skittering around her parents’ Soho apartment, but the space that is really her space is the loft bed in her room, piled with soft pastel quilts and up a long sailor’s ladder. Up there, she has a little TV and a portable DVD player, plus dozens of magazines that are strewn all over the mattress. Every night, before her mom puts her to sleep, they sit up there and read—quietly, just the two of them, two clever blondes with sophisticated taste and a lot to say. Her mom reads Vogue. Ondine reads Teen Vogue. “This month I learned about how even the smartest kids in class cheat,” says Ondine. “And Avril Lavigne says that the music industry is really corrupt and sometimes people—I mean, artists—have other people singing for them. Did you hear about Ashlee Simpson on Saturday Night Live?” She hiccups. “Ow, I have the hiccups,” she says, then claps her hand over her mouth. “God! Did I just say, ‘Ow, I have the hiccups?’ ”
Tonight, after homework, a little knitting on her new red scarf, and a mom-administered bath, Ondine was going to sleep. She was particularly excited about Halloween: Last year, she was Margot Tenenbaum, with a wooden finger and cigarette holder—“That’s what happens when you raise kids in the city,” she said pragmatically—but this year she’s going to be a bunny, a “chic bunny.” She also wanted to take back what she had said before about being an only child. “I don’t really want a brother or sister—I think that’s why God invented the TV,” she said. “I like having all the attention for myself.”
Vanessa Grigoriadis writes for New York Magazine, The New Yorker, and Rolling Stone, among others.
Copyright © 2004 New York Magazine
Saturday, November 06, 2004
My Brother Was An Only Child
The Real Neocon Theorist: Bernard Lewis!
Much has been made about the Neoconservatives who have seduced W into the Iraqi adventure. Paul Wolfowitz, Richard Perle, William Kristol and their ilk are described as acolytes to the late Leo Strauss (1899-1973), a political philosopher at the University of Chicago. A case is made here for another Neocon poster boy: Bernard Lewis. Lewis (Cleveland E. Dodge Professor of Near Eastern Studies, Emeritus, at Princeton University) is today's foremost non-Arab interpreter of Islam and Islamic culture. Unlike Strauss, Lewis has dedicated his scholarship to the Middle East. Now a strong case is made that the Neocons are not Straussians, but Lewisites. The evidence is piling up in Iraq that Lewis' theories have led us to the wrong conclusions and their resultant policies. Fallujah will make Mogadishu seem like a Sunday School picnic: "Blackhawk Down" Redux. If this is (fair & balanced) blame-shifting, so be it.
[x Washington Monthly]
Bernard Lewis Revisited
By Michael Hirsh
What if Islam isn't an obstacle to democracy in the Middle East but the secret to achieving it?
America's misreading of the Arab world—and our current misadventure in Iraq—may have really begun in 1950. That was the year a young University of London historian named Bernard Lewis visited Turkey for the first time. Lewis, who is today an imposing, white-haired sage known as the “doyen of Middle Eastern studies” in America (as a New York Times reviewer once called him), was then on a sabbatical. Granted access to the Imperial Ottoman archives—the first Westerner allowed in—Lewis recalled that he felt “rather like a child turned loose in a toy shop, or like an intruder in Ali Baba's cave.” But what Lewis saw happening outside his study window was just as exciting, he later wrote. There in Istanbul, in the heart of what once was a Muslim empire, a Western-style democracy was being born.
The hero of this grand transformation was Kemal Ataturk. A generation before Lewis's visit to Turkey, Ataturk (the last name, which he adopted, means “father of all Turks”), had seized control of the dying Ottoman Sultanate. Intent on single-handedly shoving his country into the modern West—“For the people, despite the people,” he memorably declared—Ataturk imposed a puritanical secularism that abolished the caliphate, shuttered religious schools, and banned fezes, veils, and other icons of Islamic culture, even purging Turkish of its Arabic vocabulary. His People's Party had ruled autocratically since 1923. But in May 1950, after the passage of a new electoral law, it resoundingly lost the national elections to the nascent Democrat Party. The constitutional handover was an event “without precedent in the history of the country and the region,” as Lewis wrote in The Emergence of Modern Turkey, published in 1961, a year after the Turkish army first seized power. And it was Kemal Ataturk, Lewis noted at another point, who had “taken the first decisive steps in the acceptance of Western civilization.”
Today, that epiphany—Lewis's Kemalist vision of a secularized, Westernized Arab democracy that casts off the medieval shackles of Islam and enters modernity at last—remains the core of George W. Bush's faltering vision in Iraq. As his other rationales for war fall away, Bush has only democratic transformation to point to as a casus belli in order to justify one of the costliest foreign adventures in American history. And even now Bush, having handed over faux sovereignty to the Iraqis and while beating a pell-mell retreat under fire, does not want to settle for some watered-down or Islamicized version of democracy. His administration's official goal is still dictated by the “Lewis Doctrine,” as The Wall Street Journal called it: a Westernized polity, reconstituted and imposed from above like Kemal's Turkey, that is to become a bulwark of security for America and a model for the region.
Iraq, of course, does not seem to be heading in that direction. Quite the contrary: Iraq is passing from a secular to an increasingly radicalized and Islamicized society, and should it actually turn into a functioning polity, it is one for the present defined more by bullets than by ballots. All of which raises some important questions. What if the mistakes made in Iraq were not merely tactical missteps but stem from a fundamental misreading of the Arab mindset? What if, in other words, the doyen of Middle Eastern studies got it all wrong?
A growing number of Middle Eastern scholars who in the past have quietly stewed over Lewis's outsized influence say this is exactly what happened. To them, it is no surprise that Lewis and his acolytes in Washington botched the war on terror. In a new book, provocatively titled The Case for Islamo-Christian Civilization, one of those critics, Columbia scholar Richard Bulliet, argues that Lewis has been getting his “master narrative” about the Islamic world wrong since his early epiphanic days in Turkey—and he's still getting it wrong today.
In Cheney's bunker
Lewis's basic premise, put forward in a series of articles, talks, and bestselling books, is that the West—what used to be known as Christendom—is now in the last stages of a centuries-old struggle for dominance and prestige with Islamic civilization. (Lewis coined the term “clash of civilizations,” using it in a 1990 essay titled “The Roots of Muslim Rage,” and Samuel Huntington admits he picked it up from him.) Osama bin Laden, Lewis thought, must be viewed in this millennial construct as the last gasp of a losing cause, brazenly mocking the cowardice of the “Crusaders.” Bin Laden's view of America as a “paper tiger” reflects a lack of respect for American power throughout the Arab world. And if we Americans, who trace our civilizational lineage back to the Crusaders, flagged now, we would only invite future attacks. Bin Laden was, in this view, less an aberrant extremist than a mainstream expression of Muslim frustration, welling up from the anti-Western nature of Islam. “I have no doubt that September 11 was the opening salvo of the final battle,” Lewis told me in an interview last spring. Hence the only real answer to 9/11 was a decisive show of American strength in the Arab world; the only way forward, a Kemalist conquest of hearts and minds. And the most obvious place to seize the offensive and end the age-old struggle was in the heart of the Arab world, in Iraq.
This way of thinking had the remarkable virtue of appealing powerfully to both the hard-power enthusiasts in the administration, principally Bush and Donald Rumsfeld, who came into office thinking that the soft Clinton years had made America an easy target and who yearned to send a post-9/11 message of strength; and to neoconservatives from the first Bush administration such as Paul Wolfowitz, who were looking for excuses to complete their unfinished business with Saddam from 1991 and saw 9/11 as the ultimate refutation of the “realist” response to the first Gulf War. Leaving Saddam in power in '91, betraying the Shiites, and handing Kuwait back to its corrupt rulers had been classic realism: Stability was all. But it turned out that the Arab world wasn't stable, it was seething. No longer could the Arabs be an exception to the rule of post-Cold War democratic transformation, merely a global gas station. The Arabs had to change too, fundamentally, just as Lewis (and Ataturk) had said. But change had to be shoved down their throats—Arab tribal culture understood only force and was too resistant to change, Lewis thought—and it had to happen quickly. This, in turn, required leaving behind Islam's anti-modern obsessions.
Iraq and its poster villain, Saddam Hussein, offered a unique opportunity for achieving this transformation in one bold stroke (remember “shock and awe”?) while regaining the offensive against the terrorists. So, it was no surprise that in the critical months of 2002 and 2003, while the Bush administration shunned deep thinking and banned State Department Arabists from its councils of power, Bernard Lewis was persona grata, delivering spine-stiffening lectures to Cheney over dinner in undisclosed locations. Abandoning his former scholarly caution, Lewis was among the earliest prominent voices after September 11 to press for a confrontation with Saddam, doing so in a series of op-ed pieces in The Wall Street Journal with titles like “A War of Resolve” and “Time for Toppling.” An official who sat in on some of the Lewis-Cheney discussions recalled, “His view was: 'Get on with it. Don't dither.'” Animated by such grandiose concepts, and like Lewis quite certain they were right, the strategists of the Bush administration in the end thought it unnecessary to prove there were operational links between Saddam and al Qaeda. These were good “bureaucratic” reasons for selling the war to the public, to use Wolfowitz's words, but the real links were deeper: America was taking on a sick civilization, one that it had to beat into submission. Bin Laden's supposedly broad Muslim base, and Saddam's recalcitrance to the West, were part of the same pathology.
The administration's vision of postwar Iraq was also fundamentally Lewisian, which is to say Kemalist. Paul Wolfowitz repeatedly invoked secular, democratic Turkey as a “useful model for others in the Muslim world,” as the deputy secretary of defense termed it in December 2002 on the eve of a trip to lay the groundwork for what he thought would be a friendly Turkey's role as a staging ground for the Iraq war. Another key Pentagon neocon and old friend of Lewis's, Harold Rhode, told associates a year ago that “we need an accelerated Turkish model” for Iraq, according to a source who talked with him. (Lewis dedicated a 2003 book, The Crisis of Islam, to Rhode whom “I got to know when he was studying Ottoman registers,” Lewis told me.) And such men thought that Ahmad Chalabi—also a protégé of Lewis's—might make a fine latter-day Ataturk—strong, secular, pro-Western, and friendly towards Israel. L. Paul Bremer III, the former U.S. civil administrator in Iraq, was not himself a Chalabite, but he too embraced a top-down Kemalist approach to Iraq's resurrection. The role of the Islamic community, meanwhile, was consistently marginalized in the administration's planning. U.S. officials saw Grand Ayatollah Ali al-Sistani, the most prestigious figure in the country, as a clueless medieval relic. Even though military intelligence officers were acutely aware of Sistani's importance—having gathered information on him for more than a year before the invasion—Bremer and his Pentagon overseers initially sidelined the cleric, defying his calls for early elections.
Looking for love in all the wrong places
Lewis has long had detractors in the scholarly world, although his most ardent enemies have tended to be literary mavericks like the late Edward Said, the author of Orientalism, a long screed against the cavalier treatment of Islam in Western literature. And especially after 9/11, Bulliet and other mainstream Arabists who had urged a softer, more nuanced view of Islam found themselves harassed into silence. Lewisites such as Martin Kramer, author of Ivory Towers on Sand: The Failure of Middle Eastern Studies in America (a fierce post-9/11 attack on Bulliet) and other prominent scholars such as Robert Wood of the University of Chicago, suggested that most academic Arabists were apologists for Islamic radicalism. But now, emboldened by the Bush administration's self-made quagmire in Iraq, the Arabists are launching a counterattack. They charge that Lewis's whole analysis missed the mark, beginning with his overarching construct, the great struggle between Islam and Christendom. These scholars argue that Lewis has slept through most of modern Arab history. Entangled in medieval texts, Lewis's view ignores too much and confusingly conflates old Ottoman with modern Arab history. “He projects from the Ottoman experience onto the Middle East. But after the Ottoman Empire was disbanded, a link was severed with the rest of Arab world,” says Nader Hashemi, a University of Toronto scholar who is working on another anti-Lewis book. In other words, Istanbul and the caliphate were no longer the center of things. Turkey under Ataturk went in one direction, the Arabs, who were colonized, in another. Lewis, says Hashemi, “tries to interpret the problem of political development by trying to project a line back to medieval and early Islamic history. In the process, he totally ignores the impact of the British and French colonialists, and the repressive rule of many post-colonial leaders. He misses the break” with the past.
At least until the Iraq war, most present-day Arabs didn't think in the stark clash-of-civilization terms Lewis prefers. Bin Laden likes to vilify Western Crusaders, but until relatively recently, he was still seen by much of the Arab establishment as a marginal figure. To most Arabs before 9/11, the Crusades were history as ancient as they are to us in the West. Modern Arab anger and frustration is, in fact, less than a hundred years old. As bin Laden knows very well, this anger is a function not of Islam's humiliation at the Treaty of Carlowitz of 1699—the sort of long-ago defeat that Lewis highlights in his bestselling What Went Wrong—but of much more recent developments. These include the 1916 Sykes-Picot agreement by which the British and French agreed to divvy up the Arabic-speaking countries after World War I; the subsequent creation, by the Europeans, of corrupt, kleptocratic tyrannies in Saudi Arabia, Syria, Egypt, Iraq, and Jordan; the endemic poverty and underdevelopment that resulted for most of the 20th century; the U.N.-imposed creation of Israel in 1948; and finally, in recent decades, American support for the bleak status quo.
Yet as Bulliet writes, over the longer reach of history, Islam and the West have been far more culturally integrated than most people realized; there is a far better case for “Islamo-Christian civilization” than there is for the clash of civilizations. “There are two narratives here,” says Fawaz Gerges, an intellectual ally of Bulliet's at Sarah Lawrence College. “One is Bernard Lewis. But the other narrative is that in historical terms, there have been so many inter-alliances between the world of Islam and the West. There has never been a Muslim umma, or community, except for 23 years during the time of Mohammed. Except in the theoretical minds of the jihadists, the Muslim world was always split. Many Muslim leaders even allied themselves with the Crusaders.”
Today, progress in the Arab world will not come by secularizing it from above (Bulliet's chapter dealing with Chalabi is called “Looking for Love in All the Wrong Places”) but by rediscovering this more tolerant Islam, which actually predates radicalism and, contra Ataturk, is an ineluctable part of Arab self-identity that must be accommodated. For centuries, Bulliet argues, comparative stability prevailed in the Islamic world not (as Lewis maintains) because of the Ottomans' success, but because Islam was playing its traditional role of constraining tyranny. “The collectivity of religious scholars acted at least theoretically as a countervailing force against tyranny. You had the implicit notion that if Islam is pushed out of the public sphere, tyranny will increase, and if that happens, people will look to Islam to redress the tyranny.” This began to play out during the period that Lewis hails as the modernization era of the 19th century, when Western legal structures and armies were created. “What Lewis never talks about is the concomitant removal of Islam from the center of public life, the devalidation of Islamic education and Islamic law, the marginalization of Islamic scholars,” Bulliet told me. Instead of modernization, what ensued was what Muslim clerics had long feared: tyranny that conforms precisely with some theories of Islamic political development, notes Bulliet. What the Arab world should have seen was “not an increase in modernization so much as an increase in tyranny. By the 1960s, that prophecy was fulfilled. You had dictatorships in most of the Islamic world.” Egypt's Gamal Abdel Nasser, Syria's Hafez Assad, and others came in the guise of Arab nationalists, but they were nothing more than tyrants.
Yet there was no longer a legitimate force to oppose this trend. In the place of traditional Islamic learning—which had once allowed, even encouraged, science and advancement—there was nothing. The old religious authorities had been hounded out of public life, back into the mosque. The Caliphate was dead; when Ataturk destroyed it in Turkey, he also removed it from the rest of the Islamic world. Into that vacuum roared a fundamentalist reaction led by brilliant but aberrant amateurs like Egypt's Sayyid Qutb, the founding philosopher of Ayman Zawahiri's brand of Islamic radicalism, who was hanged by Nasser, and later, Osama bin Laden, who grew up infected by the Saudis' extreme version of Wahhabism. Even the creator of Wahhabism, the 18th-century thinker Mohammad Ibn Abd al-Wahhab, was outside the mainstream, notorious for vandalizing shrines and “denounced” by theologians across the Islamic world in his time for his “doctrinal mediocrity and illegitimacy,” as the scholar Abdelwahab Meddeb writes in another new book that rebuts Lewis, Islam and its Discontents.
Wahhabism's fast growth in the late 20th century was also a purely modern phenomenon, a function of Saudi petrodollars underwriting Wahhabist mosques and clerics throughout the Arab world (and elsewhere, including America). Indeed, the elites in Egypt and other Arab countries still tend to mock the Saudis as déclassés Bedouins who would have stayed that way if it were not for oil. “It's as if Jimmy Swaggart had come into hundreds of billions of dollars and taken over the church,” one Arab official told me. The hellish culmination of this modern trend occurred in the mountains of Afghanistan in the 1980s and '90s, when extremist Wahhabism, in the person of bin Laden, was married to Qutb's Egyptian Islamism, in the person of Zawahiri, who became bin Laden's deputy.
Critics were right to see the bin Laden phenomenon as a reaction against corrupt tyrannies like Egypt's and Saudi Arabia's, and ultimately against American support for those regimes. They were wrong to conclude that it was a mainstream phenomenon welling up from the anti-modern character of Islam, or that the only immediate solution lay in Western-style democracy. It was, instead, a reaction that came out of an Islam misshapen by modern political developments, many of them emanating from Western influences, outright invasion by British, French, and Italian colonialists, and finally the U.S.-Soviet clash that helped create the mujahadeen jihad in Afghanistan.
Academic probation
Today, even as the administration's case for invading Iraq has all but collapsed, Bernard Lewis's public image has remained largely intact. While his neocon protégés fight for their reputations and their jobs, Lewis's latest book, a collection of essays called From Babel to Dragomans: Interpreting the Middle East, received mostly respectful reviews last spring and summer. Yet events on the ground seem to be bearing out some of the academic criticisms of Lewis made by Bulliet and others. Indeed, they suggest that what is happening is the opposite of what Lewis predicted.
The administration's invasion of Iraq seems to have given bin Laden a historic gift. It has vindicated his rhetoric describing the Americans as latter-day Crusaders and Mongols, thus luring more adherents and inviting more rage and terror acts. (The administration admitted as much last summer, when it acknowledged that its “Patterns of Global Terrorism” report had been 180 degrees wrong. The report, which came out last June, at first said terrorist attacks around the world were down in 2003, indicating the war on terror was being won. Following complaints from experts, the State Department later revised the report to show that attacks were at their highest level since 1982.)
The new Iraq is also looking less and less Western, and certainly less secular than it was under Saddam. In the streets of Baghdad, once one of the most secular Arab capitals, women now go veiled and alcohol salesmen are beaten. The nation's most popular figures are Sistani and his radical Shiite rival, the young firebrand Moktada al-Sadr, who was permitted to escape besieged Najaf with his militia intact and is now seen as a champion of the Iraqi underclass. According to a survey commissioned by the Coalition Provisional Authority in late May, a substantial majority of Iraqis, 59 percent, want their religious communities to have “a great deal” of influence in selecting members of the new election commission. That's far more than those who favored tribal leaders (38 percent), political figures (31 percent), or the United Nations (36 percent). The poll also showed that Iraq's most popular political figures are religious party-affiliated leaders such as Ibrahim Jaferi and Abdul Aziz al-Hakim. To a fascinating degree, Islam now seems to be filling precisely the role Bulliet says it used to play, as a constraint against tyranny—whether the tyrant is now seen as the autocratic Americans or our man in Baghdad, interim Prime Minister Iyad Allawi.
Bremer once promised to ban Islamic strictures on family law and women's rights, and the interim constitution that he pushed through the Governing Council in March affirms that Islam is only one of the foundations of the state. But Sistani has dismissed the constitution as a transition democracy, and Iraq's political future is now largely out of American hands (though the U.S. military may continue to play a stabilizing role in order to squelch any move toward civil war). “I think the best-case scenario for Iraq is that they hold these parliamentary elections, and you get some kind of representative government dominated by religious parties,” says University of Michigan scholar Juan Cole. Even Fouad Ajami, one of Lewis's longtime intellectual allies and like him an avowed Kemalist, concluded last spring in a New York Times op-ed piece: “Let's face it: Iraq is not going to be America's showcase in the Arab-Muslim world … We expected a fairly secular society in Iraq (I myself wrote in that vein at the time). Yet it turned out that the radical faith—among the Sunnis as well as the Shiites—rose to fill the void left by the collapse of the old despotism.”
Turkey hunt
Today, the anti-Lewisites argue, the only hope is that a better, more benign form of Islam fights its way back in the hands of respected clerics like Sistani, overcoming the aberrant strains of the Osama bin Ladens and the Abu Mousab al-Zarqawis. Whatever emerges in Iraq and the Arab world will be, for a long time to come, Islamic. And it will remain, for a long time, anti-American, beginning with the likelihood that any new Iraqi government is going to give the boot to U.S. troops as soon as it possibly can. (That same CPA poll showed that 92 percent of Iraqis see the Americans as occupiers, not liberators, and 86 percent now want U.S. soldiers out, either “immediately” or after the 2005 election.) America may simply have to endure an unpleasant Islamist middle stage—and Arabs may have to experience its failure, as the Iranians have—before modernity finally overtakes Iraq and the Arab world. “Railing against Islam as a barrier to democracy and modern progress cannot make it go away so long as tyranny is a fact of life for most Muslims,” Bulliet writes. “Finding ways of wedding [Islam's traditional] protective role with modern democratic and economic institutions is a challenge that has not yet been met.”
No one, not even Bush's Democratic critics, seems to fully comprehend this. Sens. Joseph Biden (D-Del.) and Hillary Clinton (D-N.Y.) have introduced legislation that would create secular alternatives to madrassas, without realizing that this won't fly in the Arab world: All one can hope for are more moderate madrassas, because Islam is still seen broadly as a legitimating force. “What happens if the road to what could broadly be called democracy lies through Islamic revolution?” says Wood of the University of Chicago. The best hope, some of these scholars say, is that after a generation or so, the “Islamic” tag in Arab religious parties becomes rather anodyne, reminiscent of what happened to Christian democratic parties in Europe.
This may already be happening slowly in Turkey, where the parliament is dominated by the majority Islamic Justice and Development Party. The JDP leader, Prime Minister Recep Tayyip Erdogan—who was once banned from public service after reciting a poem that said “the mosques are our barracks, the domes our helmets, the minarets our bayonets, and the faithful our soldiers”—has shown an impressive degree of pragmatism in governing. But again, Turkey is a unique case, made so by Kemal and his secular, military-enforced coup back in the '20s. If Erdogan still secretly wants to re-Islamicize Turkey, he can only go so far in an environment in which the nation's powerful military twitches at every sign of incipient religiosity. Erdogan is also under unique pressure to secularize as Turkey bids to enter the European Union, which is not a card that moderate Arab secularists can hold up to win over their own populations.
Resolving the tension between Islam and politics will require a long, long process of change. As Bulliet writes, Christendom struggled for hundreds of years to come to terms with the role of religion in civil society. Even in America, separation of church and state “was not originally a cornerstone of the U.S. Constitution,” and Americans are still fighting among themselves over the issue today.
In our talk last spring, Lewis was still arguing that Iraq would follow the secular path he had laid out for it. He voiced the line that has become a favorite of Wolfowitz's, that the neocons are the most forthright champions of Arab progress, and that the Arabists of the State Department who identified with the idea of “Arab exceptionalism” are merely exhibiting veiled racism. This is the straight neocon party line, of course: If you deny that secular democracy is the destiny of every people, you are guilty of cultural snobbery. But somehow Lewis's disdain for Islam, along with his hagiographic invocation of Ataturk, managed to creep into our conversation. Threaded throughout Lewis's thinking, despite his protests to the contrary, is a Kemalist conviction that Islam is fundamentally anti-modern. In his 1996 book The Middle East: A Brief History of the Last 2,000 Years, for example, Lewis stresses the Koran's profession of the “finality and perfection of the Muslim revelation.” Even though Islamic authorities have created laws and regulations beyond the strict word of the Koran in order to deal with the needs of the moment, “the making of new law, though common and widespread, was always disguised, almost furtive, and there was therefore no room for legislative councils or assemblies such as formed the starting-point of European democracy,” he writes. In other words, Islam is an obstacle. “The Islamic world is now at the beginning of the 15th century,” Lewis told me. “The Western world is at the beginning of the 21st century.” He quickly added: “That doesn't mean [the West] is more advanced, it means it's gone through more.” Following that timeline, Lewis suggested that the Islamic world is today “on the verge of its Reformation”—a necessary divorce between religion and politics that Lewis believes has been too long in coming. This view has become conventional wisdom in Washington, resonating not only with the neocons but also with the modernization theorists who have long dominated American campuses.

Yet behind this view, say scholars like Bulliet, lies a fundamental rejection of Arabs' historical identity. The reason for that, Bulliet believes, resides in the inordinate influence that Lewis's historical studies of the Ottomans retain over his thinking—and in his 1950 visit to Turkey. Bulliet notes that as late as 2002, in the preface to the third edition of The Emergence of Modern Turkey, Lewis “talked about the incredible sense of exhilaration it felt for someone of his generation, shaped by the great war against fascism and the emerging Cold War, to see the face of the modern Middle East emerge in Turkey.” As a model, Bulliet argues, Turkey “was as vivid a vision for him 50 years later as it was at the time.”
But again, Turkey's experience after the Ottoman empire's dissolution was no longer especially relevant to what was happening in the Arab world. Ataturk, in fact, was not only not an Arab, but his approach to modernity was also most deeply influenced by the fascism of the period (Mussolini was still a much-admired model in the 1920s). And Lewis never developed a feel for what modern Arabs were thinking, especially after he began to adopt strong pro-Israel views in the 1970s. “This is a person who does not like the people he is purporting to have expertise about,” says Bulliet. “He doesn't respect them, he considers them to be good and worthy only to the degree they follow a Western path.”
The neoconservative transformationalists of the Bush administration, though informed by far less scholarship than Lewis, seemed to adopt his dismissive attitude toward the peculiar demands of Arab and Islamic culture. And now they are paying for it. The downward spiral of the U.S. occupation into bloodshed and incompetence wasn't just a matter of too few troops or other breakdowns in planning, though those were clearly part of it. In fact, the great American transformation machine never really understood much about Arab culture, and it didn't bother to try. The occupation authorities, taking a paternalistic top-down approach, certainly did not comprehend the role of Islam, which is one reason why Bremer and Co. were so late in recognizing the power of the Sistani phenomenon. The occupation also failed because of its inability to comprehend and make use of tribal complexities, to understand “how to get the garbage collected, and know who's married to who,” as Wood says. Before the war, Pentagon officials, seeking to justify their low-cost approach to nation-building, liked to talk about how much more sophisticated and educated the Iraqis were than Afghans, how they would quickly resurrect their country. Those officials obviously didn't mean what they said or act on it. In the end, they couldn't bring themselves to trust the Iraqis, and the soldiers at their command rounded up thousands of “hajis” indiscriminately, treating one and all as potential Saddam henchmen or terrorists (as I witnessed myself when, on assignment for Newsweek, I joined U.S. troops on raids in the Sunni Triangle last January).
There remains a deeper issue: Did Lewis's misconceptions lead the Bush administration to make a terrible strategic error? Despite the horrors of 9/11, did the administration transform the bin Laden threat into something grander than it really was? If the “show of strength” in Iraq was wrong-headed, as the Lewis critics say, then Americans must contemplate the terrible idea that they squandered hundreds of billions of dollars and thousands of lives and limbs on the wrong war. If Bernard Lewis's view of the Arab problem was in error, then America missed a chance to round up and destroy a threat—al Qaeda—that in reality existed only on the sick margins of the Islamic world.
It is too soon to throw all of Lewis's Kemalist ideas on the ash-heap of history. Even his academic rivals concede that much of his early scholarship is impressive; some, like Michigan's Cole, suggest that Lewis lost his way only in his later years, when he got pulled into present-day politics, especially the Israeli-Palestinian issue, and began grafting his medieval insights onto the modern Arab mindset. And whether the ultimate cause is modern or not, the Arab world is a dysfunctional society, one that requires fundamental reform. The “Arab Human Development Report,” issued in the spring of 2002 by the U.N. Development Programme, harshly laid out the failings of Arab societies. Calling them “rich, but not developed,” the report detailed the deficits of democracy and women's rights that have been favorite targets of the American neoconservatives. The report noted that the Arab world suffers from a lower rate of Internet connectivity than even sub-Saharan Africa, and that education is so backward and isolated that the entire Arab world translates only one-fifth of the books that Greece does. Some scholars also agree that in the longest of long runs, the ultimate vision of Lewis—and the neocons—will prove to be right. Perhaps, in the end, you can't Islamicize democracy, and Islam is simply standing in the way.
Iran is the best real-world test of this hypothesis right now. A quarter century after the Khomeini revolution, Iran seems to be stuck in some indeterminate middle state. The forces of bottom-up secular democratic reform and top-down mullah control may be stalemated simply because there is no common ground whatsoever between their contending visions. That's one reason the Kemalist approach had its merits, Fouad Ajami argued in a recent appearance at the Council on Foreign Relations. “I think Ataturk understood that if you fall through Islam, you fall through a trap door. And in fact, I think the journey out of Islam that Ataturk did was brilliant. And to the extent that the Muslim world now has forgotten this ... they will pay dearly for it.”
But there is no Ataturk in Iraq (though of course Chalabi, and perhaps Allawi, would still love to play that role). For now, Sistani remains the most prestigious figure in the country, the only true kingmaker. Suspicions remain in the Bush administration that Sistani's long-term goal is to get the Americans out and the Koran in—in other words, to create another mullah state as in Iran. But those who know Sistani well say he is much smarter than that. Born in Iran—he moved to Iraq in the early 1950s, around the time Lewis saw the light—Sistani has experienced up close the failures of the Shiite mullah state next door. He and the other Shiites have also suffered the pointy end of Sunni Arab nationalism, having been oppressed under Saddam for decades, and they will never sanction a return to that. So Sistani knows the last, best alternative may be some kind of hybrid, a moderately religious, Shiite-dominated democracy, brokered and blessed by him and conceived with a nuanced federalism that will give the Kurds, Sunnis and others their due. But also a regime that, somewhat like the Iranian mullahs, uses its distinctive Islamic character, and concomitant anti-Americanism and anti-Westernism, as ideological glue. For the Americans who went hopefully to war in Iraq, that option is pretty much all that's left on the table—something even Bernard Lewis may someday have to acknowledge.
Michael Hirsh is a senior editor at Newsweek, based in Washington, and author of At War with Ourselves: Why America Is Squandering Its Chance to Build a Better World (Oxford University Press).
Copyright © 2004 The Washington Monthly