Friday, May 31, 2013

Before There Was A BFI (Big, F-word Of Choice, Idiot), There Was Westbrook Pegler

The Westbrook Pegler of our time is The BFI (Rush Limbaugh, born January 12, 1951) and he was on the sidehill eating grapes through most of Pegler's heyday. The amazing thing about Pegler is that his last gig — with the John Birch Society — ended with Pegler's firing because of his extremist rhetoric. Of course — if they could read — most Teabaggers would welcome the likes of Pegler with open arms. If this is a (fair & balanced) view of invective, so be it.

[x Humanities]
Who Was Westbrook Pegler?
By David Witwer


A cartoon from the 1940s pictures a formal dance party torn apart by tuxedoed men yelling at each other, with a woman sitting in the center of the image saying, “All I did was mention Pegler!” She was referring to Westbrook Pegler, a syndicated newspaper columnist whose career stretched from 1934 until his retirement in 1962.

Controversial, even notorious, in his day, he had slipped into obscurity in the decades following his death in 1969. So much so that in 2004 William F. Buckley wrote a piece about him in the New Yorker hoping future generations would rediscover Pegler’s writing and appreciate his role as an iconoclast commentator with a firmly populist perspective. Buckley quoted approvingly an assessment offered by the liberal columnist Murray Kempton, who described Pegler as “the common man with a grievance . . . one of the American workingmen who have heroically and stubbornly struggled to remain class conscious.” The son of a journalist whose meager wages had barely been enough to provide for his family, Pegler achieved great financial and professional success in his career as a journalist. But, as Buckley pointed out, his columns featured a populist message closely tied to the years of struggle Pegler had known. “I claim authority to speak for the rabble because I am a member of the rabble in good standing” was Pegler’s own distillation of this point.

But, appropriately enough, Buckley’s article marked the beginning of a new postmortem chapter in the history of Pegler’s controversial career. Diane McWhorter fired back a response to the New Yorker article, calling it a “furry postmodern rehabilitation” that “floats a defense of Pegler while burying the charges against him.” In contrast to Buckley, McWhorter described Pegler in the 1930s and 1940s as a “leading popularizer of one of the most concerted antidemocratic crusades in this country’s history: the vicious backlash against the New Deal and the labor movement to which it gave legal protection.” Four years later, when Sarah Palin used an unattributed quote from Pegler in her acceptance speech at the Republican National Convention, commentators invoked an even stronger term to denounce the columnist. Robert F. Kennedy Jr. referred to the “fascist writer Westbrook Pegler, an avowed racist.”

While all of this was going on, I was finishing a book on Pegler’s role in exposing a major union corruption scandal in 1939 and 1940. I had spent years patiently explaining who he was to people who asked about my work and who had never heard of him. I would assure them that in his day he played a prominent role, one similar to the current best known radio or cable TV commentators. Most people looked unconvinced; they assumed this was just another case of scholarly obscurantism. Then, suddenly, he was back in the news again, occupying contentious space as in his heyday in the late 1930s and early 1940s.

In his popular newspaper columns, Pegler managed to combine an unsurpassed gift for invective, a populist crusading zeal, and occasionally a willingness to look at issues from a fresh and honest perspective. He was then still very much a working journalist who proudly did his own legwork and zealously followed up the tips he received. In 1939, these characteristics helped him uncover the mob ties of two officials high in the International Alliance of Theatrical Stage Employees, one of the most important unions in Hollywood. It would later come out in federal court that the Chicago mob had used the two officials, William Bioff and George Browne, to siphon millions of dollars from the union’s membership. Before the mob had installed Bioff in his union position, he had done various kinds of strong-arm work, including managing a brothel. Pegler discovered that, although Bioff had been convicted of pandering back in 1922, he had never actually served out his sentence. A few months later, Pegler unmasked George Scalise, the president of the Building Service Employees’ International Union, revealing the man’s previous conviction on a white slavery charge and then exposing his links to the New York mafia. Both Scalise and Bioff were nothing more than pimps, Pegler claimed. “They got their training for the post of bargaining agent [in unions] by serving as such for prostitutes,” he wrote.

He made the most of these stories, arguing that they revealed a larger problem with corruption in the labor movement. Overlooking the role of the employers who had actively promoted the careers of Scalise, Bioff, and Browne, the columnist focused his criticism on the leadership of the American Federation of Labor (AFL), charging its president, William Green, with being complacent in the face of growing corruption in the affiliated labor organizations. Pegler’s column on January 19, 1940, took the form of an open letter to President Green. Asserting that the cases of Scalise, Bioff, and Browne were no more than the tip of the iceberg, Pegler charged, “The roster of officials” in the AFL “contained the nucleus for a good, major league rogue’s gallery.” Addressing Green directly, Pegler continued, ”I can’t see how you can fail to know what he [Scalise] is or, if you do know what he is, why you haven’t had him thrown out of the American Federation of Labor. Do you think it is doing the American Federation of Labor any good to permit such a man to be president of one of your big international unions or doing the rank and file working stiffs any good to subject them to the rule of a vicious mobster?”

Pegler used his columns to emphasize the plight of the working stiffs enrolled in such unions, but unable to protect their own interests. He juxtaposed the luxurious lifestyle of Scalise with the janitors and cleaning women who made up the bulk of the membership of the Building Service Employees’ International Union. Scalise’s salary of $20,000 a year, his unlimited expense account, and the fact that he had never labored in the trade his union represented were contrasted with the working lives of the membership. “Greetings,” Pegler opened one column, addressing those union members directly. “Your honored international president, George Scalise, who learned the trade of bargaining agent in the same school that was patronized by Willie Bioff, the dictator of the amusement craft unions, which is to say as bargaining agent for prostitutes, has recently bought a country mansion far from the crowded slums in which most of you live.” “Many of you,” he continued, “—that is the chambermaids among you—are caring for upward of 20 rooms every day in hotels ranging in character from mediocre to bad, for wages of $14 a week, and $20 a week is considered to be good pay for the most prosperous of you. Out of this you pay your initiation fees and dues, plus occasional fines for such offenses as speaking disrespectfully of your union officers.”

American Federation of Labor President Green and other union leaders responded defensively. They cited the AFL’s constitution, which made unions, such as the Building Service Employees’ International, autonomous bodies and limited the ability of federation leaders to interfere. It was up to the union members to exercise their democratic rights, Green explained, and oust those officials who had betrayed their trust.

Pegler derided this response as disingenuous and indicative of the AFL leadership’s unwillingness to address the problem of union corruption. “I may be naïve,” Pegler wrote, “but if I were president of the A.F. of L. I would find somewhere in the [AFL] Constitution legal authority to disown and throw out of the movement not only Willie Bioff and George Scalise, the convicted vice-mongers who never were workers and always have been racketeers. . . . I would not wait for reporters or public prosecutors to delouse my organization, but, out of devotion to the A.F. of L. and its good name in the interests of the rank and file, I would raise pluperfect hell until all criminals and grafters were driven out.” Because he made this his goal, Pegler claimed to be not an enemy of organized labor, or of the AFL, but a better friend to it than President Green. “I am not hostile to the A.F. of L. or any of its component unions, and I am much more sympathetic with the underpaid and oppressed rank and file than many of the union officials are.”

In fact, he was hostile toward unions. This had not always been the case. He had been one of the early members of the Newspaper Guild, the union of journalists that had formed in 1933. A year later, Pegler had written a column mocking claims by the National Association of Manufacturers and the Chamber of Commerce that workers had no need of any new legislation to protect their right to join unions. Employers could be counted on to provide fair treatment to their workers without any governmental oversight, these groups claimed. Pegler’s tongue-in-cheek response was that workers somehow had misinterpreted this so-called fair treatment in the past. “Always in the past the workman’s best friend was the employer, although some of the working people, being ignorant and easily misled by self-seeking agitators, sometimes allowed themselves to doubt this and to try to befriend themselves. Some there were who hadn’t the breadth of mind or the intelligence to understand that when the employer cut a workman’s wages or laid him off or put him out of his cabin and ran him down the road at the point of a bayonet it hurt the employer worse than it hurt him.”

By 1937, Pegler had become disenchanted with the Newspaper Guild and with the labor movement in general. He became a strident critic of the Wagner Act, a pivotal New Deal reform that provided legal protections for workers to organize and set the stage for a period of dramatic union growth. Pegler depicted himself as the guardian of the rights of the individual workers, often overlooked, he claimed, in the pro-union environment of the day. He adopted their cause as his own. In the midst of a wave of sit-down strikes, Pegler asserted, “American labor is a big term. It includes millions of unorganized working people and millions of others who belong to unions but aren’t orators or parliamentarians and have little or nothing to say about the actions of the smart professionals who run their affairs.” His theme, one he went back to frequently in the years that followed, was that unions did not always represent the will of their members and that nonunion employees were workers too, whose rights should be considered. Two years later, he depicted himself as an advocate for “those who refuse to join unions, or [who] do join them under silent protest, [and] have lacked a means of presenting their case.” They were, he explained, the forgotten men of the new labor relations system. “The employer and the labor faker can make themselves heard, but the persecuted individual in the middle receives no hearing from the public and no respect from a government board [National Labor Relations Board] which was established with the frank purpose of assisting organized labor.”

Almost single-handedly, Pegler’s columns raised the issue of union corruption and placed it on the nation’s political agenda. He received a Pulitzer Prize for reporting in 1941. Time magazine acknowledged his prominence, announcing that “reader nominations for Time’s Man of the Year are now closed. Latest tabulations showed President Roosevelt in front, Comrade Stalin second and Columnist Westbrook Pegler third.” A year later, a survey of five hundred editors of daily newspapers, conducted by the University of Wisconsin School of Journalism, ranked him the nation’s “best adult columnist.” His columns went out six days a week to 174 newspapers that reached some ten million subscribers. The Saturday Evening Post touted him as “undoubtedly one of the leading individual editorial forces in the country.”

He used this editorial influence to promote new limits on union power and a host of other conservative causes, including opposition to Franklin Roosevelt and the New Deal. The columnist had come to believe that the New Deal’s expansion of federal authority and its support for organized labor represented a threat to individual freedom. He denounced the various government relief programs as political opportunism or, as he put it, “the exploitation of the great national emergency of poverty and idleness for personal profit and political power and the use of public money to put the poor in a grateful mood at election time.” Presaging a later generation of criticism directed at Washington insiders, he caricatured the administrators staffing these New Deal agencies as impractical elites cut off from the realities facing ordinary Americans. They were, he wrote, “a lot of wabble-wits stuck away in offices in Washington.” He attacked the president and his wife with a level of invective that even now, in today’s era of heightened partisan rancor, seems shocking. In February 1942, for instance, his column included the following jibe: “For all the gentle sweetness of my nature and my prose I have been accused of rudeness to Mrs. Roosevelt when I only said she was impudent, presumptuous and conspiratorial, and that her withdrawal from public life at this time would be a fine public service.”

Nor was his treatment of Congress any less sharp. A classic example of his style appeared in November 1941, when he wrote to urge Congress to pass new restrictions on unions in the face of opposition from the Roosevelt administration. “What a miserable, fumbling, timid aggregation of political trimmers and panhandlers our Congress is these days when it is openly said and never denied because it is wretchedly true, that the lawmaking body of the greatest republic on earth is afraid to pass any law that would place decent restraints on an organized mob of racketeers and dictators because the president won’t give the high-sign.” Pegler maintained that same tone throughout the column. He asserted, for instance, that congressmen “whimper like a kennel of curs because the President won’t give them his gracious permission to do their obvious duty.” “What the United States Congress lacks is guts,” the columnist concluded. He suggested that, “when the flag of the New Order is unfurled it should contain a broad yellow streak in memory of the men who sold their country out for a few lousy jobs.”

This writing style, as much as the opinions he expressed, was part of his appeal. He foreshadowed the tough-guy stance adopted by later radio and TV commentators, describing his columns as “these right-thinking, spade-calling, straight from the shoulder dispatches.” And readers responded to that style, even if many of them disagreed with the opinions expressed. As one of his biographers recalled, during the columnist’s best years, “it was possible for many millions of Americans, not all of them dunderheads, reactionaries and bigots by any means, to open their newspapers with a little quickening of the pulse, wondering whom Westbrook Pegler would clobber that day.” The columnist’s strident tone seemed to serve a useful purpose. The Los Angeles Times in 1941 referred to him as the “nation’s master prodder” in an editorial titled “Mr. Pegler Leads the Way Again!”

Even his political opponents could at times acknowledge Pegler’s appeal. Commenting on his influence in the upcoming 1942 Congressional elections, the liberal journalist George West wrote in the New Republic that “what we are up against is the Westbrook Pegler mind.” West accused Pegler of “giving greater aid and comfort to our domestic fascists than any other one man in the United States.” At the same time, however, West noted that “in spite of the exasperation and disgust that his column often inspires when he either shows a perverse failure to see straight or hits below the belt in true guttersnipe fashion,” he still considered Pegler “my favorite reactionary.” He intended this as more than just faint praise. “Pegler is an artist, a man of great courage, a hater of tyranny,” West explained, “and he calls the shots as he sees them.”

But Pegler’s reputation steadily faded in the years that followed. He developed a tendency to harp unrelentingly on the same set of issues and the occasional moments of humor and open-mindedness became increasingly rare, leaving more and more space for unmeasured invective. His publisher, Roy W. Howard, tried to warn him about this tendency. Using a boxing metaphor, Howard wrote, “You are losing a lot of points because of low swinging, heeling your glove and employing a few of the tactics of a journalistic Elbows McFadden.” A “more or less brutal bar-room fighter quality,” Howard asserted, was overtaking Pegler’s columns. “I think it is bad because it gives people the impression that you’re in a grouch with yourself and the world and that, consequently, anything you say can be discounted on the grounds that your effort does not spring from a sincere desire to redress a wrong, but rather from a grouch’s desire to kick a cat.” Pegler refused to heed the advice, and he quit working for Howard in 1944. He moved his column over to the Hearst Syndicate, and the trend Howard had warned him against continued unabated.

As a result, he lost the audience he most wanted to have. Many of the workers whose cause Pegler had always claimed to champion now viewed him with distaste. As one newspaper editor explained, “Labor—rank and filers as well as leaders—dislike and distrust him. They feel he is prejudiced and unfair.” As his columns degenerated into intemperate screeds in the years that followed, critics would tag Pegler as the “stuck whistle of journalism.” His conservatism drifted into extremism, and by 1962 he became too strident for the Hearst Syndicate, which canceled his column. After that, Pegler wrote briefly for the John Birch Society, but by that time he had become too cantankerous even for them to accept. Ω

[David Witwer is a professor of American Studies and History at Pennsylvania State University-Harrisburg. He received a BA from DePauw University and both an MA and PhD from Brown University. His most recent book is Shadow of the Racketeer: Scandal in Organized Labor (2009).]

Copyright © 2012 National Endowment for the Humanities

Since the Google Reader will go dark on July 1, 2013, another site is available for readers of a lot of blogs (or a single blog). The alternative is Feedly. For a review of Feedly by the NY Fishwrap's David Pogue, click here.

Creative Commons License
Sapper's (Fair & Balanced) Rants & Raves by Neil Sapper is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. Based on a work at sapper.blogspot.com. Permissions beyond the scope of this license may be available here.



Copyright © 2013 Sapper's (Fair & Balanced) Rants & Raves

Thursday, May 30, 2013

A Tempest In A Dismal-Science-Teapot?

In his salad days, this blogger engaged in many jousts in cyberspace — hurling flames with glee — but now, this blogger is long past such childish sport. However, it seems that a pair of Harvard economists have gotten into a virtual spat with a Princeton economist with a Nobel Prize and a HUGE vat of ink at the NY Fishwrap. Another dismal scientist weighs in with a lecture on virtual civility to the combatants. Will they heed the call to cease and desist or will they go on to MAD (Mutual Assured Destruction)? If this is an example of (fair & balanced) ado about nothing, so be it.

[x Forbes]
Optimal Civility
By Adam Ozimek


There has been some discussion of civility as a result of Reinhart and Rogoff’s open letter to Krugman, which included accusations of incivility:

We admire your past scholarly work, which influences us to this day. So it has been with deep disappointment that we have experienced your spectacularly uncivil behavior the past few weeks. You have attacked us in very personal terms, virtually non-stop, in your New York Times column and blog posts.

This has highlighted a question for me that is separate from the RR vs. K debate: what does optimal civility look like? Civility is just one characteristic, alongside related writerly traits such as generosity, harshness, and others. The case for writing with civility is that doing otherwise has a poisonous effect on debate. People whom you might persuade stop reading you, and those who write in disagreement are pressured to return the incivility, which in turn drives other readers out of the conversation. I believe it also affects those you are being uncivil toward in more than their rhetoric. Good counterarguments are much harder to acknowledge, and even to see, when they come with a spoonful of “you’re a ***** idiot.” Through this process incivility leads to defensiveness, and this in turn drives polarization. In addition, incivility drives people out of the debate. Writers may have counterarguments to some piece they read but don't want to reply because they don't wish to be the target of incivility and personal attacks. Overall, I would characterize the internet as suffering from too little rather than too much civility.

However, does incivility not have its place? If not incivility in the sense of writing rudely, there is at least a place for harshness, and for writing aimed at lowering the status of a writer in everyone's eyes. Ken Rogoff once wrote an open letter to Joe Stiglitz about a book of his that, while civil, certainly had the goal of lowering readers' opinions of Joe overall. For example, the point of the following story is to show that Joe Stiglitz has a massive ego and is overconfident, and then to draw the line between these characteristics of Joe and the weaknesses in the book:

One of my favorite stories from that era is a lunch with you and our former colleague, Carl Shapiro, at which the two of you started discussing whether Paul Volcker merited your vote for a tenured appointment at Princeton. At one point, you turned to me and said, “Ken, you used to work for Volcker at the Fed. Tell me, is he really smart?” I responded something to the effect of “Well, he was arguably the greatest Federal Reserve Chairman of the twentieth century” To which you replied, “But is he smart like us?” I wasn’t sure how to take it, since you were looking across at Carl, not me, when you said it.

My reason for telling this story is two-fold. First, perhaps the Fund staff who you once blanket-labeled as “third rate”—and I guess you meant to include World Bank staff in this judgment also—will feel better if they know they are in the same company as the great Paul Volcker. Second, it is emblematic of the supreme self-confidence you brought with you to Washington, where you were confronted with policy problems just a little bit more difficult than anything in our mathematical models. This confidence brims over in your new 282 page book.

The tone here strikes me as civil, but the overall point is fairly harsh: Joe is an overconfident jerk, and you can see it in his book and in real life. Should Ken Rogoff have left this personal criticism out of the review? If it's true, then it is important information for readers to know. After all, having an idea of how meta-rational writers are matters, and even the most civil among us include the un-meta-rationality of others as a subtle theme in our writing. Is outright accusing someone of being a poor thinker too harsh, while subtly implying it as an underlying blog theme is acceptable? This seems to be what the civil part of the blogosphere considers correct, though it's hard to say, since we talk so little about optimal civility and are rarely transparent about our thoughts on it. The rule makes a certain amount of sense as a way to criticize and lower the status of some thinkers without raising the defensiveness, mentioned earlier, of that thinker and their fans. But I do think that many readers miss these subtle underlying themes and so don't have as good an appreciation of meta-rationality as other readers do.

In addition, I mentioned earlier that incivility can drive people out of the debate entirely if they don't want to be the target of harsh personal criticism. But some people should not be involved in some debates. If they have an audience and bring only noise and misinformation, then their non-participation is important. I am skeptical, however, that incivility is the best mechanism to enforce non-participation. Dislike of harsh responses is not very strongly correlated with poor thinking, and I think this mechanism drives out more good arguments than bad.

Even the most civil writers recognize the importance of sometimes being harsh. Take Tyler Cowen's review of Naomi Klein's Shock Doctrine, which closed with this:

In the same interview, Ms. Klein also tellingly remarked, “I believe people believe their own bulls—. Ideology can be a great enabler for greed.”

When it comes to the best-selling Shock Doctrine, that is perhaps the bottom line on what Klein herself has been up to.

This is funny, harsh, and useful for readers to see Tyler saying, because it signals the extent to which he sees her writing and thinking as low quality. But in part this harshness retains its value because Tyler is, overall, a very civil writer who is sparing with his harshness. The same sentence from Brad DeLong, who is harsh and uncivil regularly, would not catch readers' attention nearly as much or carry as much informational weight.

So where does this leave us? Overall, I regard the blogosphere as having too little rather than too much civility. But harshness, personal criticism, and attempts to lower the status of other thinkers have their place. What, then, is the optimal amount of incivility? I don't have a good answer for this, or even a very coherent way to think about it, as you've clearly seen by now. Many people may be bored by discussions of how civil we should be, or more broadly how we should write, and consider it political correctness taken too seriously. But I think that charge applies only to simple calls for more or less civility. I'm asking for arguments about optimal civility, and about optimal writing style in general. If you wish to make the case for more, less, or status quo civility, then let's have it defended on these terms. Ω

[Adam Ozimek is a senior analyst at Econsult Solutions. He joined the Philadelphia consulting firm in 2008. Ozimek holds a Bachelor of Arts in Economics from West Chester University and a Master of Arts in Economics from Temple University, and he is pursuing his PhD in economics at Temple University, with specializations in industrial organization and public policy, and applied econometrics.]

Copyright © 2013 Forbes

Since the Google Reader will go dark on July 1, 2013, another site is available for readers of a lot of blogs (or a single blog). The alternative is Feedly. For a review of Feedly by the NY Fishwrap's David Pogue, click here.

Creative Commons License
Sapper's (Fair & Balanced) Rants & Raves by Neil Sapper is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. Based on a work at sapper.blogspot.com. Permissions beyond the scope of this license may be available here.



Copyright © 2013 Sapper's (Fair & Balanced) Rants & Raves

Wednesday, May 29, 2013

If "A Rose By Any Other Name Would Smell As Sweet," A Conflict Called A "War On Terror" Smells Like $hit!

Bummer! Professor Andrew J. Bacevich predicts that the yet-unnamed conflict in the Middle East that has raged for most of the 21st century will be known as The Eternal War. In a way, our perpetual war replicates former NFL coach Joe Schmidt's favorite saying: "Life is a $hit sandwich and every day you take another bite." As far as the Eternal War is concerned, every day we take another bite. If this is (fair & balanced) nomenclatural speculation, so be it.

[x HNN]
How We Name Our Wars Matters
By Andrew J. Bacevich


For well over a decade now the United States has been “a nation at war.” Does that war have a name?

It did at the outset. After 9/11, George W. Bush's administration wasted no time in announcing that the U.S. was engaged in a Global War on Terrorism, or GWOT. With few dissenters, the media quickly embraced the term. The GWOT promised to be a gargantuan, transformative enterprise. The conflict begun on 9/11 would define the age. In neoconservative circles, it was known as World War IV.

Upon succeeding to the presidency in 2009, however, Barack Obama without fanfare junked Bush’s formulation (as he did again in a speech at the National Defense University last week). Yet if the appellation went away, the conflict itself, shorn of identifying marks, continued.

Does it matter that ours has become and remains a nameless war? Very much so.

Names bestow meaning. When it comes to war, a name attached to a date can shape our understanding of what the conflict was all about. To specify when a war began and when it ended is to privilege certain explanations of its significance while discrediting others. Let me provide a few illustrations.

With rare exceptions, Americans today characterize the horrendous fraternal bloodletting of 1861-1865 as the Civil War. Yet not many decades ago, diehard supporters of the Lost Cause insisted on referring to that conflict as the War Between the States or the War for Southern Independence (or even the War of Northern Aggression). The South may have gone down in defeat, but the purposes for which Southerners had fought — preserving a distinctive way of life and the principle of states’ rights — had been worthy, even noble. So at least they professed to believe, with their preferred names for the war reflecting that belief.

Schoolbooks tell us that the Spanish-American War began in April 1898 and ended in August of that same year. The name and dates fit nicely with a widespread inclination from President William McKinley’s day to our own to frame U.S. intervention in Cuba as an altruistic effort to liberate that island from Spanish oppression.

Yet the Cubans were not exactly bystanders in that drama. By 1898, they had been fighting for years to oust their colonial overlords. And although hostilities in Cuba itself ended on August 12, they dragged on in the Philippines, another Spanish colony that the United States had seized for reasons only remotely related to liberating Cubans. Notably, U.S. troops occupying the Philippines waged a brutal war not against Spaniards but against Filipino nationalists no more inclined to accept colonial rule by Washington than by Madrid. So widen the aperture to include this Cuban prelude and the Filipino postlude and you end up with something like this: The Spanish-American-Cuban-Philippines War of 1895-1902. Too clunky? How about the War for the American Empire? This much is for sure: rather than illuminating, the commonplace textbook descriptor serves chiefly to conceal.

Strange as it may seem, Europeans once referred to the calamitous events of 1914-1918 as the Great War. When Woodrow Wilson decided in 1917 to send an army of doughboys to fight alongside the Allies, he went beyond Great. According to the president, the Great War was going to be the War To End All Wars. Alas, things did not pan out as he expected. Perhaps anticipating the demise of his vision of permanent peace, War Department General Order 115, issued on October 7, 1919, formally declared that, at least as far as the United States was concerned, the recently concluded hostilities would be known simply as the World War.

In September 1939 — presto chango! — the World War suddenly became the First World War, the Nazi invasion of Poland having inaugurated a Second World War, also known as World War II or more cryptically WWII. To be sure, Soviet dictator Josef Stalin preferred the Great Patriotic War. Although this found instant — almost unanimous — favor among Soviet citizens, it did not catch on elsewhere. [Editor's note: the "Great Patriotic War" was itself an attempt to wrap up the struggle against the Nazis with the same Russian mythohistorical allure as the "Patriotic War," the fight against Napoleon in 1812.]

Does World War II accurately capture the events it purports to encompass? With the crusade against the Axis now ranking alongside the crusade against slavery as a myth-enshrouded chapter in U.S. history to which all must pay homage, Americans are no more inclined to consider that question than to consider why a playoff to determine the professional baseball championship of North America constitutes a “World Series.”

In fact, however convenient and familiar, World War II is misleading and not especially useful. The period in question saw at least two wars, each only tenuously connected to the other, each having distinctive origins, each yielding a different outcome. To separate them is to transform the historical landscape.

On the one hand, there was the Pacific War, pitting the United States against Japan. Formally initiated by the December 7, 1941, attack on Pearl Harbor, it had in fact begun a decade earlier when Japan embarked upon a policy of armed conquest in Manchuria. At stake was the question of who would dominate East Asia. Japan’s crushing defeat at the hands of the United States, sealed by two atomic bombs in 1945, answered that question (at least for a time).

Then there was the European War, pitting Nazi Germany first against Great Britain and France, but ultimately against a grand alliance led by the United States, the Soviet Union, and a fast-fading British Empire. At stake was the question of who would dominate Europe. Germany’s defeat resolved that issue (at least for a time): no one would. To prevent any single power from controlling Europe, two outside powers divided it.

This division served as the basis for the ensuing Cold War, which wasn’t actually cold, but also (thankfully) wasn’t World War III, the retrospective insistence of bellicose neoconservatives notwithstanding. But when did the Cold War begin? Was it in early 1947, when President Harry Truman decided that Stalin’s Russia posed a looming threat and committed the United States to a strategy of containment? Or was it in 1919, when Vladimir Lenin decided that Winston Churchill’s vow to “strangle Bolshevism in its cradle” posed a looming threat to the Russian Revolution, with an ongoing Anglo-American military intervention evincing a determination to make good on that vow?

Separating the war against Nazi Germany from the war against Imperial Japan opens up another interpretive possibility. If you incorporate the European conflict of 1914-1918 and the European conflict of 1939-1945 into a single narrative, you get a Second Thirty Years War (the first having occurred from 1618-1648) — not so much a contest of good against evil, as a mindless exercise in self-destruction that represented the ultimate expression of European folly.

So, yes, it matters what we choose to call the military enterprise we’ve been waging not only in Iraq and Afghanistan, but also in any number of other countries scattered hither and yon across the Islamic world. Although the Obama administration appears no more interested than the Bush administration in saying when that enterprise will actually end, the date we choose as its starting point also matters.

Although Washington seems in no hurry to name its nameless war — and will no doubt settle on something self-serving or anodyne if it ever finally addresses the issue — perhaps we should jump-start the process. Let’s consider some possible options, names that might actually explain what’s going on.

The Long War: Coined not long after 9/11 by senior officers in the Pentagon, this formulation never gained traction with either civilian officials or the general public. Yet the Long War deserves consideration, even though — or perhaps because — it has lost its luster with the passage of time.

At the outset, it connoted grand ambitions buoyed by extreme confidence in the efficacy of American military might. This was going to be one for the ages, a multi-generational conflict yielding sweeping results.

The Long War did begin on a hopeful note. The initial entry into Afghanistan and then into Iraq seemed to herald “home by Christmas” triumphal parades. Yet this soon proved an illusion as victory slipped from Washington’s grasp. By 2005 at the latest, events in the field had dashed the neo-Wilsonian expectations nurtured back home.

With the conflicts in Iraq and Afghanistan dragging on, “long” lost its original connotation. Instead of “really important,” it became a synonym for “interminable.” Today, the Long War does succinctly capture the experience of American soldiers who have endured multiple combat deployments to Iraq and Afghanistan.

For Long War combatants, the object of the exercise has become to persist. As for winning, it’s not in the cards. The Long War just might conclude by the end of 2014 if President Obama keeps his pledge to end the U.S. combat role in Afghanistan and if he avoids getting sucked into Syria’s civil war. So the troops may hope.

The War Against Al-Qaeda: It began in August 1996 when Osama bin Laden issued a "Declaration of War against the Americans Occupying the Land of the Two Holy Places,” i.e., Saudi Arabia. In February 1998, a second bin Laden manifesto announced that killing Americans, military and civilian alike, had become “an individual duty for every Muslim who can do it in any country in which it is possible to do it.”

Although President Bill Clinton took notice, the U.S. response to bin Laden’s provocations was limited and ineffectual. Only after 9/11 did Washington take this threat seriously. Since then, apart from a pointless excursion into Iraq (where, in Saddam Hussein’s day, al-Qaeda did not exist), U.S. attention has been focused on Afghanistan, where U.S. troops have waged the longest war in American history, and on Pakistan’s tribal borderlands, where a CIA drone campaign is ongoing. By the end of President Obama’s first term, U.S. intelligence agencies were reporting that a combined CIA/military campaign had largely destroyed bin Laden’s organization. Bin Laden himself, of course, was dead.

Could the United States have declared victory in its unnamed war at this point? Perhaps, but it gave little thought to doing so. Instead, the national security apparatus had already trained its sights on various al-Qaeda “franchises” and wannabes, militant groups claiming the bin Laden brand and waging their own version of jihad. These offshoots emerged in the Maghreb, Yemen, Somalia, Nigeria, and — wouldn’t you know it — post-Saddam Iraq, among other places. The question as to whether they actually posed a danger to the United States got, at best, passing attention — the label “al-Qaeda” eliciting the same sort of Pavlovian response that the word “communist” once did.

Americans should not expect this war to end anytime soon. Indeed, the Pentagon’s impresario of special operations recently speculated — by no means unhappily — that it would continue globally for “at least ten to twenty years.” Freely translated, his statement undoubtedly means: “No one really knows, but we’re planning to keep at it for one helluva long time.”

The War For/Against/About Israel: It began in 1948. For many Jews, the founding of the state of Israel signified an ancient hope fulfilled. For many Christians, conscious of the sin of anti-Semitism that had culminated in the Holocaust, it offered a way to ease guilty consciences, albeit mostly at others’ expense. For many Muslims, especially Arabs, and most acutely Arabs who had been living in Palestine, the founding of the Jewish state represented a grave injustice. It was yet another unwelcome intrusion engineered by the West — colonialism by another name.

Recounting the ensuing struggle without appearing to take sides is almost impossible. Yet one thing seems clear: in terms of military involvement, the United States attempted in the late 1940s and 1950s to keep its distance. Over the course of the 1960s, this changed. The U.S. became Israel’s principal patron, committed to maintaining (and indeed increasing) its military superiority over its neighbors.

In the decades that followed, the two countries forged a multifaceted “strategic relationship.” A compliant Congress provided Israel with weapons and other assistance worth many billions of dollars, testifying to what has become an unambiguous and irrevocable U.S. commitment to the safety and well-being of the Jewish state. The two countries share technology and intelligence. Meanwhile, just as Israel had disregarded U.S. concerns when it came to developing nuclear weapons, it ignored persistent U.S. requests that it refrain from colonizing territory that it has conquered.

When it comes to identifying the minimal essential requirements of Israeli security and the terms that will define any Palestinian-Israeli peace deal, the United States defers to Israel. That may qualify as an overstatement, but only slightly. Given the Israeli perspective on those requirements and those terms — permanent military supremacy and a permanently demilitarized Palestine allowed limited sovereignty — the War For/Against/About Israel is unlikely to end anytime soon either. Whether the United States benefits from the perpetuation of this war is difficult to say, but we are in it for the long haul.

The War for the Greater Middle East: I confess that this is the name I would choose for Washington’s unnamed war and is, in fact, the title of a course I teach. (A tempting alternative is the Second Hundred Years War, the "first" having begun in 1337 and ended in 1453.)

This war is about to hit the century mark, its opening chapter coinciding with the onset of World War I. Not long after the fighting on the Western Front in Europe had settled into a stalemate, the British government, looking for ways to gain the upper hand, set out to dismantle the Ottoman Empire, whose rulers had foolishly thrown in their lot with the German Reich against the Allies.

By the time the war ended with Germany and the Turks on the losing side, Great Britain had already begun to draw up new boundaries, invent states, and install rulers to suit its predilections, while also issuing mutually contradictory promises to groups inhabiting these new precincts of its empire. Toward what end? Simply put, the British were intent on calling the shots from Egypt to India, whether by governing through intermediaries or ruling directly. The result was a new Middle East and a total mess.

London presided over this mess, albeit with considerable difficulty, until the end of World War II. At this point, by abandoning efforts to keep Arabs and Zionists from one another's throats in Palestine and by accepting the partition of India, the British signaled their intention to throw in the towel. Alas, Washington proved more than willing to assume Britain's role. The lure of oil was strong. So too were the fears, however overwrought, of the Soviets extending their influence into the region.

Unfortunately, the Americans enjoyed no more success in promoting long-term, pro-Western stability than had the British. In some respects, they only made things worse, with the joint CIA-MI6 overthrow of a democratically elected government in Iran in 1953 offering a prime example of a “success” that, to this day, has never stopped breeding disaster.

Only after 1980 did things get really interesting, however. The Carter Doctrine promulgated that year designated the Persian Gulf a vital national security interest and opened the door to greatly increased U.S. military activity not just in the Gulf, but also throughout the Greater Middle East (GME). Between 1945 and 1980, considerable numbers of American soldiers lost their lives fighting in Asia and elsewhere. During that period, virtually none were killed fighting in the GME. Since 1990, in contrast, virtually none have been killed fighting anywhere except in the GME.

What does the United States hope to achieve in its inherited and unending War for the Greater Middle East? To pacify the region? To remake it in our image? To drain its stocks of petroleum? Or just keeping the lid on? However you define the war’s aims, things have not gone well, which once again suggests that, in some form, it will continue for some time to come. If there’s any good news here, it’s the prospect of having ever more material for my seminar, which may soon expand into a two-semester course.

The War Against Islam: This war began nearly 1,000 years ago and continued for centuries, a storied collision between Christendom and the Muslim ummah. For a couple of hundred years, periodic eruptions of large-scale violence occurred until the conflict finally petered out with the last crusade sometime in the fourteenth century.

In those days, many people had deemed religion something worth fighting for, a proposition to which the more sophisticated present-day inhabitants of Christendom no longer subscribe. Yet could that religious war have resumed in our own day? Professor Samuel Huntington thought so, although he styled the conflict a “clash of civilizations.” Some militant radical Islamists agree with Professor Huntington, citing as evidence the unwelcome meddling of “infidels,” mostly wearing American uniforms, in various parts of the Muslim world. Some militant evangelical Christians endorse this proposition, even if they take a more favorable view of U.S. troops occupying and drones targeting Muslim countries.

In explaining the position of the United States government, religious scholars like George W. Bush and Barack (Hussein!) Obama have consistently expressed a contrary view. Islam is a religion of peace, they declare, part of the great Abrahamic triad. That the other elements of that triad are likewise committed to peace is a proposition that Bush, Obama, and most Americans take for granted, evidence not required. There should be no reason why Christians, Jews, and Muslims can’t live together in harmony.

Still, remember back in 2001 when, in an unscripted moment, President Bush described the war barely begun as a “crusade”? That was just a slip of the tongue, right? If not, we just might end up calling this one the Eternal War. Ω

[Andrew J. Bacevich graduated from West Point in 1969 and served in the U.S. Army during the Vietnam War, with a tour in Vietnam from the summer of 1970 to the summer of 1971. Afterwards he held posts in Germany, the United States, and the Persian Gulf up to his retirement from the service with the rank of Colonel in the early 1990s. He holds a Ph.D. in American Diplomatic History from Princeton University, and taught at West Point and Johns Hopkins University prior to joining the faculty at Boston University in 1998 as a professor of international relations and director of its Center for International Relations (from 1998 to 2005). Bacevich is the author of several books, including American Empire: The Realities and Consequences of US Diplomacy (2002) and The New American Militarism: How Americans are Seduced by War (2005). Bacevich's newest book will be released in September 2013: Breach of Trust: How Americans Failed Their Soldiers and Their Country. He has been "a persistent, vocal critic of the US occupation of Iraq, calling the conflict a catastrophic failure." In March of 2007, he described George W. Bush's endorsement of such "preventive wars" as "immoral, illicit, and imprudent."

On May 13, 2007, Bacevich's son, also named Andrew J. Bacevich, died in action in Iraq, when he was killed by a suicide bomber south of Samarra in Salah Ad Din Province. The younger Bacevich, 27, was a First Lieutenant. He was assigned to the 3rd Battalion, 8th U.S. Cavalry Regiment, 1st Cavalry Division.]

Copyright © 2013 History News Network

Since the Google Reader will go dark on July 1, 2013, another site is available for readers of a lot of blogs (or a single blog). The alternative is Feedly. For a review of Feedly by the NY Fishwrap's David Pogue, click here.

Creative Commons License
Sapper's (Fair & Balanced) Rants & Raves by Neil Sapper is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. Based on a work at sapper.blogspot.com. Permissions beyond the scope of this license may be available here.



Copyright © 2013 Sapper's (Fair & Balanced) Rants & Raves

Tuesday, May 28, 2013

Hold The Phones! An "Inner Voice" Speaks To Tom Tomorrow During A CNN Report On The OK-Tornado!

In this scandal-ridden season, Tom Tomorrow has gone all Benghazi with today's 'toon. In the Tom Tomorrow version, Wolf Blitzer receives a non-religionist sermon from an OK-tornado survivor. Did it happen? Only Tom Tomorrow knows. If this is (fair & balanced) scandalogy, so be it.

[x YouTube/WashingtonFreeBeacon Channel]
Wolf Blitzer Asks Tornado Survivor if She Thanked the Lord; Replies She's an Atheist



[x This Modern World]
Wolf, Theologian
By Tom Tomorrow (Dan Perkins)


(Click to embiggen — H/T to Daily Kos — or use the zoom feature of your browser) Ω

Tom Tomorrow/Dan Perkins

[Dan Perkins is an editorial cartoonist better known by the pen name "Tom Tomorrow". His weekly comic strip, "This Modern World," which comments on current events from a strong liberal perspective, appears regularly in approximately 150 papers across the U.S., as well as on Daily Kos. The strip debuted in 1990 in SF Weekly. Perkins, a long time resident of Brooklyn, New York, currently lives in Connecticut. He received the Robert F. Kennedy Award for Excellence in Journalism in both 1998 and 2002. When he is not working on projects related to his comic strip, Perkins writes a daily political weblog, also entitled "This Modern World," which he began in December 2001.]

Copyright © 2013 Tom Tomorrow (Dan Perkins)

Since the Google Reader will go dark on July 1, 2013, another site is available for readers of a lot of blogs (or a single blog). The alternative is Feedly. For a review of Feedly by the NY Fishwrap's David Pogue, click here.

Creative Commons License
Sapper's (Fair & Balanced) Rants & Raves by Neil Sapper is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. Based on a work at sapper.blogspot.com. Permissions beyond the scope of this license may be available here.



Copyright © 2013 Sapper's (Fair & Balanced) Rants & Raves

Monday, May 27, 2013

Roll Over St. Hofstadter! Make Way For A Real MythBuster! Heeere's Patrick Smith!

As we take another walk through MythWorld (aka the United States of America), it is good to remember that The Great Hofstadter taught us that

"By myth..., I do not mean an idea that is simply false, but rather one that so effectively embodies men's values that it profoundly influences their way of perceiving reality and hence their behavior. In this sense, myths may have varying degrees of fiction or reality...."

And, today, Patrick Smith teaches us that the Teabaggers are not clinging to their guns and religion as much as they are clinging to their myths of national exceptionalism and national omnipotence. Before reading Patrick Smith's diagnosis of what ails us, this blog recommends (not requires) that you go to your closet and drag out your hairshirt and wear it while reading his sensible pronouncement. If this is (fair & balanced) national reality, so be it.

[x Salon]
American Exceptionalism Is A Dangerous Myth
By Patrick Smith


At one end of the Reflecting Pool in Washington, DC, in the expanse between the Washington Monument and the Lincoln Memorial, the Bush administration authorized a memorial to World War II. This was a matter of months before the events of September 11. It seemed a strange design when it was first shown in the early summer of 2001, and so it proved when the monument was finished and open to the public in 2004. It consists of fifty-six granite pillars arranged in two half-circles around a pool, each pillar standing for a state or territory, each endowed with a bronze wreath. Each side of the entranceway—graceful granite steps down to the level of the pool—is lined with a dozen bas-relief bronzes depicting important moments in either the European or the Pacific war. At the opposite end of the small circular pool, a “freedom wall” commemorates the 400,000 American dead with 4,000 gold stars.

This message, chiseled into a stone tablet, greets the visitor to the World War II Memorial:

Here in the presence of Washington and Lincoln, one the eighteenth-century father and the other the nineteenth-century preserver of our nation, we honor those twentieth-century Americans who took up the struggle during the Second World War and made the sacrifices to perpetuate the gift our forefathers entrusted to us, a nation conceived in liberty and justice.

One must spend a certain time at the memorial to grasp the message it is conveying. This has to do with the monument’s style, as the bas-relief bronzes and the welcoming inscription suggest. This is not a memorial built by people of the early twenty-first century. Part of its purpose, indeed, is to erase all that Americans did between 1945 and 2001 so that we might insert ourselves into the morally pure era (supposedly, as we have reimagined it) of the Second World War. It functions, then, a little like Williamsburg or Sturbridge Village: It is history that is not-history, or not-history dressed up as history. It is history, in short, for those who are devoid of memory. The architect—Friedrich St. Florian, whose studio is in Rhode Island—accomplished this by designing in the style sometimes called modern classical. The modern classical style was popular in the 1930s and 1940s. It is characterized by mass and volume in its forms and simplified articulations of minimal detail. Roosevelt might have built in this style, as Stalin or Mussolini might have.

St. Florian’s project, then, is a monument to forgetting, not remembering. There is no bas-relief dedicated to the atomic bomb attacks on Japan or the fire-bombings in Germany; all that occurred after 1945 disappears into the memorial’s antiquated style. We have a hint of this if we consider the date of its conception and construction. The first decade of our new century was marked by a strong, quite evident nostalgia for the Second World War. One found it in best-selling books (The Greatest Generation) and in popular films (“Pearl Harbor,” “Schindler’s List”). The monument is of a piece with these cultural productions. It is a memorial as we imagine such a thing would have been made at the time being memorialized. It is a reenactment of a sorrow that is beyond us to feel now. One cannot say this about the other monuments ranged around the Reflecting Pool. They are not reenactments; they are not in quotation marks. In this case, one is placed back in the 1940s so as to see the forties. It is history for people who cannot connect with history. Nostalgia is always an expression of unhappiness with the present, and never does it give an accurate accounting of the past. What are we to say about a monument to a nostalgia for nostalgia?

* * *

The various symptoms of America’s dysfunctional relationship with its past are all in evidence in the Tea Party, the political movement formed in 2009 and named for the Boston Tea Party of 1773. It would be remiss not to note this. Much has been written about the Tea Party’s political positions: Its members are radically opposed to taxation and favor a fundamentalist idea of the infallibility of markets and an almost sacramental interpretation of the Constitution. They cannot separate religion from politics, and they consider President Obama either a socialist or a Nazi or (somehow) both. They hold to a notion of the individual that the grizzliest fur trapper west of the Missouri River 170 years ago would have found extreme. When the Tea Party first began to gather national attention, many considered it a caricature of the conservative position that held too distorted an idea of American history to last any consequential amount of time. Plainly this has been wrong, at least so far, given the number of seats the movement won in the legislative elections of November 2010: At this writing, they number sixty-two in the House of Representatives.

“Take our country back” is among the Tea Party’s more familiar anthems. And among skeptics it is often asked, “Back to what?” I have heard various answers. Back to the 1950s is one, and this is plausible enough, given the trace of the movement’s bloodlines back to the John Birch Society and others among the rabidly anticommunist groups active during the Cold War’s first decade. But the answer I prefer is the eighteenth century—or, rather, an imaginary version of the eighteenth century. A clue to the collective psychology emerged in the movement’s early days, when adherents dressed in tricorn hats, knee breeches, and brass-buckled shoes. This goes to the true meaning of the movement and explains why it appeared when it did. One cannot miss, in the movement’s thinking and rhetoric, a desire for a mythical return, another “beginning again,” a ritual purification, another regeneration for humanity.

Whatever the Tea Party’s unconscious motivations and meanings—and I count these significant to an understanding of the group—we can no longer make light of its political influence; it has shifted the entire national conversation rightward—and to an extent backward, indeed. But more fundamentally than this, the movement reveals the strong grip of myth on many Americans—the grip of myth and the fear of change and history. In this, it seems to me, the Tea Party speaks for something more than itself. It is the culmination of the rise in conservatism we can easily trace to the 1980s. What of this conservatism, then? Ever since Reagan’s “Morning in America” campaign slogan in 1984 it has purported to express a new optimism about America. But in the Tea Party we discover the true topic to be the absence of optimism and the conviction that new ideas are impossible. Its object is simply to maintain a belief in belief and an optimism about optimism. These are desperate endeavors. They amount to more expressions of America’s terror in the face of history. To take our country back: Back to its mythological understanding of itself before the birth of its own history is the plainest answer of all.

I do not see that America has any choice now but to face this long terror. America’s founding was unfortunate in the fear and apprehension it engendered, and unfortunate habits and impulses have arisen from it. These are now in need of change—a project of historical proportion. Can we live without our culture of representation, our images and symbols and allusions and references, so casting our gaze forward, not behind us? Can we look ahead expectantly and seek greatness instead of assuming it always lies behind us and must be quoted? Can we learn to see and judge things as they are? Can we understand events and others (and ourselves most of all) in a useful, authentic context? Can we learn, perhaps most of all, to act not out of fear or apprehension but out of confidence and clear vision? In one way or another, the dead end of American politics as I write reminds us that all of these questions now urgently require answers. This is the nature of our moment.

* * *

In some ways the American predicament today bears an uncanny resemblance to that of the 1890s. At home we face social, political, and economic difficulties of a magnitude such that they are paralyzing the nation and pulling it apart all at once. Abroad, having fought two costly and pointless wars since 2001, we are challenged to define our place in the world anew—to find a new way of venturing forth into it. The solutions America chose a century ago are not available to us now. But the choices then are starkly ours once again.

Our first choice is to accept the presence of these choices in our national life. This is a decision of considerable importance. To deny that it is there amounts to a choice in itself—the gravest Americans can make. When America entered history in 2001, it was no one’s choice, unless one wants to count Osama bin Laden. This means that America’s first choice lies between acceptance and denial. The logic of our national reply seems perfectly evident. To remain as we are, clinging to our myths and all that we once thought made us exceptional, would be to make of our nation an antique, a curiosity of the eighteenth century that somehow survived into the twenty-first. Change occurs in history, and Americans must accept this if they choose to change.

But how does a nation go about accepting fundamental changes in its circumstances—and therefore its identity, its consciousness? How does a nation begin to live in history? In an earlier essay I wrote about what a German thinker has called the culture of defeat and its benefits for the future. Defeat obliges a people to reexamine their understanding of themselves and their place in the world. This is precisely the task lying at America’s door, but on the basis of what should Americans take it up? “Defeat” lands hard among Americans. The very suggestion of it is an abrasion. We remain committed to winning the “war on terror” Bush declared in 2001, even if both the term and the notion have come in for scrutiny and criticism. Who has defeated America such that any self-contemplation of the kind I suggest is warranted?

The answer lies clearly before us, for we live among the remains of a defeat of historical magnitude. We need only think carefully to understand it. We need to think of defeat in broader terms— psychological terms, ideological terms, historical terms. We need to think, quite simply, of who we have been—not just to ourselves but to others. Recall our nation’s declared destiny before and during its founding. The Spanish-American War and all that followed—in the name of what, these interventions and aggressions? What was it Americans reiterated through all the decades leading to 2001—and, somewhat desperately, beyond that year? It was to remake the world, as Condoleezza Rice so plainly put it. It was to make the world resemble us, such that all of it would have to change and we would not. This dream, this utopia, the prospect of the global society whose imagining made us American, is what perished in 2001. America’s fundamentalist idea of itself was defeated on September 11. To put the point another way, America lost its long war against time. This is as real a defeat as any other on a battlefield or at sea. Osama bin Laden and those who gave their lives for his cause spoke for no one but themselves, surely. But they nonetheless gave substantial, dreadful form to a truth that had been a long time coming: The world does not require America to release it into freedom. Often the world does not even mean the same things when it speaks of “freedom,” “liberty,” and “democracy.” And the world is as aware as some Americans are of the dialectic of promise and self-betrayal that runs as a prominent thread through the long fabric of the American past.

Look upon 2001 in this way, and we begin to understand what it was that truly took its toll on the American consciousness. Those alive then had witnessed the end of a long experiment—a hundred years old if one counts from the Spanish war, two hundred to go back to the revolutionary era, nearly four hundred to count from Winthrop and the Arbella. I know of no one who spoke of 2001 in these terms at the time: It was unspeakable. But now, after a decade’s failed effort to revive the utopian dream and to “create reality,” we would do best not only to speak of it but to act with the impossibility of our inherited experiment in mind—confident that there is a truer way of being in the world.

* * *

Where would an exploration rooted in a culture of defeat land Americans, assuming such an exercise were possible? That it would be a long journey is the first point worth making. There is time no longer for our exceptionalist myths, but to alter our vision of ourselves and ourselves in the world would be no less formidable a task for Americans than it would be (or has been) for anyone else. History suggests that we are counting in decades, for there would be much for Americans to ponder—much that has escaped consideration for many years. History also suggests that the place most logically to begin would be precisely with history itself. It is into history, indeed, that this exploration would deliver us.

In the late 1990s, a time of considerable American triumphalism at home and abroad, the University of Virginia gathered a group of scholars, thinkers, historians, and writers to confer as to an interesting question. The room was filled with liberals and left-liberals. Their question was, “Does America have a democratic mission?”

It seemed significant even that the topic would be framed as a question. Would anyone in Wilson’s time have posed one like it? This would not, indeed, have been so just a few years earlier—or a few years later. But it was so then, a line of inquiry launched not quite a decade after the Cold War’s end, three years before the events of September 11. Not so curiously, many of those present tended to look to the past. Van Wyck Brooks’s noted phrase, “a usable past,” was invoked: If we are to understand our future, and whatever our “mission” may be, we had better begin by examining who we have been.

Any such exercise would require a goodly measure of national dedication. It would require “a revolution in spirit,” as the social historian Benjamin Barber has put it. But it would bring abundant enhancements. It would begin to transform us. It would make us a larger people in the best sense of the phrase. There is a richness and diversity to the American past that most of us have never registered. Much of it has been buried, it seems to me, because it could not be separated from all that had to be forgotten. Scholarship since the 1960s has unearthed and explored much of this lost history. But scholarship—as has been true for more than a century—proceeds at some distance from public awareness. We now know that the Jeffersonian thread in the American past, for instance, was much more complex, more dense and layered, than Americans have by tradition understood it. In the supposed torpor of the early nineteenth century we find variations of political movements as these were inherited from England. We find among the Democrats the roots of the Populists, the Progressives, democratic socialists, and social democrats. These groups were not infrequently the product of ferment within the liberal wings of various Christian denominations. There was nothing “un-American” about any of them, and all of them were at least partly historicist: They saw America as it was and as it was changing. They understood the need for the nation to move beyond its beginnings to take account of the new.

One need not subscribe to the politics of these or any other formations in history to derive benefit from an enriched and enlivened knowledge of them. They enlarge and revitalize the American notion of “we.” And in so doing, history opens up more or less countless alternatives—alternative discourses, alternative ideas of ourselves, alternative politics, alternative institutions. All this is simply to cast history as a source of authentic freedom. At the moment our standard view of the American past lies behind us like a “flattened landscape,” as one of our better historians put it some years ago. We are thus unaccustomed to a depth and diversity in our past that present us with a privilege, a benefit, and a duty all at once.

Could Americans bear an unvarnished version of their past—a history with its skin stripped back? History as we now have it seems necessary to bind Americans, to make Americans American. Think merely of the twentieth century and all the wreckage left behind in it in America’s name, and it is plain that the question is difficult and without obvious answers. But something salutary is already occurring in our midst. Historians of all kinds have begun new explorations of the past. There are African-American projects, Native American projects, projects concerning foreign affairs, diplomacy, war, and all the secrets these contain. This is the antitradition I mentioned in an earlier essay coming gradually into its own. It is remarkable how sequestered from all this work our public life has proven. The temptations of delusion are always great, and most of America’s political figures succumb to them. But time will wear away this hubris. In the best of outcomes, the antitradition will be understood as essential to understanding the tradition.

I once came across a small but very pure example of a nation altering its relation to its past. It was in Guatemala. The long, gruesome civil war there, which ended in the 1990s, had made of the country at once a garden of tragic memories and a nation of forgetters. The Mayans were virtually excluded from history, as they always had been, and the country was deeply divided between los indigenes and those of Spanish descent.

Then a journalist named Lionel Toriello, whose forebears had been prominent supporters of the Arbenz government in the 1950s (until Americans arranged a coup in 1954), assembled two million dollars and 156 historians. They spent nearly a decade researching, writing, editing, and peer-reviewing work that was eventually published as a six-volume Historia General de Guatemala. Its intent was “pluralistic,” Toriello explained during my time with him. It provided as many as three points of view on the periods and events it took up. So it purported to be not a new national narrative so much as an assemblage of narratives from which other narratives could arise. It was a bed of seed, then. Inevitably, Toriello’s project had critics of numerous perspectives. Unquestionably, the Historia General was the most ambitious history of themselves Guatemalans had ever attempted.

It was an unusual experiment. One of the things Toriello made me realize was that one needs a new vocabulary if one is to explore the past, render it in a new way, and then use it to assume a new direction. A culture of defeat requires that the language be cleansed. All the presumption buried in it must be identified and removed. Another thing Toriello showed me was that this could be done, even in a small nation torn apart by violence and racial exclusion. The renovated vocabulary arises directly from the history one generates.

None of this, it seems to me, is beyond the grasp of Americans. To consider it so is merely to acknowledge the extent to which the nation famous for its capacity to change cannot change. It is to give in to the temptations of delusion. I do not think “change” took on so totemic a meaning during Barack Obama’s 2008 campaign by coincidence. I also think the ridicule of this thought coming from Obama’s critics bears interpretation. Change is a testament to strength. But as so often in the past, Americans came to fear what they desired, causing many to take comfort in the next set of constructed political figures promising that, no, nothing at all need change.

An inability to change is symptomatic of a people who consider themselves chosen and who cannot surrender their chosenness. When we look at our nation now, do we see the virtuous republic our history has always placed before us as if it were a sacred chalice? The thought seems preposterous. America was exceptional once, to go straight to the point. But this was not for the reasons Americans thought of themselves as such. America was exceptional during the decades when westward land seemed limitless—from independence until 1890, if we take the census bureau’s word for the latter date. For roughly a century, then, Americans were indeed able to reside outside of history—or pretend they did. But this itself, paradoxically, was no more than a circumstance of history. Americans have given the century and some since over to proving what cannot be proved. This is what lends the American century a certain tragic character: It proceeded on the basis of a truth that was merely apparent, not real. Do Americans have a democratic mission? Finally someone has asked. And the only serious answer is, “They never did.”

* * *

Recognizing the truth of this is likely to lead Americans toward a distinction they have heretofore ignored. It is the distinction between a strong nation and one that is merely powerful. One senses that the difference between the two was plain to Americans of the eighteenth century. But then America left this distinction behind. And how fitting, we may now note, that America led the rest of the world into the twentieth century, for if the nineteenth was the century of history, the twentieth was the century of power.

Power is a material capability. It is a possession with no intrinsic vitality of its own. It has to do with method as opposed to purpose or ideals—techne as against telos. It is sheer means, deployment. Power tends to discourage authentic reflection and considered thought, and, paradoxically, produces a certain weakness in those who have it. This is the weakness that is born of distance from others. In the simplest terms, it is an inability to see and understand others and to tolerate difference. It also induces a crisis of belief. Over time a powerful democracy’s faith in itself quivers, while its faith in power and prerogative accumulates. It is true that in the modern world power derives primarily from science. But it is not manipulated—extended or operated, if you like—by scientists. Neither does the use of power require a scientist’s intelligence. It is thus that one may find in twentieth-century history modern technologies deployed by people of premodern consciousness. And we cannot exclude Americans when we consider this latter occurrence.

Americans found in power an especially compelling temptation when it began to accrue to them. It was the temptation of certainty without anxiety. It seemed, from the Spanish war onward, within America’s grasp to leave behind its old apprehensions at last. The twentieth century thus became the century of power because Americans, as I have already suggested, became ever more reliant upon power alone as its years and decades went by. When power functions by itself, means and ends are inevitably confused; and means, eventually, are taken to be their own end: Power is manifest, that is to say, with no intent other than to manifest itself. The Spanish war was therefore a good introduction to the century we would name for ourselves. Americans claimed to feel deeply for the victims of Spanish oppression, but their own oppression, notably in the Philippines, turned out to be no improvement. The true purpose of the Spanish campaign, as the histories make plain, was display—a demonstration of power. At the other end of the century, it is useful to review Washington’s various “nation-building” projects in this light.

Reflecting upon those final years before 2001, it is not difficult to understand in our contemporary terms the distinction between a powerful nation and a strong one. Strength derives from who one is—it is what one has made of oneself by way of vision, desire, and dedication. It has nothing to do with power as we customarily use this term. Paradoxically, it is a form of power greatly more powerful than the possession of power alone. Strength is a way of being, not a possession. Another paradox: Power renders one vulnerable to defeat or failure, and therefore to fear. Strength renders one not invulnerable—no one ever is—but able to recover from defeats and failures. The history of the past century bears out these distinctions very clearly. Most of all, a strong nation is capable of self-examination and of change. It understands where it is in history—its own and humankind’s.

It is curious to return briefly to Woodrow Wilson’s list of complaints about American democracy at the start of the American century. “We have not escaped the laws of error that government is heir to,” Wilson wrote in 1901. Then came his litany: riots and disorder, an absence of justice, clashes between management and labor, poorly governed cities. “As we grow older, we also grow perplexed and awkward in the doing of justice and in the perfecting and safeguarding of liberty,” Wilson concluded. “It is character and good principle, after all, which are to save us, if we are to escape disorder.”

Wilson wrote at a curious moment in terms of American power and American strength. What he described, plainly enough, was a nation nervous about losing its strength. And with the invasions of Cuba and the Philippines, America began the effort to make itself a powerful nation instead of a strong one. This was the choice it made when it determined to express itself by way of conquest abroad rather than reformation at home. And from Wilson’s day until ours, the progress has proven to be from one to the other, strength to power, as if the one excluded the other. Wilson was a historicist; many intellectuals were by his day. But Wilson was a deeply certain believer, too. He preserved America’s exceptionalism as Frederick Jackson Turner did: by placing America ever at history’s forward edge.

Among Wilson’s useful insights was that Americans possessed a system that did not have the perpetual capacity to self-correct. It required the attention of those living in it. Otherwise it would all come to “disorder.” And this is among the things Americans are now faced with in a different way: Theirs is a system, a set of institutions, that less and less possesses the ability to correct its errors and injustices and malfunctions. Time, to put it another way, has taken its toll. This is a stinging judgment, fraught with implications. But at least since the Cold War, it has been necessary to cancel all previous assumptions that American political and social institutions are able to correct themselves as they are currently constituted. The presidential election of 2000 can be considered a tragedy of historic importance in this respect. Institutional frailty is among the attributes of republics as they mature and come to be in need of repair. It is a sign that strength has deserted them. The polity requires tending. Its institutions cannot, any longer, be left to themselves.

* * *

What are America’s first steps forward, then, given these inheritances?

The first is to look and listen in another way, to see and hear from within the space of history. It is to achieve a condition of history with memory. This means to come gradually to accept that one lives in historical time and is as subject to its strictures, its triumphs, and its miseries as anyone else. It means accepting that encounters with others are an essential feature of the world we enter upon. Equally, we must begin to make certain links so that we know who we are and what it is we have been doing—the connections between feeling and time and between vigilance and distance and history are examples. Others have done this, made the passage I am suggesting is upon us. In time, history teaches, it becomes clear that it is more painful to resist this than it is to accept it.

I have become fascinated with the character of early Americans—even if it is an idealized self-image cultivated by slaveowners, murderers of Native Americans, and witch-hunting zealots. A people of sentiment, an affectionate people, a people of virtue and understanding, gentle toward others: It is like holding up a mirror and not recognizing the face staring back from it. Even the vocabulary: It has a faintly eighteenth-century scent to it. Mercy Otis Warren’s History is full of this terminology. But consider these attributes as they might be understood in our time. There are twenty-first-century ways to describe them—terms developed among philosophers concerned with the progress of human ties. We can now speak of empathy, meaning that one sees another not simply as an object but as another subject—an equivalent. This is achieved through a recognition of another’s perspective, intentions, and emotions. This makes one’s objective experiences available to all other subjects: One feels oneself to be a subject among other subjects. These concepts are drawn from what I will call for simplicity’s sake the discourse of Self and Other, which developed in Europe at mid-twentieth century. This line of thought did not travel well in America. Like the ideas that animated Europe in the nineteenth century, it arrived among Americans in brackets: This is what they are up to across the water. The discourse of Self and Other concerns the evolution of human relations, which are recognized as plural as opposed to unified. And human relations, as the philosopher Emmanuel Lévinas pointed out, take place in time. As I have already suggested, time is our shared medium.

In all of these matters Americans grew deficient during the last century. One must have a strong sense of self to encounter others and accept difference, and Americans came to lack this. The Cold War, in particular, produced a certain personality such that the concepts I have just described may seem foreign, or fey, or faintly beside the point. This reflects our error. And to understand this error now would equip Americans with the vocabulary, the character and good principle that will be useful in the century to come. To know others well, or let us say better than Americans do, will be part of what it means to be a strong nation in the twenty-first century. The thought seems to imply a reconstruction of the American identity. This is precisely the intended meaning. The project has been accomplished before.

Two American figures are worth considering in this context. I have already noted both. One is Wendell Willkie, the failed Republican presidential candidate in 1940. Midway through World War II Roosevelt dispatched Willkie to tour the world and describe his thoughts as to how our planet was likely to emerge from the war. One World was the result, a now-forgotten book that was at the time widely and eagerly read. The other figure is Jimmy Carter, our thirty-ninth president. Both of these men are often sources of derision among Americans. A certain wide-eyed fatuousness commonly attaches to them. I am not unaware of their reputations in this regard. I simply take issue with such presuppositions. In my view both represented lost opportunities: Willkie by way of the idealism of the immediate postwar period, which was palpable even if brief, and Carter in the chance to begin again in a new direction during the post-Vietnam period—also a window briefly opened. Both men displayed many of the qualities the current century will ask of us. Both were clear in the matter of history. Both drew from rich but obscured traditions in the American past. Both understood, it seems to me, the difference between strength and power. Both knew that the former requires more courage than the latter—the courage to interact with those of different beliefs, the confidence to stay the use of force, the poise to put America’s inbred fear aside and act not out of vengeance but from considered wisdom.

We should remember figures such as Willkie and Carter better than we do. It would enlarge our idea of who we are and of what it means to be American. The inability to advance beyond common caricatures of these two and others is nothing more than a measure of our inability to reimagine ourselves. It is by way of such people, whoever they turn out to be, that we can regain some realistic idea of utopia—utopia in this sense meaning simply a future that transcends the present. Democracy has always been fragile—as delicate as a length of eighteenth-century lace. It is evanescent: Much is done in its name that is not genuinely a reflection of it. Our moment in history, our debt to the future, requires us to begin conceiving of an extensively reorganized society. It requires demilitarization and re-democratization, to take ready examples.

Our difficulties in both respects reflect a failure to keep pace with the progress we have engendered, with the speed we have ourselves created—with history’s acceleration, which is, in the end, our own doing. “The acquisition of new implements of power too swiftly outruns the necessary adjustment of habits and ideas to the novel conditions created by their use.” That is the historian Carl Becker, lecturing at Stanford in 1935. It is prescient by half a century, perhaps more. The core issue is one of control—control over what we are able to do. Closer to our time, the French thinker Paul Virilio suggests that we have to add to our technological revolutions a revolution of consciousness, of ideas, such that our thinking and our purposes are elevated to a value equivalent to our capabilities. We do not typically recognize it, but at present these are unmatched. Science can no longer converge with technology alone, Virilio argues; in our time it must also be animated by philosophy. This is one of the twentieth century’s more profound failings.

All this begins to define our responsibility as we free ourselves of national myths. If there is a case for optimism, it lies in a reconstitution of our thought, our intelligence, in this fashion. Much that is now accepted as fated and beyond our capacity to change must be understood otherwise. We live within a strange contradiction, sour fruit of the century now gone by. In the spheres of science and technology we assume ourselves to be without limit. But we give ourselves no credit for being able to make social, economic, or political change—anthropological change altogether. In 2012 our shared supposition is that there are no new ideas—only old ideas to be tried again. That is what is enacted in our culture of representation today. And we must advance beyond it.

There are implications. Such an endeavor will unmask us. We would have to regain a lost confidence among us in “we.” We would have to look forward and see that a new kind of society is possible. And the project requires us—and notably our leaders—to begin speaking in a language of authentic alternatives.

* * *

The claim to exceptionalism is remarkable for its resilience. Little else remains of the old, not-much-regarded myths. But even now America as the world’s exception is asserted at home and abroad. It is a consequence of history, perhaps: America was an idea before it was a nation. “In 2008, it is absolutely clear that we will be involved in nation-building for years to come,” Condoleezza Rice, Bush’s secretary of state, wrote that year in Foreign Affairs. It was Bush’s last year in office. Woodrow Wilson could have made the same assertion a hundred years earlier. It is pre-historicist. It is exceptionalism as baldly stated as it can be in policy terms—in terms of what America proposes to do. No lessons drawn from the previous century? One would think America remains deaf and blind even now.

Nations are eventually made by those who live in them, no matter whether it is in a great power’s interest to fashion one or another of them to its liking. Americans should know this better than anyone, though the point seems to elude them. Now they have an opportunity to learn this truth from the Afghanistan and Iraq wars. Both have been failures in the standard sense of an American “mission,” or as new demonstrations of American prerogative. In both nations, what will finally well up from the Afghan and Iraqi earth will be by way of millions of conversations, interests, persuasions, alliances, oppositions—the very fiber of a political culture, none of it having anything to do with America. As for Americans, they were warriors in wars they did not understand. I do not think this will any longer be possible in the century we inhabit. And in the best of outcomes, those final two failures will lead to what I will call a post-Wilsonian idealism. It may be that there is nothing to salvage from Wilson’s thought, for we have found it defective from the first. But for the sake of continuity let us assume it is something to build upon.

The turning forward of the Wilsonian ethos would involve restraint as much as it would assertion. It would also mean accepting that what America exported in the way of “democracy” during the twentieth century was often fraudulent, a duping, a false promise. It would mean looking back at America’s democracy and recognizing that Americans alone had to make it. Is this to say that post-Wilsonian Americans are to sit and watch as others suffer? My answers to this are two. First of all, there is little doubt that the span of American interventions beginning in 1898 and ending now in Afghanistan has caused more suffering than it has relieved. This is so by a wide margin, to put the point mildly. Second, the post-Wilsonian would act abroad rigorously according to his or her ideals and not some hollowed-out version of them, as even Wilson did. He or she would also act with the greatest of delicacy. Understanding one’s own history also means being attentive to others’. The post-Wilsonian will be supremely mindful of this, elevating self-determination to the highest of values.

We have distinguished between relative and absolute decline, noting that the former is inevitable in an age of rising powers. Many of us believe ours to be the “Pacific century,” implying that America’s frontage on the Pacific lake will be its salvation. I do not think this will prove so: The same was said at the end of the nineteenth century. America is a Pacific power; it is now called upon to recognize that this does not make it an Asian power. By the same token, it does not seem to me that we have entered an “Asian century,” either. It will be a century that cannot be named, in my view, because too great a variety of people will contribute to it.

This is a positive prospect. But much hangs on whether Americans are capable of accepting it as such. For at the horizon, relative and absolute decline turn out to meet. If Americans do not accept the advance of history, relative decline will devolve into absolute decline: The rise of others will translate into America being left uncompetitively behind because it has not understood the tasks at hand. But if Americans are able to accept a place in the world that is distinct from all they have assumed since 1898, the nation’s relative decline will prove an experience of benefit. It will change the American character, so far as one can speak of such a thing, and much for the better. It will alter Americans’ stance toward others and their stances toward one another. It will engender that process of self-examination I have already dwelled upon, leading Americans to recognize the tasks before them. Here is the paradox of our moment: Only if Americans resist the defeat I have described will they be defeated. In our refusal to admit defeat would lie our true defeat, for we would have no access to renewal, we would not be able to think anew.

I propose the taking of an immense risk. It is the risk of living without things that are linked in the American psyche: the protection of our exceptionalism, the armor of our triumphalist nationalism, our fantastical idea of the individual and his or her subjectivity. For Americans to surrender this universe of belief, emotion, and thought may seem the utmost folly. A century ago Americans flinched at the prospect. What followed was often called heroic, but in many cases it was just the opposite, for the American century was so often an exercise in avoidance of genuinely defined responsibility. True enough, it ended as it began, with uncertainty and choices. But the outcome need not be the same now, for there is too much more to be gained than lost this time. Ω

[Patrick Smith was the International Herald Tribune’s bureau chief in Hong Kong and then Tokyo from 1985 to 1992, when he also wrote “Letter from Tokyo” for the New Yorker. He contributes frequently to the New York Times, Business Week, Time, and other publications. His most recent book is Time No Longer: Americans After the American Century (2013) from which this article was excerpted. Smith is a resident writer and workshop leader for the Norfolk Writers’ and Artists’ Retreat in Norfolk, CT.]

Copyright © 2013 Salon Media Group, Inc.

Since Google Reader will go dark on July 1, 2013, another site is available for readers of a lot of blogs (or a single blog). The alternative is Feedly. For a review of Feedly by the NY Fishwrap's David Pogue, click here.

Creative Commons License
Sapper's (Fair & Balanced) Rants & Raves by Neil Sapper is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. Based on a work at sapper.blogspot.com. Permissions beyond the scope of this license may be available here.



Copyright © 2013 Sapper's (Fair & Balanced) Rants & Raves