Saturday, May 31, 2014

The Rejection Of The Legacies Of Vietnam, Afghanistan, And Iraq Is True Patriotism

Just today, the Austin fishwrap ran an op-ed piece by Charles Krauthammer ("Emptiness at West Point") that was highly critical of the POTUS 44. This blogger didn't bother to read the rightist blather, but the thought occurred that Abraham Lincoln was once termed a clown and a buffoon unfit to speak at the dedication of the Gettysburg battle cemetery. As a corrective to Krauthammer and his jingo-brethren who pant for U.S. military intervention anywhere, today's post to this blog offers a sensible look at the Obama Doctrine of a small-ball foreign policy. If this is (fair & balanced) presidential patriotism, so be it.

[x The Atlantic]
He's Like Ike
By Peter Beinart

The debate between President Obama and his hawkish critics comes down to this. Obama is—as he said yesterday [5/28/14] at West Point—“haunted” by the wars in Afghanistan and Iraq. His hawkish critics are haunted by the fact that he’s haunted.

From this core divide comes a fundamentally different reading of the history of American foreign policy. For hawks, the story of the last 75 years goes something like this: From Franklin Roosevelt through Harry Truman through John F. Kennedy, the United States pursued a muscular, internationalist and moral foreign policy. Then, because of Vietnam, America’s leaders lost faith in American might and American ideals. As a result, the Soviet Union began to win the Cold War, until Ronald Reagan rebuilt American power and American pride, and the Cold War was won. Now, as a result of Afghanistan and Iraq, another American leader—Obama—is losing faith in American power. The enemies of freedom are again gaining strength. And they will keep gaining strength until a new Reagan saves the day.

In this narrative, Vietnam, Iraq, and Afghanistan don’t matter much in and of themselves. They were either winnable wars lost through a failure of will or honest mistakes that say nothing important about the limits or fallibility of American power. The problem isn’t the wars themselves. It’s the way American leaders reacted to them. Jimmy Carter’s sin was to believe that Vietnam called into question the wisdom of intervening militarily against communist movements. Obama’s sin is to believe that Iraq and Afghanistan call into question the wisdom of intervening militarily against terrorist movements and anti-American dictators.

For Obama, by contrast, Vietnam, Iraq, and Afghanistan are not aberrations. They reveal a recurring pattern of American hubris. “Since World War II,” he told the cadets, “some of our most costly mistakes came not from our restraint but from our willingness to rush into military adventures without thinking through the consequences.” For Obama, that hubris stems from an excessive fear of America’s enemies, whom America can generally defeat by building alliances and harnessing our democratic legitimacy and economic strength, as the United States is doing in Ukraine. And it stems from an excessive faith in war, which, once unleashed, often spirals out of America’s control.

It’s no surprise that at West Point, Obama yet again quoted Dwight Eisenhower. Like Obama, Eisenhower spent much of his presidency arguing against critics who claimed that the United States needed to spend more on defense, or intervene more militarily, because America’s enemies were gaining ground. Ike never believed that. He worried less that the Soviet Union would vanquish the U.S. militarily than that it would provoke an overreaction that bankrupted America economically. The Soviets, he argued, “have hoped to force upon America and the free world an unbearable security burden leading to economic disaster.”

Eisenhower feared that by endorsing NSC-68, the document that committed America to spend virtually unlimited sums battling global communism, the Truman administration was giving the Soviets exactly what they wanted. He fought back by ensuring that his secretary of the treasury and budget director sat in on all National Security Council meetings. (Obama did something similar when he conspicuously brought Office of Management and Budget director Peter Orszag into meetings on the Afghan surge). Ike worked so hard to keep the defense budget low that three army chiefs of staff quit. He ended the Korean War, although many in his party wanted to escalate it. And he refused to intervene to save the French in Vietnam.

That’s clearly Obama’s model: End costly, unwinnable wars, don’t start new ones, and rebuild the economic foundation of American power. I suspect Obama takes comfort in the fact that for the past several decades, many historians have applauded Eisenhower’s foreign policy. One influential academic essay even calls him a foreign-policy “genius.”

The bad news is that by the end of his presidency, Eisenhower was widely derided as passive and weak, a practitioner, in the words of Arthur Schlesinger Jr., of “the politics of fatigue.” Behold the Eisenhower doll, went a joke at the time: Wind it up and it does nothing for eight years.

Eisenhower’s problem was that his foreign policy was not heroic. He was content, in Obama’s words, to “hit singles.” He had, after all, seen more than enough bloodstained heroism on the beaches and meadows of Europe. At West Point, Obama quoted him as calling war “mankind’s most tragic and stupid folly.” The idea—so common in today’s foreign-policy discourse—that an inclination to use military force represents “idealism” would have struck Ike as beneath contempt.

I suspect Obama feels the same way. “I am haunted by those deaths,” he said about the cadets who died in the Afghan surge. “I am haunted by those wounds.”

Thank goodness. Obama should be haunted. We should all be, because there was nothing in Iraq, or in the fight against the Taliban (as opposed to al-Qaeda), worth sending young men and women to die for. And we should be extremely wary of letting people who have still not reckoned with their role in those catastrophes push the United States toward military action in places about which they are equally ignorant.

At West Point, Obama said he would not be pushed. Ike would be proud. Ω

[Peter Beinart is a contributing editor at both The Atlantic and National Journal, an associate professor of journalism and political science at the City University of New York, and a senior fellow at the New America Foundation. Beinart received a B.A. (history and political science) from Yale University and was a Rhodes Scholar at University College, Oxford University, where he earned an M.Phil. in international relations. His books include The Good Fight: Why Liberals—and Only Liberals—Can Win the War on Terror and Make America Great Again (2006), The Icarus Syndrome: A History of American Hubris (2010), and The Crisis of Zionism (2012).]

Copyright © 2014 The Atlantic Monthly Group

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Friday, May 30, 2014

Our Foreign Policy Conundrum: Too Many Entrances, Not Enough Exits

Eags reviews the road not taken after the election of 2008; the POTUS 44 (winner in '08) speaks sense and The Geezer (loser in '08) still speaks nonsense. The Dumbos/Morons vilify the POTUS 44 as weak and cheer the nonsense babbled by The Geezer and his ilk. The loons love to hear their own call. Eags concludes his essay with a consideration of hell (if such a place exists): it has many entrances and fewer exits. Vietnam, Afghanistan, and Iraq are enough. To hell with The Geezer and his dreams of sending the troops into Nigeria, Syria, Libya, Iran, and Crimea. The Geezer spent 5+ years in hell — a POW prison in Vietnam — because he disobeyed his orders and flew at a lower altitude during his mission. End result: The Geezer's wingman died and The Geezer spent the worst 5+ years of his life in captivity. If this is (fair & balanced) truth to jingoism, so be it.

[x NY Fishwrap]
The Wars Not Fought
By Timothy Egan

We owe Mother Jones, the magazine, a public service nod for a graphic tour last year of all the countries that John McCain has wanted to attack. Spanning the globe, the fist-first senator has called for violent regime change in more than half a dozen nations, ranging from all-out ground invasions to airstrikes to arming sides in endless sectarian conflicts.

The map of McCain’s wars is worth considering as a what-if had the would-be vice president Sarah Palin and her running mate in 2008 prevailed. McCain continues to play quick-draw commander in chief to this day. He said he’d send troops into Nigeria “in a New York minute,” to rescue the girls kidnapped by Islamic terrorists, even without permission of the sovereign country. And just after President Obama’s speech Wednesday at West Point, McCain lamented that America’s young men and women were not still in the Iraqi city of Falluja.

Yes, Falluja — where tribal militias loyal to one warped religious tenet or another continue to slaughter each other with abandon. It’s a hard truth for a country as prideful as the United States to accept, but most Americans have now concluded that the Iraq War was a catastrophic mistake. Obama, at least, has tried to learn something from it.

Al Qaeda was never in Falluja before the American invasion. They have a stronghold in Falluja now, for which McCain blames the withdrawal of United States troops. Think about that: it’s not our fault because we opened the doors to the factions of hell; it’s our fault because we withdrew from hell.

As Obama tries to pivot from foreign policy by bumper sticker, McCain and an intellectually bankrupt clutch of neocons are trying to present themselves as the alternative. Dick Cheney, the warrior with five draft deferments, is in this diminishing camp, calling Obama “certainly the weakest” president in his lifetime. But both McCain and Cheney are outliers, blustery relics with little backing in either party. Only seven percent of Americans expressed support for even considering a military option after Russia forced Crimea into its fold. That’s a sea change in sentiment from 2001, or even 2008.

The nation’s future military leaders embody this shift. The biggest response from the cadets at West Point came when Obama said, “you are the first class to graduate since 9/11 who may not be sent into combat in Iraq or Afghanistan.” They cheered.

But all of that is not to let Obama off the hook. His big foreign policy speech was flat and passionless, with no central vision. The fault may lie with this particular moment in world history. The Cold War was easy to frame. The War on Terror was as well, at least at first. Now, things are more muddled. How do we help the newly elected government of Ukraine? If we aggressively arm one side in Syria, what happens if they turn out to be religious extremists who want to put women back in the 9th century?

Obama didn’t specifically say so, but the guiding principle for this era of nuance and shadows may be no more complex than this: Stay out of wars of unintended consequence.

“Since World War II, some of our most costly mistakes came not from our restraint,” said Obama, “but from our willingness to rush into military adventure — without thinking through the consequences; without building international support and legitimacy for our action, or leveling with the American people about the sacrifice required. Tough talk draws headlines, but war rarely conforms to slogans.”

Is that weakness, or wisdom? Well, neither. But it’s a realistic reaction to the hard fact that the last 50 years have produced the three longest wars in American history. And it’s a pitch-perfect reflection of where most Americans are today.

Afghanistan was supposed to be a swift move to crush a regime that allowed terrorists to flourish — not 13 years, and counting, of nation-building. Vietnam was billed as a blow for freedom against global communism — not a 10-year military muddle in a civil war posing no threat to the United States. Iraq was going to be clean and quick — we’ll be greeted as liberators! — not eight years in one of the most ghastly places on earth, at a cost of more than $2 trillion and a loss of at least 190,000 lives on all sides.

Obama’s foreign policy is a lot like his economic policy. Give him credit for preventing something awful from happening. The financial collapse could have been truly catastrophic, save for the action the president and the Federal Reserve took in the first year following the meltdown. For that, history will be kind. The wars not fought by Obama are the alternative to John McCain’s map. For that, the verdict of the ages is less certain. After 50 years, what a war-weary nation does know is this: the doors into hell are many; the exits, fewer. Ω

[Timothy Egan writes "Outposts," a column at the NY Fishwrap online. Egan — winner of both a Pulitzer Prize in 2001 as a member of a team of reporters who wrote the series "How Race Is Lived in America" and a National Book Award (The Worst Hard Time in 2006) — graduated from the University of Washington with a degree in journalism, and was awarded an honorary doctorate of humane letters by Whitman College in 2000 for his environmental writings. Egan's most recent book is The Big Burn: Teddy Roosevelt and the Fire that Saved America (2009).]

Copyright © 2014 The New York Times Company

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Thursday, May 29, 2014

Forget Net Neutrality, How About Some Blog Neutrality?

Today, this blog provides a primer on "Net Neutrality" and what it means for you if the Federal Communications Commission adopts this proposal. If this is (fair & balanced) geek-hysteria, so be it.

[x The Nation]
The FCC’s Net Neutrality Proposal Explained
By Leticia Miranda

On May 15, the Federal Communications Commission voted [PDF] to move forward with their proposed rules for net neutrality, the principle that all Internet traffic should be treated equally. The proposal, which will now be open for public comment for four months, would dramatically change the Internet. The new rules would allow Internet service providers (ISPs) like Verizon or AT&T to charge websites like Facebook and Twitter for faster service. This has a whole range of consequences for you, the avid Internet user. We’ve put together this explainer to help you understand what the proposal means and how you can tell the FCC what you think about the proposed rules.

What is net neutrality?

For a detailed look at what net neutrality is and the history behind legislating for an open Internet, see our earlier explainer here. As I wrote in December, the principle of net neutrality “guarantees a level playing field in which Internet users do not have to pay Internet service providers more for better access to online content, and content generators do not have to pay additional fees to ensure users can access their websites or apps.” In other words, all Internet traffic should be treated equally.

What has happened up until this point?

The history of the current proposal goes back to 2010, when the FCC issued an Open Internet Order. This created some net neutrality rules, which prohibited Internet service providers from blocking content and prioritizing certain kinds of traffic. Consumer rights advocates criticized the rules as too weak because they did not cover mobile web providers. Telecommunications companies, though, countered that the rules were too strong.

Currently, Internet service providers are legally classified as “information services,” and the law says that no discrimination or price regulations are “necessary for consumer protection.” This means that the FCC has no authority to regulate those services, though the commission does have indirect authority to regulate interstate and international communications. After the FCC released its Open Internet Order, Verizon filed a lawsuit against the FCC claiming that the commission didn’t have the authority to make those rules or enforce them over Internet service providers like itself. In January of this year, the DC Court of Appeals agreed with Verizon and said that the FCC can’t stop Internet service providers from blocking or discriminating against websites or any other Internet traffic unless the Internet is reclassified as a public utility. But the court also said the FCC does have some authority to implement net neutrality rules so far as it promotes broadband deployment across the country.

The FCC took that small window of opportunity and worked on a new proposal over the last few months. On Thursday, they voted to present the proposal to the public for comment, which is what’s on the table now. It’s called a notice of proposed rule-making.

What is a notice of proposed rule-making?

A notice of proposed rule-making, or NPRM, is a bureaucratic term to describe an announcement that a government agency is thinking of making rules about an issue and is giving the public the opportunity to weigh in. This NPRM process happens for every government agency with some authority to make new rules. By opening up an NPRM, the FCC is saying to the country, “Hey, we’re considering this proposed rule. What do you think of it?”

What’s in the NPRM?

The entire NPRM document is ninety-nine pages and you can read it here [PDF]. The FCC’s proposed rules, as previously mentioned, leave the door open for content creators to pay to get faster service. Internet service providers have been considering arrangements like this for some time. In these agreements, ISPs like Verizon or AT&T would approach companies like Netflix or Facebook and say, “We’ll give your users faster access to your website if you pay us a fee.” The proposal would create a fast lane for content creators who can afford to “pay to play,” while keeping a slower lane for sites that can’t afford to make such agreements.

In an attempt to balance that with the public interest, the Commission introduced three rules to keep the Internet “as an open platform enabling consumer choice, freedom of expression, end-user control, competition, and the freedom to innovate without permission.” Those three rules would require transparency about broadband providers’ practices; prohibit the blocking of any lawful website or app; and ban “commercially unreasonable practices.”

What do these rules mean?

The transparency rule would require Internet service providers to publicly share a performance report that would include information about their Internet speed and traffic congestion, as well as any instances of blocking content or pay-for-priority agreements like the ones described above.

The no blocking rule would prohibit any Internet service company from outright blocking lawful content for any reason. This would prevent an Internet service provider from, for example, blocking the sites of their competitors.

The third rule, banning “commercially unreasonable practices,” is not as clearly defined, but it generally bans unfair business practices. FCC Chairman Tom Wheeler has said in the past that he wouldn’t accept, for example, practices that would slow down your access to a particular website if you paid for a certain Internet speed. So if you paid for high-speed Internet access and you want to read something on TheNation.com, your Internet service company can’t slow your access to DSL speeds because that would be “commercially unreasonable.”

So what are people saying about the proposal?

Consumer rights advocates like Free Press and Consumers Union argue that these rules would essentially create two tiers of Internet service, which does not amount to “real” net neutrality, under which all Internet traffic is treated equally. These groups have also argued that the Internet should be reclassified under the law and regulated as a public utility. The Consumers Union has argued in previous comments that public utility regulations have “protected consumers in the traditional phone marketplace and [are] appropriate for broadband Internet access.”

On the other end of the debate are Internet service companies like Verizon and AT&T who say that the FCC should regulate from afar under a less rigorous enforcement scheme. They prefer that the Internet stay under a “light-touch” regulation scheme as an information service rather than a public utility. Verizon, in a statement after the proposed rules were announced, said that utility regulation on the Internet “would lead to years of legal and regulatory uncertainty and would jeopardize investment and innovation in broadband.”

Some content providers have been quite critical of the proposal. A group of 150 companies, including Amazon, Google and Netflix, sent a letter to the FCC in May pressing the commission to ban paid prioritization for services. Some companies, including Netflix, say they might reluctantly sign on to a paid prioritization agreement with an ISP, but they do not want to pay extra just to reach their customers with quality video service. The proposal “could legalize discrimination, harming innovation and punishing US consumers with a broadband experience that’s worse than they already have,” a Netflix spokesperson said on May 15.

What happens next?

Now the public has the opportunity to submit comments to the docket—the place where people file their ideas about the proposed rules. The public has until September 10 to submit comments, but really you can submit comments until the FCC puts the rules on their meeting agenda. The FCC is supposed to take all comments into consideration, but of course heavy lobbying from all sides of the debate influences that process. That doesn’t mean your comment doesn’t matter, but just be aware it’s not necessarily a fair process.

How can I tell the FCC what I think?

If you have access to the Internet, you can go to the FCC’s e-filing website and click “submit a filing.” [Enter “14-28” into the proceeding number field.]

How else can I file a comment?

You can also file by paper. You can submit your comment by hand or messenger delivery, by commercial overnight courier or by first-class or overnight US Postal Service mail. All filings must be addressed to the commission’s secretary: Office of the Secretary, Federal Communications Commission. US Postal Service first-class, Express and Priority mail should be sent to FCC Headquarters at 445 12th Street SW, Washington, DC, 20554. Commercial overnight mail must be sent to 9300 East Hampton Drive, Capitol Heights, MD, 20743.

If you want to drop off your comment, you can go to Room TW-A325 at the FCC Headquarters address above, Monday through Friday, between 8 am and 7 pm.

For people with disabilities, you can request materials in accessible formats (Braille, large print, electronic files, audio format) by sending an e-mail to fcc504@fcc.gov or calling the Consumer & Governmental Affairs Bureau at 202-418-0530 (voice) or at 202-418-0432 (text telephone).

What happens after the comment period closes in September?

The FCC will review the comments and might make adjustments to its proposal based on the input they’ve received from the public and the meetings they’ve had with businesses and public interest groups. Once they’re ready to vote, they’ll put the rules on their open meeting agenda for an official vote. The FCC is expected to make their final decision by the end of the year. Ω

[Leticia Miranda is an Investigative Unit Intern at CBS News and prior to that, she was an Editorial Intern for The Investigative Fund of The Nation Institute. Miranda received a BA (Feminist Studies, Latin American/Latino Studies) from the University of California at Santa Cruz as well as an MA (Journalism) from New York University.]

Copyright © 2014 The Nation

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Wednesday, May 28, 2014

Forgot Your P'word? Call Your Mother, Or Follow These Helpful(?) Suggestions

Ah, the P'word, that pesky memory test. As this blogger ages like cheap wine, his mental acuity is dwindling day-by-day. The constant reminder is the error message that accompanies a p'word entry to gain access to this or that web site. Of course, a request to reset the p'word presents more memory-hurdles: the security questions that no one else could answer. If this is a (fair & balanced) perennial Catch-22, so be it.

[x Slate]
Two Stupid Password Tricks
By Doug Harris

This isn’t a post telling you that you should use a different password for every site, that you should use multifactor authentication for your email, or that you should use a password manager to store strong passwords. You should do those. (And you should eat less dessert, exercise more, and call your mother. — [See Above.])

This is a post to share two stupid password tricks that will make your online life a little more secure without the (perceived) hassle of those other measures.

The first stupid password trick is a way to improve the “security questions” that sites have you set up in case you need to recover your password. What’s your mother’s maiden name? What street did you grow up on? Who was your first-grade teacher?

The idea is that only you will know the answer to these questions. By answering them correctly, the site verifies that you are you and lets you reset your password.

Ask Sarah Palin how that worked out for her. The flaw is that you aren’t the only person who knows the answer to these questions. It’s not just the public figures who are vulnerable. We’re all Googleable, and those #TBT posts on Facebook and Twitter could give away a lot about your early years. Someone who’s determined to get access to your email can do a little research and unlock your account.

My trick? Lie and keep telling the same lie.

  • What’s your favorite ice cream flavor? Louis Armstrong.
  • What was the name of your high school? Louis Armstrong.
  • In what city did you have your first job? Louis Armstrong.

Don’t give correct answers. Use the same stupid answer for all of your security questions. (If you’re worried you’ll forget the stupid answer, store it in a password manager.)

Stupid password trick No. 2 was inspired by a friend’s tweet:

My first reaction to this was, “Why aren’t you using a password manager?” But the more I thought about it, the more this password dance looked like a simple method of implementing something like one-time passwords. Why use a memorable password at all?

Choose something really random, don’t worry about saving it or remembering it, and force the site to re-authenticate you through email!

You get security without the need to add random sites to a password vault, and you don’t need to install LastPass or anything new.
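Here is a comparable sketch of trick No. 2, again in Python with the standard secrets module (the 24-byte length is an illustrative assumption): mint a long random password at signup, never record it, and let the site's "forgot password" email flow re-authenticate you on your next visit.

    import secrets

    def throwaway_password() -> str:
        """Return a long, URL-safe random password meant to be used once
        at signup and then deliberately forgotten; the site's email-based
        password reset becomes the real authenticator."""
        return secrets.token_urlsafe(24)  # 24 random bytes, about 32 characters

    print(throwaway_password())  # paste into the signup form, then forget it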

A few caveats:

  • If you really went to Louis Armstrong High School, don’t use Louis Armstrong as the answer for your security questions.
  • Don’t use Louis Armstrong anyway. It’s been used as an example here. (Just like you shouldn’t use “correct horse battery staple” as your password)
  • Yes, I know they’re not really one-time passwords. True one-time passwords would be enforced to single-use only by the authenticating server. These are enforced to single-use only by your mind (the same weak mind that thinks that you don’t need a password manager).
  • There are almost certainly e-commerce sites that limit how frequently you can change your password. That would cause havoc with this password management method if you plan to visit the site regularly.

These password tricks are stupid. They’re the equivalent of justifying the calories of the ice cream sundae by parking on the far side of the parking lot. It’s better than nothing, but you can do more.

You should exercise more, call your mother, and take stronger measures to secure your online existence. Ω

[Doug Harris is Slate's Chief Software Architect. He received a BA (cognitive science) from Vassar College.]

Copyright © 2014 The Slate Group

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Tuesday, May 27, 2014

Dumbos/Teabaggers Myth A Lot More Than They Hit

Speaking of myths, St. Richard (Hofstadter) taught this blogger that

"By myth..., I do not mean an idea that is simply false, but rather one that so effectively embodies men's values that it profoundly influences their way of perceiving reality and hence their behavior. In this sense, myths may have varying degrees of fiction or reality...."

Peter Beinart examines the myth of a war on religion that is a staple of Faux News bloviating. Truth be known, religion itself is a myth, but this blogger will leave that to the comedian Bill Maher as he mocks talking snakes and other myths. Beinart finds that the Dumbo/Teabagger outrage over the war on religion is really a ploy to play the victim. If this is (fair & balanced) reality — not myth — so be it.

[x The Atlantic]
The Myth Of A War On Religion
By Peter Beinart

Last week, the Public Religion Research Institute published a study showing that Americans want their fellow citizens to think they are more religiously observant than they really are. When asked by a live human being on the telephone how often they attend religious services, respondents were more likely to say they attend frequently. When filling out a self-administered online survey, by contrast, they were more likely to admit that they do not.

Surprising? Not terribly. But this may be: Liberals were more likely to exaggerate their religious attendance than conservatives. Liberals attend services less frequently than conservatives do. Yet their desire to be thought more religiously observant than they actually are is greater.

Why does this matter? Because it’s more evidence that the claim that liberals are waging a “war on religion” is absurd. You can hardly listen to a GOP presidential hopeful or flip on Fox News without hearing the charge. In 2012, Rick Perry promised that if elected he’d “end Obama’s war on religion.” Bobby Jindal recently warned that “the American people, whether they know it or not, are mired in a silent war” against “a group of like-minded [liberal] elites, determined to transform the country from a land sustained by faith into a land where faith is silenced, privatized, and circumscribed.” Ann Coulter explains, “Liberals hate religion because politics is a religion substitute for liberals and they can’t stand the competition.”

Notice the claim. It’s not merely that liberals are not religious themselves. It’s that they disdain people who are, and this disdain creates a cultural stigma (and a legal barrier) to religious observance. “Bigotry against evangelical Christians is the last acceptable form of bigotry in the country,” Ralph Reed said recently.

The truth is almost exactly the reverse. Over the past few decades, liberals have—far more than conservatives—turned away from religious affiliation, though not necessarily belief in God. But while they may feel proud of their views on religion-informed issues like evolution and gay marriage, they’re not particularly proud of their lack of religious observance per se. Indeed, they’re aware that they’re violating a cherished social norm. Asking liberals to admit that they are disproportionately secular is like asking conservatives to admit that they are disproportionately white. It’s a truth they find embarrassing. Liberals love left-leaning religious figures like Sister Simone Campbell, the immigrant-rights-championing nun who addressed the 2012 Democratic National Convention, for the same reason conservatives love right-wing African Americans like Herman Cain and Dr. Ben Carson: They defy a negative stereotype.

After all, if liberals really stigmatized the religious, wouldn’t some of them have objected when John Kerry flaunted his Catholicism in 2004 or Barack Obama flaunted his adult embrace of Christianity in 2008? Is there a single example, even in the most liberal city or district, of one Democratic candidate trying to outdo the other by proclaiming herself more hostile to religious belief?

I doubt it, because most secular liberals understand—even if Fox News commentators don’t—that America’s last acceptable religious prejudice isn’t against evangelical Christians. It’s against atheists. According to a 2008 poll, more than two-thirds of American atheists said they feared the repercussions in their community if they openly declared their belief that there is no god.

They were right to be worried. When three University of Minnesota sociologists surveyed American religious attitudes in 2006, they found “not only that atheists are less accepted than other marginalized groups but also that attitudes toward them have not exhibited the marked increase in acceptance that has characterized views of other racial and religious minorities over the past forty years.” Americans are today more likely to say they would vote for a Muslim or a gay or lesbian for president than an atheist. In a recent Pew study, even nonreligious Americans said they wanted their presidential candidates to be believers—regardless of what faith they profess. Seven states still officially bar atheists from holding office.

Social practices can retain, or even increase, their prestige while becoming less common. Think about military service, which is lionized more today than it was during the Vietnam War, even though fewer citizens serve. Something similar has happened with religion. Americans, especially left-leaning Americans, are less likely than they were a generation ago to go to church. But they’d rather you not know how much less, because religious practice—like service in the military—enjoys prestige as a marker of morality and self-discipline. And the more Americans fret that those values are being lost, the more they value religious observance for carrying them on, even if they aren’t religiously observant themselves.

That’s what the “war on religion” types don’t get. Liberals may dislike the political views that religious conservatives espouse, but they’re quite sympathetic to religion itself. Of course, admitting that would make it harder for religious conservatives to play the victim—which is what the “war on religion” is really all about. Ω

[Peter Beinart is a contributing editor at both The Atlantic and National Journal, an associate professor of journalism and political science at the City University of New York, and a senior fellow at the New America Foundation. Beinart received a B.A. (history and political science) from Yale University and was a Rhodes Scholar at University College, Oxford University, where he earned an M.Phil. in international relations. His books include The Good Fight: Why Liberals—and Only Liberals—Can Win the War on Terror and Make America Great Again (2006), The Icarus Syndrome: A History of American Hubris (2010), and The Crisis of Zionism (2012).]

Copyright © 2014 The Atlantic Monthly Group

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Lemmings Of The World, Unite! You Have Nothing To Lose But Your Cliffs!

Peter Beinart performs a post-mortem on our delusion of exceptionalism and ultimately proclaims that "the city upon a hill" might be resuscitated after all. We shall see, we shall see. In the meantime, we are caught in a hamster wheel of our own making. If this is (fair & balanced) national introspection, so be it.

[x NJ]
The End Of American Exceptionalism
By Peter Beinart

From the moment Barack Obama appeared on the national stage, conservatives have been searching for the best way to describe the danger he poses to America's traditional way of life. Secularism? Check. Socialism? Sure. A tendency to apologize for America's greatness overseas? That, too. But how to tie them all together?

Gradually, a unifying theme took hold. "At the heart of the debate over Obama's program," declared Rich Lowry and Ramesh Ponnuru in an influential 2010 National Review cover story, is "the survival of American exceptionalism." Finally, a term broad and historically resonant enough to capture the magnitude of the threat. A year later, Newt Gingrich published A Nation Like No Other: Why American Exceptionalism Matters (2011), in which he warned that "our government has strayed alarmingly" from the principles that made America special. Mitt Romney deployed the phrase frequently in his 2012 campaign, asserting that President Obama "doesn't have the same feelings about American exceptionalism that we do." The term, which according to Factiva appeared in global English-language publications fewer than 3,000 times during the Bush administration, has already appeared more than 10,000 times since Obama became president.

To liberals, the charge that Obama threatens American exceptionalism is daft. He is, after all, fond of declaring, "In no other country on Earth is my story even possible." For some progressive pundits, things hit rock bottom when conservative Washington Post columnist Kathleen Parker flayed Obama for not using the words "American exceptionalism" in his 2011 State of the Union speech, even though he had called America a "light to the world" and "the greatest nation on Earth." The entire discussion, declared liberal Post blogger Greg Sargent, had become "absurd," "self-parodic," and an exercise in "nonstop idiocy."

But that's not quite right. When conservatives say American exceptionalism is imperiled, they're onto something. In fundamental ways, America is becoming less exceptional. Where Gingrich and company go wrong is in claiming that the Obama presidency is the cause of this decline. It's actually the result. Ironically, the people most responsible for eroding American exceptionalism are the very conservatives who most fear its demise.

To understand what's threatening American exceptionalism, one must first understand what its contemporary champions mean by the term. American exceptionalism does not simply mean that America is different from other countries. (After all, every country is different from every other one.) It means that America departs from the established way of doing things, that it's an exception to the global rule. And from Alexis de Tocqueville, who chronicled America's uniqueness in the 1830s, to Joseph Stalin, who bemoaned it in the 1920s, to social scientists like Louis Hartz, who celebrated it during the Cold War, the established way of doing things has always been defined by Europe. What makes America exceptional, in other words, is our refusal to behave like the Old World. "Exceptionalism," wrote historian Joyce Appleby, "is America's peculiar form of Eurocentrism."

As America and Europe have changed over time, so have the attributes that exceptionalists claim distinguish us from them. But for the contemporary Right, there are basically three: our belief in organized religion; our belief that America has a special mission to spread freedom in the world; and our belief that we are a classless society where, through limited government and free enterprise, anyone can get ahead. Unfortunately for conservatives, each of these beliefs is declining fast.

THE RISE OF ANTICLERICALISM

For centuries, observers have seen America as an exception to the European assumption that modernity brings secularism. "There is no country in the world where the Christian religion retains a greater influence over the souls of men than in America," de [sic] Tocqueville wrote. In his 1996 book, American Exceptionalism: A Double-Edged Sword, Seymour Martin Lipset quoted Karl Marx as calling America "preeminently the country of religiosity," and then argued that Marx was still correct. America, wrote Lipset, remained "the most religious country in Christendom."

Today's conservatives often cast themselves as defenders of this religious exceptionalism against Obama's allegedly secularizing impulses. "Despite the fact that our current president has managed to avoid explaining on at least four occasions that we are endowed by our creator," declared Gingrich at a 2011 candidates forum, "the fact is that what makes American exceptionalism different is that we are the only people I know of in history to say power comes directly from God."

But in important ways, the exceptional American religiosity that Gingrich wants to defend is an artifact of the past. The share of Americans who refuse any religious affiliation has risen from one in 20 in 1972 to one in five today. Among Americans under 30, it's one in three. According to the Pew Research Center, millennials—Americans born after 1980—are more than 30 percentage points less likely than seniors to say that "religious faith and values are very important to America's success." And young Americans don't merely attend church far less frequently than their elders. They also attend far less than young people did in the past. "Americans," Pew notes, "do not generally become more [religiously] affiliated as they move through the life cycle"—which means it's unlikely that America's decline in religious affiliation will reverse itself simply as millennials age.

Americans remain far more willing than Europeans to affirm God's importance in their lives (although that gap has closed somewhat among the young). But when the subject shifts from belief in God to association with churches, America's famed religious exceptionalism virtually disappears. In 1970, according to the World Religion Database, Europeans were over 16 percentage points more likely than Americans to eschew any religious identification. By 2010, the gap was less than half of 1 percentage point. According to Pew [PDF], while Americans are today more likely to affirm a religious affiliation than people in Germany or France, they are actually less likely to do so than Italians and Danes.

Even more interesting is the reason for this change. Many of the Americans who today eschew religious affiliation are neither atheists nor agnostics. Most pray. In other words, Americans aren't rejecting religion, or even Christianity. They are rejecting churches. There are various explanations for this. As Princeton's Robert Wuthnow notes in his book After the Baby Boomers (2007), the single and childless historically attend church at lower rates than married parents do. And women who work outside the home attend less than women who don't. Which means that with women marrying later, having children later, and working more outside the home, it's logical that church attendance would drop.

But it’s not just changes in family and work patterns that drive the growth of religious nonaffiliation. It’s politics. In the mid-20th century, liberals were almost as likely to attend church as conservatives. But starting in the 1970s, when the Religious Right began agitating against abortion, feminism, and gay rights, liberals began to identify organized Christianity with conservative politics. In recent years, the Religious Right’s opposition to gay marriage has proved particularly alienating to millennials. “The actions of the Religious Right,” argue sociologists Michael Hout and Claude Fischer, “prompted political moderates and liberals to quit saying they had a religious preference.” In their book, American Grace: How Religion Divides and Unites Us (2010), Robert D. Putnam and David E. Campbell cite a study suggesting that many “young Americans came to view religion … as judgmental, homophobic, hypocritical, and too political.” Today, according to Pew, the religiously unaffiliated are disproportionately liberal, pro-gay-marriage, and critical of churches for meddling too much in politics. Not coincidentally, so are America’s young.

What is growing in contemporary America, in other words, is something long associated with Europe: anticlericalism. In Europe, noted the late political scientist James Q. Wilson in a 2006 essay on American exceptionalism, the existence of official state religions led secularists to see "Christians as political enemies." America, Wilson argued, lacked this political hostility to organized religion because it separated church and state. But today, even without an established church, the Religious Right plays such a prominent and partisan role in American politics that it has spurred the kind of antireligious backlash long associated with the old world. Barack Obama is the beneficiary of that backlash, because voters who say they "never" attend religious services favored him by 37 percentage points in 2008 and 28 points in 2012. But he's not the cause. The people most responsible for America's declining religious exceptionalism are the conservatives who have made organized Christianity and right-wing politics inseparable in the minds of so many of America's young.

NONINTERVENTIONISM

If the champions of American exceptionalism see religion as one key dividing line between the new and old worlds, they see America's special mission overseas as another. "I believe," declared Romney in 2011, that "we are an exceptional country with a unique destiny and role in the world... that of a great champion of human dignity and human freedom." For many Washington conservatives, that unique world role gives America unique obligations: We cannot stand aside while evil triumphs. But it also gives America unique privileges: We need not be bound by the opinions of others. As George W. Bush declared in his 2004 State of the Union address, America does not need a "permission slip" from other nations to protect itself and fulfill its mission in the world.

But young Americans are far less likely than their elders to endorse this exceptional global role. They want the U.S. to do less overseas; and what America must do, they want done more consensually. Americans under 30, for instance, are 23 percentage points more likely than older Americans to say the United States should take its allies' interests into account, even if that means compromising our own. They are 24 points more favorable to the United Nations than Americans over 50, the largest age gap in the 17 countries that Pew surveyed. And as with religious affiliation, this generation gap within the United States is eroding the gap between Americans and Europeans. Among respondents over 50, Pew found in 2011, Americans were 29 percentage points more likely than Britons to deny that their country needed U.N. approval before going to war. Among respondents under 30, by contrast, the gap was only 8 points.

Were young Americans merely embracing multilateralism over unilateralism, this shift wouldn't be so fundamental. But for conservatives, America's exceptional role in the world isn't merely about what we do overseas. What we do overseas expresses our belief in ourselves. It's no coincidence that Romney's campaign manifesto was titled No Apology: Believe in America (2010), a reference to Obama's supposed tendency to apologize for America's global misdeeds. In Lowry and Ponnuru's words, Obama threatens American exceptionalism because he threatens "America's civilizational self-confidence."

That's where things get interesting, because, as conservatives suspect, Americans' declining belief in our special virtue as a world power really is connected to our declining belief in our special virtue as a people. And the young are leading the way. A 2013 poll by the Public Religion Research Institute found that while almost two in three Americans over 65 call themselves "extremely proud to be American," among Americans under 30 it is fewer than two in five. According to a Pew study in 2011, millennials were a whopping 40 points less likely than people 75 and older to call America "the greatest country in the world."

Young Americans, in fact, are no more "civilizationally self-confident" than their European counterparts. When Pew asked respondents in 2011 whether "our culture is superior" to others, it found that Americans over the age of 50 were, on average, 15 points more likely to answer yes than their counterparts in Britain, France, Germany, and Spain. Americans under 30, by contrast, were actually less likely to agree than their peers in Britain, Germany, and Spain. And as the millennials, who are still reaching adulthood, constitute an ever-growing share of America's adult population, Americans are becoming a people no more likely to assert their national supremacy than are Europeans. In 2002, according to Pew [PDF], Americans were 20 percentage points more likely than Germans to declare their culture superior to that of other nations. By 2011, the gap was down to 2 points.

One reason for this shift is demographic. According to the Public Religion Research Institute, African-Americans and Hispanics, who comprise a larger share of America's young than of its old, are less likely to call themselves "extremely proud" of the United States than whites are. In their skepticism of unilateral foreign policy and overt patriotism, young Americans are also reflecting broader national and international trends. Millennials are coming of age at a time when America's relative power overseas has declined. They're also products of an educational system that, more than in the past, emphasizes inclusion and diversity, which may breed a discomfort with claims that America is better than other nations.

But however important these long-term trends, they can't explain the abruptness of the shift away from exceptionalist attitudes about America's role in the world. For this, we must look to George W. Bush.

Ever since Karl Mannheim's writing [PDF] in the 1920s, sociologists have observed that people are most influenced by events that occur in their late teens and early 20s—once they separate from their parents but before they establish stable lifestyles and attitudes of their own. For most millennials, these plastic years coincided with the Bush presidency. And it is Bush's vision of America's aggressive, unfettered world role, especially as manifested in the Iraq War, that young Americans are rebelling against.

Young Americans actually began the Bush presidency more supportive of invading Iraq than the population at large. But their disillusionment has proved far more intense. Between 2002 and 2008, the percentage of older Americans who supported the Iraq War dropped 15 points. Among Americans under 30, by contrast, it dropped a whopping 47 points. As young Americans turned against the war, they turned against Bush's exceptionalist vision of an America with unique burdens and privileges. Even more fundamentally, they turned against the chest-thumping, "We're No. 1" brand of patriotism that often accompanied it. In 2004, Jon Stewart—whose comedy show that year regularly drew more young viewers than any other cable news show—published America (The Book) [2004], in which, according to one reviewer, "no aspect of our patriotic pride is too sacred to be sacrificed on the altar of irony." The following year, Stewart's colleague, Stephen Colbert, launched "The Colbert Report," which occasionally featured him wrapped nude in the American flag. Between 2003 and 2011, according to Pew, the percentage of Americans calling themselves "very patriotic" dropped by less than 3 points among older Americans but by 10 points among millennials.

This turn against exceptionalist foreign policy—like young America’s turn against organized religion—has undoubtedly boosted Obama’s political career. Had he not opposed the Iraq War, and then seen the war prove catastrophic, it’s unlikely he would have won the Democratic nomination, let alone the presidency. Among antiwar voters, he beat John McCain by 54 points. But as with dwindling religious affiliation, Obama’s presidency has been more the result of the decline of American exceptionalism than its cause. If any president bears responsibility for the public’s souring on the idea that the United States can play by its own rules on the world stage, it is Bush, assisted by many of the same conservative politicians and pundits who now bemoan American exceptionalism’s demise.

CLASS-CONSCIOUSNESS

American exceptionalism's third, and most fundamental, contemporary meaning is about neither religion nor foreign policy. It's about mobility. Starting in the 19th century, foreign observers began noting that white Americans were less likely than Europeans to be prisoners of their birth. Because America's white poor could more easily rise above their parents' station, they did not constitute a static, aggrieved working class—and were less tempted by socialism. In the words of Princeton historian Daniel Rodgers, "Socialism's weakness in the United States was taken as further proof of the point: that the old rules of caste and class relations had been superseded."

For the most part, today's conservatives lustily endorse this exceptionalist narrative. "Class is not a fixed designation in this country," declared Paul Ryan in 2011. Unlike Europe, where "masses of the long-term unemployed are locked into the new lower class," America is "an upwardly mobile society." Lowry and Ponnuru add, "In America, there really hasn't been a disaffected proletariat—because the proletariat has gotten rich."

But conservatives worry that by encouraging reliance on government and discouraging individual initiative, Obama is making America more like Europe. Obama, warns former Republican presidential candidate Michele Bachmann, is hooking Americans on the "crack cocaine of [government] dependency." "It's not a traditional America anymore," Fox's Bill O'Reilly despaired on the night Obama won reelection. "People feel that they are entitled to things" from the state.

When conservatives worry that America is not as economically exceptional anymore, they're right. A raft of studies suggests that upward mobility is now rarer in the United States [PDF] than in much of Europe. But if America's exceptional economic mobility is largely a myth, it's a myth in which many older Americans still believe. Among the young, by contrast, attitudes are catching up to reality. According to a 2011 Pew poll [PDF], young Americans were 14 points more likely than older Americans to say that the wealthy in America got there mainly because "they know the right people or were born into wealthy families" rather than because of their "hard work, ambition, and education." And as young Americans internalize America's lack of economic mobility, they are developing the very class consciousness the United States is supposed to lack. In 2011, when Pew asked Americans to define themselves as either a "have" or a "have-not," older Americans chose "have" by 27 points. In contrast, young Americans, by a 4-point margin, chose "have-not." According to the exceptionalist story line, Americans are all supposed to consider themselves "middle class," regardless of their actual economic fortunes. For seniors, that's largely true. According to a 2012 Pew study, they were 43 points more likely to call themselves "middle" than "lower" class. Among young Americans, by contrast, the percentage calling themselves "middle" and "lower" class was virtually the same.

And in the final undoing of the exceptionalist narrative, young Americans are expressing greater interest in "socialism," although it's unclear what they mean by it. A 2011 Pew study found that while Americans over 30 favored capitalism over socialism by 27 points, Americans under 30 narrowly favored socialism. Compared with older Americans, millennials are 36 points more likely [PDF] to prefer a larger government that provides more services over a smaller one that provides fewer.

As millennials grow older, Americans as a whole—whose actual economic mobility is no longer exceptional—are becoming less exceptional in their attitudes about class. Between 1988 and 2011, the percentage of Americans who identified as "have-nots" doubled, from fewer than one in five to more than one in three. In 1988, Americans earning under $30,000 a year were 18 points more likely to call themselves "haves." By 2011, those numbers, adjusted for inflation, had flipped: The poorest Americans were 15 points more likely to call themselves "have-nots."

Americans are also becoming less exceptional in their views of capitalism. In 2003, according to GlobeScan, Americans were more than 14 percentage points more likely than Italians, Britons, Canadians, and Germans to say the "free market economy is the best system on which to base the future of the world." By 2010, they were almost 2 points less likely.

When conservatives acknowledge these trends, they often chalk them up to Obama's policies, which have supposedly drained Americans of their rugged individualism and habituated them to government handouts. "Once the public is hooked on government health care," Lowry and Ponnuru note, "its political attitudes shift leftward." But Obama is less the driver of this shift in economic attitudes than the beneficiary. It's certainly true that Obama won the votes of Americans skeptical that they can rise via the unfettered market. Among the majority of 2012 voters who believe America's economic system favors the wealthy, Obama beat Romney by 45 points. But Obama is not the reason so many Americans believe that. For more than a century, commentators have chalked up Americans' support for capitalism and lack of economic resentment to America's exceptional upward mobility. It's unclear when exactly American upward mobility began to decline. But it's not surprising that, eventually, that decline would cause class attitudes to harden.

The question exceptionalists should be asking is why America, once vaunted for its economic mobility, now trails much of the advanced world. Single-parent families clearly play a role, since poor children born into two-parent homes are far more upwardly mobile than those who are not. Housing patterns that segregate the poor from the middle class also seem to limit poor kids' chances of getting ahead. But economic inequality is also a big part of the story. Across the world, the University of Ottawa's Miles Corak has demonstrated, countries with higher inequality suffer lower mobility. The same is true inside the United States: The flatter a city is economically, the more likely its poor are to rise.

Part of the reason is "opportunity hoarding." In recent decades, the wealth gap between the richest Americans and everyone else has dramatically widened. Rich Americans have used this influx of cash to give their children special advantages that keep them from losing their spots atop the income ladder to children born into families of lesser means. Think about test preparation, which became a national industry only in the 1970s. Or the way wealthy parents subsidize unpaid internships or buy expensive houses to gain access to the best public schools. In the early 1970s, rich families spent four times as much on their children's education as poor ones. Today, they spend almost seven times as much. Culture plays a large role in this. If the rich didn't value education, they wouldn't spend their cash on it. But until recently, they didn't have so much cash to spend. As a paper [PDF] by Stanford sociologists Pablo Mitnik, Erin Cumberworth, and David Grusky notes, "Inequality provides privileged families with more resources that can then be lavished on their children, resources that raise their chances of securing desirable class positions for themselves." Whether or not this lavishing has contributed to an absolute decline in upward mobility in the United States in recent decades, it has certainly contributed to America's decline relative to other advanced countries.

All of which raises another question that conservative exceptionalists should be asking: What's behind skyrocketing inequality? Why do the top 1 percent of Americans, who took in roughly 11 percent of national income in the mid-1970s, account for more than double that today? Globalization and technology are clearly part of the story. If you're an American who works with your hands, you're competing with low-paid workers across the globe, not to mention machines, to an extent scarcely imaginable a few decades ago. That competition pushes down wages for Americans without a college degree and widens the gap between rich and poor.

What globalization and technology can't explain is why inequality is so much higher in America than in Europe, where the same tectonic forces are at play. Indeed, if you eliminate government policies on taxing and spending, America is about as unequal as Sweden, Norway, and Denmark and a bit more equal than Finland, Germany, and Britain. America claims its place as the most unequal major Western country only when you add in government policy. Which is to say that while globalization and technology may be increasing inequality everywhere, they are increasing it more in the United States because, compared with Europe, the United States redistributes less money from rich to poor.

Which brings us back to conservatives, because it is their champions—Ronald Reagan in the 1980s, Newt Gingrich in the 1990s, George W. Bush in the 2000s—who pushed many of the policies that have boosted inequality. In the mid-1970s, the federal government's top tax rate for regular income was 70 percent and its top rate for long-term capital gains was almost 40 percent. When Bush left office, the rate on regular income had fallen to 35 percent and the rate on long-term capital gains was down to 15 percent. (That has crept up under Obama to almost 40 percent on regular income and 20 percent on capital gains for individuals making over $400,000.) These huge shifts in tax policy have been partially offset by antipoverty spending, which has grown significantly since the 1970s, largely because skyrocketing health care costs have made Medicaid far more expensive. But even if you take that increase into account, America is still doing far less to combat inequality than other advanced democracies.

If you believe, as academics increasingly do, that economic inequality goes hand in hand with calcified class relations, then decades of conservative policy have contributed to America's relative lack of economic mobility.

This, in turn, has soured young Americans on the belief that through the free market they can rise above the circumstances of their birth. Which means that, when it comes to declining faith in the American Dream of upward mobility, as with declining faith in organized religion and declining faith in America's special mission in the world, conservatives have helped foment the very backlash against American exceptionalism that they decry.

THE TURNAROUND

But in all three areas, this backlash may actually prove a source of hope. It may not entirely restore public belief in America's unique virtues, but it may reverse some of the trends that sapped that belief in the first place.

Start with religion. To some, the rise in religious nonaffiliation is a frightening departure from American tradition. It may turn out, however, to be just the challenge American Christianity needs.

Historically, American religion has benefited greatly from its independence from the state. In recent decades, however, that independence has been compromised. The Religious Right has become a wing of the Republican Party, led by power brokers who speak biblically but act politically. In response, many young Americans have begun voting against the GOP on Sundays by declining to attend church.

Their alienation has jolted religious leaders and contributed to a new willingness to question the corrupting entanglement between churches and partisan politics. "When I talk to neighbors or strangers and tell them that I try my best to follow Jesus," wrote David Kuo, an evangelical who worked for Ralph Reed, John Ashcroft, William Bennett, and George W. Bush, "their first thoughts about me are political ones—they figure I don't care about the environment, I support the war in Iraq, I oppose abortion.... That is what they associate with my faith." So disturbing was this realization that Kuo in 2006 published a book arguing, "It is time for Christians to take a temporary step back from politics, to turn away from its seductions."

That's beginning to happen. According to John S. Dickerson, an influential young evangelical pastor, "The pulse of evangelicalism is … shifting, in many ways for the good, from American politics to aid for the global poor." Inspired by Pope Francis, prominent Catholic Republicans such as Paul Ryan are questioning whether a Christianity that blesses the lobbying agenda of the Chamber of Commerce will ever truly challenge secular society or reengage America's disaffected young.

So far, there's no evidence this shift is stemming the rising tide of religious nonaffiliation. Even Francis, although widely admired by American Catholics, hasn't yet brought them back to the pews. Still, the new spirit of humility and self-criticism among America's church leaders is healthy. And it's unlikely it would be occurring had young people not shattered the stereotype of Americans as unquestioning churchgoers. Moreover, since most of these young Americans reject a partisan church—but not a loving God—they may one day create a constituency for religious institutions that spurn the temptations of state power. Which is, in a way, what American religious exceptionalism was supposed to be all about.

The backlash against America's special mission in the world may prove heartening, too. Over the last decade, that special mission has justified policies—such as the invasion, occupation, and failed reconstruction of Afghanistan and Iraq—that have cost the United States massively in money and blood. And it has justified ignoring international norms, most importantly on torture, which has sapped America's moral authority. Yet many hawkish elites remain loath to acknowledge the limits of American power, let alone American wisdom.

In desiring a more modest and consensual foreign policy, young people are recapturing the wisdom of an earlier era. In the 1950s, after a painful and costly war in Korea, Dwight Eisenhower warned that by dispatching troops to oppose every communist advance, America would undermine its economic strength and democratic character even as it extended its military reach. Today, whether it is their support for a smaller, cheaper military or their skepticism about unchecked government surveillance, young Americans are the age group most sensitive to the financial and moral costs of continuing Bush's expansive "war on terror." Eisenhower's fear of overreach led him to resist calls for sending U.S. troops to Vietnam; young Americans are today 30 points more likely than their elders to say the United States should avoid war with Iran.

Underlying this more modest foreign policy vision is a more modest assessment of America itself, a modesty that may look to conservatives such as Lowry and Ponnuru like "lack of civilizational self-confidence." But here, too, young Americans are reclaiming the insights of an earlier time. In 1947, with politicians drawing ever brighter lines between the virtue of American democracy and the evil of Soviet totalitarianism, George Kennan told students at the National War College, "There is a little bit of totalitarian buried somewhere, way down deep, in each and every one of us." Kennan, and like-minded mid-20th-century intellectuals such as Walter Lippmann and Reinhold Niebuhr, considered America's political system superior to the Soviet Union's. But they argued that, paradoxically, what made it superior was its recognition of American fallibility. America, unlike the U.S.S.R., bound its leaders within restraining systems of law that denied them the right to unfettered action no matter how convinced they were of their own good intentions. That same spirit led the United States to help build institutions like the United Nations and NATO, which gave smaller nations some voice over America's behavior, and won the United States a measure of legitimacy among its allies that the Soviet Union never enjoyed.

As young men, Lippmann and Niebuhr had seen two epic visions—Woodrow Wilson's dream of a war to end war, and the socialist dream of a revolution to end class oppression—turn ugly. And it was their disillusionment with political crusades that woke them to the importance of building restraints against America's capacity to do evil rather than merely unleashing its supposedly innate inclination to do good. Perhaps young Americans, having in their formative years watched Bush's epic post-9/11 vision breed lies, brutality, and state collapse, and America's celebrated capitalist system descend into financial crisis, have gained their own appreciation of American fallibility. Let's hope so, because as Niebuhr and Lippmann understood, the best way to ensure that America remains an exceptional power—better than the predatory empires of the old world—is to remember that we are not inherently better at all.

The third backlash may prove most significant of all. Americans are right to cherish economic mobility. But the myth that America still enjoys exceptional mobility has become an opiate impeding efforts to make that mobility real again. When newly elected New York Mayor Bill de Blasio called for raising taxes on the wealthy to fund preschool and after-school programs, he was instantly accused of "class warfare," as if sullying the natural, classless reality of New York City life. Critics of the inheritance tax often invoke a mythic America where the people passing on multimillion-dollar estates to their children are latter-day Horatio Algers who have gotten rich because of their gumption and hard work. They do so even though the estate tax affects just over 0.1 percent of American families, the same tiny elite that in recent decades has used its massive economic gains to insulate its children from competition from the very economic strivers that opponents of the inheritance tax celebrate.

Since the 1970s, the conservative movement has used the myth of a classless America to redistribute wealth upward, thus hardening class divisions, at least relative to other nations. It's no surprise that the young, having no memory of the more equal, more mobile America of popular legend, see this reality more clearly. And because they do, they are more eager to change it. Unlike every other age group, which opposed the Occupy movement by double digits, millennials supported it [PDF] by double digits.

As millennials constitute a larger share of the electorate [PDF]—rising from 29 percent of eligible voters in 2012 to a projected 36 percent in 2016 and 39 percent in 2020—they are creating a constituency for politicians willing to both acknowledge America's lack of class mobility and try to remedy it. The key to such an effort is increasing the number of poor students who graduate from college. Having a college degree quadruples someone's chances [PDF] of moving from the poorest fifth of the population to the wealthiest. But educationally, many poor students fall so far behind so early that their chances of attending college are crippled by the time they leave elementary school. By eighth grade, children from wealthy families are already an astonishing four grade levels ahead of children who grow up poor.

There is evidence from France and Denmark that expanding preschool enrollment can significantly close this performance gap [PDF]. A Brookings Institution study found that enrolling low-income children in high-quality preschools could boost their lifetime earnings by as much as $100,000. Building on such data, de Blasio has famously proposed making preschool universal in New York City, to be paid for with a tax on people earning over $500,000 a year. Now New York Governor Andrew Cuomo has gone one better, promising universal preschool throughout the state. President Obama proposed something similar in his recent State of the Union address.

These efforts still face resistance, but they stand any chance at all only because of the growing recognition that America is not the highly mobile nation its cheerleaders proclaim it to be. To Mitt Romney, the public's growing alienation from this and other national myths may reflect a disturbing refusal to "believe in America." But "discontent," Thomas Edison once quipped, "is the first necessity of progress." And by challenging the comforting stories we tell about ourselves, a new American generation might just begin the long, hard work of making America exceptional again. Ω

[Peter Beinart is a contributing editor at National Journal and The Atlantic, an associate professor of journalism and political science at the City University of New York, and a senior fellow at the New America Foundation. Beinart received a B.A. (history and political science) from Yale University and was a Rhodes Scholar at University College, Oxford University, where he earned an M.Phil. in international relations. His books include The Good Fight: Why Liberals—and Only Liberals—Can Win the War on Terror and Make America Great Again (2006), The Icarus Syndrome: A History of American Hubris (2010), and The Crisis of Zionism (2012).]

Copyright © 2014 by National Journal Group

Sapper's (Fair & Balanced) Rants & Raves by Neil Sapper is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. Based on a work at sapper.blogspot.com. Permissions beyond the scope of this license may be available here.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves