Monday, January 18, 2010

WWWD — Realism And Idealism?

Stanford history professor David M. Kennedy offers another take on the Nobel Peace Prize awarded to the POTUS (44) in 2009. The eight years of the previous administration, with its unilateral swagger (mistakenly called the Texas version of “walking” by Bush 43), are gone, and the world looks forward to a new day. Will the POTUS (44) be able to serve up a blend of idealism and realism in the days and months to come? That verdict is still being deliberated. If this is (fair & balanced) watchful waiting, so be it.

[x Atlantic Monthly]
What Would Wilson Do?
By David M. Kennedy


To the delight of late-night television comedians, President George H. W. Bush used to talk incessantly about “prudence,” but in fact the term is a deadly serious watchword for the “realist” school of foreign policy. It was on realist grounds that the elder Bush refused to press on to Baghdad after defeating the Iraqi army in the Gulf War in 1991. However alluring the goal, he said, pursuing it “would have incurred incalculable human and political costs”; he was expressing the kind of unsentimental caution that is realism’s most important characteristic. In contrast, his son, George W. Bush, was arguably among the most idealistic of American presidents. The younger Bush believed that there is but a “single sustainable model for national success: freedom, democracy, and free enterprise”—the kind of universalizing ideological claim that idealists have traditionally embraced. On those grounds, he set out to sweep into Baghdad, depose Saddam Hussein, and compel Iraq to embrace that unique model—an instance of unchecked idealism whose full consequences remain to be seen. President Barack Obama must now strike his own balance between the claims of realism and idealism. But are these ways of thinking about foreign policy as incompatible as they seem?

After the Peace of Westphalia ended Europe’s religious wars in 1648, the principle of sovereignty came to dominate the theory and practice of international relations. Sovereignty was trumps. States had no right to intervene in the internal affairs of other states. Yet no “higher authority,” papal, Protestant, or otherwise, stood above them. Only the ceaseless exercise of power, especially by the weighty “great powers,” might hold contentious states in tenuous equilibrium.

That concept was eventually accepted as the foundation of the international system, forming the bedrock on which the foreign-policy school of realism has ever after rested its case.

Realism also has roots in antiquity, as famously recorded in Thucydides’ account of the invading Athenians’ diktat to the hapless inhabitants of Melos during the Peloponnesian War, in 416 B.C. The Athenians dismissed the Melians’ appeals to morality and justice with the chill calculus of might: “The strong do what they will and the weak suffer what they must.” If one sentence could be said to comprise a Realist Manifesto, that would be it. Realism insists that moral considerations and reveries about “international law” or permanent peace are not only utopian but dangerous. Raison d’état, or the national interest, is the sole standard that should guide any sovereign state’s external relations.

When Britain’s North American colonies struck for their independence in 1776, they simultaneously invoked and defied Westphalian principles. In the process, they introduced a new concept—“idealism”—into the lexicon of international politics.

The Declaration of Independence sounded a Westphalian note when it pronounced the Americans to be “one people” claiming their “separate and equal station” as a sovereign state. Yet at the same time, the American revolutionaries challenged inherited notions of sovereignty when they asserted that only certain kinds of states could be regarded as fully legitimate.

Thomas Paine’s 1776 pamphlet, Common Sense, is credited with convincing the American colonists that their cause was independence, rather than reconciliation with Britain—mainly because without independence they could not hope for foreign assistance. But we’d do well to remember that Paine opened his tract with a treatise on government. He eloquently anticipated the declaration’s claim that only republican states deriving “their just powers from the consent of the governed” were rightfully constituted. He then laid out the case for the kind of foreign policy that would secure the American republic—and suggested that a world composed of republics might be markedly more peaceful than the world of monarchies and empires that the Americans were repudiating. Common Sense was both the inspiration for American nationhood and the founding charter for American foreign policy.

From its birth, the United States thus infused its diplomacy with a revolutionary ideology that looked to the creation of a novus ordo seclorum in the international sphere as well as the domestic. That ideology constitutes the core of the idealist tradition in foreign policy. It is value-driven, morally infused, and devoted to concepts of international law and lasting peace. From its inception, idealism melded with an inherited sense of America’s providential mission to redeem the world. It also fitted easily with America’s emerging culture of mass democracy, and adumbrated some conspicuous facts about all popularly elected governments thereafter: that democratically accountable leaders must justify their foreign policies on grounds that the mass of citizens will accept—and that self-interest alone often appears inadequate to that end. In democracies, history attests, a measure of idealism may be necessary if a state is to sustain a coherent foreign policy.

Idealism was a prudential policy for an infant republic that could no more plausibly wield conventional power against the colossus of imperial Britain than Melos could against Athens. In that sense, idealism’s appeal to transcendent standards of justice was arguably what made it America’s most realistic policy in the nation’s early days. What’s remarkable is the degree to which the United States continued to honor those idealist precepts well after it ascended to great-power status.

Yet American diplomacy has proved most successful when it has tempered its idealistic aspirations with that canonical realist precept: the importance of limits. Balancing ambition with feasibility, ideational goals with material costs, has been perhaps the highest realist doctrine. Secretary of State John Quincy Adams gave it classic formulation in a Fourth of July address in 1821:

What has America done for the benefit of mankind? Let our answer be this... wherever the standard of freedom and Independence has been or shall be unfurled, there will her heart, her benedictions and her prayers be. But she goes not abroad, in search of monsters to destroy.... She well knows that by once enlisting under other banners than her own … she would involve herself beyond the power of extrication, in all the wars of interest and intrigue, of individual avarice, envy, and ambition, which assume the colors and usurp the standard of freedom.

In Adams’s day, the United States had limited capacity indeed to export its ideals. But in time, when its influence would reach to the farthest corners of the planet, how would the United States pursue its transformative agenda?

Throughout the 19th century, the United States remained a peripheral country preoccupied with subduing the North American continent. Jacketed by two oceans, with no neighbors to fear, it enjoyed a degree of security that few states have ever known. In the Old World chanceries where the great game of geopolitics was played, the United States mattered little. As late as 1890, the U.S. Army ranked 14th in size, after Bulgaria’s. In 1900, the Department of State counted a mere 91 employees in Washington. These were not the attributes of even a middling power. America’s ambition to transform the international order hovered beyond the horizon of some indefinite tomorrow.

But as the 20th century dawned, the United States was no longer so easy to ignore. It had grown to be the most populous country in the Western world, save Russia. It was the leading producer of wheat, coal, iron, steel, and electricity. It would soon command the world’s largest pool of investment capital. The Spanish-American War in 1898 and the subsequent annexation of Puerto Rico and the Philippine Islands dramatically announced that the United States had acquired the means to project its power well beyond its home continent.

That America now wielded immense potential strength to work its will in the world was evident. But when, if ever, and how, if at all, would that potential be realized? And what, exactly, was America’s will? Those questions excited an intense discussion in the 20th century’s opening decades, one with resonant echoes in our own time.

Anti-imperialists like Mark Twain advocated a return to isolationism. Unapologetic realists like Theodore Roosevelt urged the country to start behaving like a conventional great power, pursuing worldwide interests commensurate with its capacities. But Woodrow Wilson, whose ideas would eventually triumph, did not want his country to become just another great power.

The ideals of the Founders illuminated Wilson’s entire diplomatic program. His proposals, he said, constituted “no breach in either our traditions or our policy as a nation, but a fulfillment, rather, of all that we have professed or striven for.” When in 1919 he presented to the Senate the Versailles treaty, which included the Covenant of his beloved League of Nations, he declared: “It was of this that we dreamed at our birth.” But like John Quincy Adams, Wilson understood the difference between ideological aspiration and historical possibility. Like Adams, he fitted his ideals to the circumstances he confronted.

Perhaps the most famous distillation of Wilson’s thinking is to be found in his war address of April 2, 1917, when he said, “The world must be made safe for democracy.” That maxim, and the entire scheme of “Wilsonianism” that it is thought to represent, have often been derided as hopelessly idealistic. The realist George F. Kennan excoriated Wilson for “the colossal conceit of thinking that you could suddenly make international life over into what you believed to be your own image.”

That is a formidable criticism, but it is as misdirected at Wilson as it would have been at the Founders. Properly understood, Wilson’s simple declarative sentence—“The world must be made safe for democracy”—constituted a realistic as well as an idealistic lodestar for American foreign policy. It guided American diplomacy in the season of its greatest success, the half century following World War II. Wilson, maligned as a dewy-eyed idealist, should instead be celebrated as the original architect of America’s most realistic—and successful—foreign policies.

More clearly than his critics, Wilson recognized that the world now bristled with dangers that no single state could contain, even as it shimmered with prospects that could be seized only by states acting together—that it presented threats incubated by emerging technologies, and opportunities generated by the gathering momentum of the Industrial Revolution. Making such a world safe for democracy required more than the comforting counsels of isolation, and more than taking the inherited international order as a given and conducting the business of great-power diplomacy as usual. It required, rather, active engagement with other states to muzzle the dogs of war, suppress weapons of mass destruction, and improve both standards of living and international comity through economic liberalization. Most urgently, it required new institutions that would import into the international arena at least a modicum of the trust, habits of reciprocity, and rule of law that obtained in well-ordered national polities. These were ambitious goals, but they were also realizable, as time would tell.

At the heart of Wilson’s program lay the League of Nations. Yet for all the league’s apparent novelty, in Wilson’s view it honored a Westphalian objective: “affording mutual guarantees of political independence and territorial integrity to great and small states alike.” The league is best understood not as a revolutionary menace to the Westphalian system, but as an evolutionary adaptation of venerable practices to modern circumstances. Respect for sovereignty was its essence. Only “fully self-governing”—that is, sovereign—states, dominions, or colonies were eligible for membership. Most actions required unanimous consent. Lacking an armed force, the league depended on its member states, especially the great powers, for enforcement of its provisions.

Nor did Wilson propose a wholesale cession of American sovereignty to the new body. He was offering a kind of grand bargain: the United States would abjure its historic isolationism and agree to play an engaged international role—but only if the rules of the international system were altered in accordance with American goals, putting the world on a pathway to more international cooperation and better international behavior.

The ironic result is well known. Wilsonianism was stillborn at the end of World War I, with consequences that spawned the Great Depression and the Second World War. But when the United States emerged from that latter struggle, the story was dramatically different. Understanding the singular blend of idealism and realism that crystallized in that pivotal moment is essential to comprehending the success of the international post–World War II order—and the danger of forgetting its relevance to the 21st century.

America’s power at the end of World War II was exponentially greater than the power Wilson had wielded in 1918. The United States possessed the only intact large-scale advanced industrial economy on the globe. It held a monopoly, for the moment, on nuclear weapons. It boasted the world’s largest navy and massive long-range strategic air capability. It commanded fully half the planet’s manufacturing capacity. It held half of the world’s gold stocks and foreign-currency reserves. It was the leading petroleum producer and a leading exporter.

A cohort of American leaders decided to use that power in ways that finally set in train the kind of transformation the Founders had dreamt of and Wilson had sought in vain, bringing about what the political scientist John Ikenberry has called “America’s distinctive contribution to world politics.”

For all its might and occasional bluster, the United States in the post–World War II era preferred not to rule with peremptory Olympian majesty. It became not a traditional imperial power, but a “hegemon.” The word’s Greek roots denote a guide, or a leader, and leadership has been well defined as a relationship between consenting adults, not an arrangement between a capricious master and sullen subordinates.

On the occasion of the first gathering of the United Nations, in San Francisco on April 25, 1945, President Harry S Truman used words that could have been Wilson’s—or Paine’s: “The responsibility of great states is to serve and not to dominate the peoples of the world.” And although the United States undeniably continued to pursue what Wilson once scorned as its own “aggrandizement and material benefit” (considerations never absent from American foreign policy, nor should they be), what is most remarkable is the way Washington created what the Norwegian scholar Geir Lundestad has called an “empire by invitation.”

At its best, that unconventional “empire” paid deference to the norms of Westphalian sovereignty even while artfully modifying them. Washington did not compel subordination from other states, but rather provided incentives for willing participation. The empire’s architects understood the realist wisdom that, as Robert Kagan has written, “Predominance is not the same thing as omnipotence.” They appreciated that this was the moment to use America’s unrivaled power to shape an order that would, among other things, provide a hedge against the inevitable moment when America’s power was less. It was as if the Athenians had taken the Melians’ plea to heart: that it is in the interest of all, the strong as well as the weak, to recognize and obey stipulated rules of international behavior—not least because today’s strong states can become tomorrow’s weak ones. (The Athenians, who eventually lost the Peloponnesian War, had ample opportunity to reconsider the lesson of Melos.)

The Americans of that era helped to erect an array of multilateral institutions, including the United Nations, the International Monetary Fund, the International Bank for Reconstruction and Development (or World Bank), and the General Agreement on Tariffs and Trade, which would later evolve into the World Trade Organization. Membership in those institutions was generally open to all, excepting nations in the Communist bloc. Participating states ceded only marginal elements of their sovereignty. The United Nations in particular, held in check by the veto power of each permanent member of the Security Council, could not plausibly be described as a supranational world government. But taken together, these innovative institutions brought a measure of law and reciprocity to international politics.

The framework provided by this international regime began subtly, incrementally, to fulfill the Founders’ promise of a new world order, one that nurtured new norms of interstate behavior. The Marshall Plan, announced in 1947, catalyzed the process that eventually yielded the European Union. The North Atlantic Treaty Organization, formed in 1949, provided the security guarantees for Western Europe throughout the Cold War that made the maturation of the EU possible. The Nuremberg and Tokyo war-crimes trials, along with the UN’s Universal Declaration of Human Rights and its Convention on the Prevention and Punishment of the Crime of Genocide, established at least a minimal basis in international law for superseding sovereignty in the face of egregious crimes against humanity—though those precedents proved feeble against the likes of Pol Pot, Slobodan Milošević, and the predators of Rwanda and Darfur. Less palpably but no less importantly, that web of multilateral structures bred some measure of trust among sovereign states that had eyed each other warily at least since Westphalia.

Whatever their limitations, for nearly three generations those institutions constituted the major pillars supporting a global economic expansion of unprecedented reach. They also underwrote the advance of self-determination and democracy, as the colonial powers withdrew from Africa and Asia, the Soviet empire disintegrated, and open elections became the norm in countries that had not seen them in generations, if ever. No grand guerre erupted on anything remotely approximating the scale of the two world wars. The European continent was pacified after centuries of conflict—itself an accomplishment sufficient to distinguish the age. In this same era, Americans enjoyed economic prosperity and personal security unmatched even in the country’s singularly fortunate history. As international regimes go, the post–World War II era, despite the chronic tensions and occasional blunders of the Cold War and especially the tragedy of Vietnam, was on the whole a felicitous season in the world’s sorry history.

Most of those multilateral structures are now more than 50 years old. Many, probably all, need substantial reform. And even this impressive matrix of institutions may be ill-suited to the tasks of extinguishing radical Islamist terrorism and adapting to climate change. But international institutions well matched to the dawning century can rise only from the foundations of mutual trust that a half century of multilateral life cemented—and that the policies of the past several years have shaken.

Here is where the full meaning of Wilson’s call, “The world must be made safe for democracy,” becomes clear and compelling. Wilson tempered his diplomatic ideals with a pragmatic comprehension of the modern world, of its possibilities and its dangers. He respected the pride and the prerogatives of other peoples. He shrewdly calculated the reach as well as the limits of American power. Perhaps most important, he was attentive to what kind of foreign policy, resting on principles of moral legitimacy, the American public would embrace.

Franklin Roosevelt and Harry Truman took the lessons. They asked only that the world be made safe for democracy, not that the world be made democratic. They understood the complexities of human cussedness and the constraints on even America’s formidable power. They would surely have hesitated to wage a preemptive war against Iraq that grossly overestimated America’s capacity to achieve its goals.

The damage done by this distortion of the Wilsonian legacy has yet to be fully calculated. Future historians will take its measure not only in the worldwide surge of anti-American sentiment, but also in the erosion of confidence in the multinational institutions—including NATO, the UN, the IMF, the World Bank, and the WTO—that the United States itself had so painstakingly nurtured across decades.

In an age awakening to the global dimensions of environmental degradation, the fungibility of employment across national frontiers, massive international migrant flows, the unprecedented scale of international capital transactions, the contagious volatility of financial markets, and the planetary menace of nuclear proliferation—not to mention the threat of terrorism—that erosion of confidence threatens to deny the world the very tools it needs most to manage the ever more interdependent global order of the 21st century. It would be folly to abandon those tools, or to let them rust through inattention, particularly as new great powers arise to rival the last century’s hegemon. To do so would leave all nations, including the United States, markedly less secure. It would make a mockery of realism’s admonition to see the world as it is, not as we wish it to be. And it would dishonor more than two centuries of America’s unique tradition of interweaving idealistic and realistic principles in pursuit of a novus ordo seclorum.

The keys to American foreign policy in the post-WWII years were four: grounding policy in well-articulated, easily comprehended American values that recruited and sustained domestic support; honoring inherited notions of sovereignty and the rights of other states; seeking multilateral cooperation where possible and acting unilaterally only in extremis; and remembering that even America’s enormous power was finite and fluid, and should therefore be deployed to shape a world in which all states, not only the momentarily strong, had a stake. Taken together, those principles constitute a blend of realism and idealism, not a stark choice between them, and their careful application over several decades represents a singular achievement for American diplomacy. When they have not been followed, United States foreign policy has failed, sometimes catastrophically, as in Vietnam and Iraq. But when those principles have informed policy, they have achieved salutary results and, not incidentally, bolstered the nation’s moral stature in the world.

In his approach to conundrums like Iran’s nuclear program and securing Afghanistan, President Obama has shown that he understands the imperative to restore America to the footing of principled realism that has undergirded its most successful foreign policies. It is far too soon to judge the depth of his commitment to this long-term project, much less whether he will succeed. Perhaps receipt of the Nobel Peace Prize will strengthen his resolve. Surely its award testifies to the fact that the world shares with many Americans the hope that the United States might continue to champion the diplomatic principles that have served it so well since its birth. Ω

[David M. Kennedy is a Pulitzer Prize-winning historian specializing in the history of the United States. He is the Donald J. McLachlan Professor of History at Stanford University and the Director of the Bill Lane Center for the American West. He won both the Pulitzer Prize and the Francis Parkman Prize for Freedom From Fear: The American People in Depression and War, 1929-1945 (1999). Kennedy received his A.B. in History from Stanford and M.A. and Ph.D. from Yale.]

Copyright © 2010 The Atlantic Monthly Group


Copyright © 2010 Sapper's (Fair & Balanced) Rants & Raves

A Change In Thinking?

When this blogger was a wee lad (and living in a cave), he would interrupt his maternal grandfather during an anecdote about the "old days." The lad would ask, "Where was I, Grandpa?" And the old fellow (younger then than this blogger now) would reply: "You were on the side hill, eating grapes." So, in this roundabout reference to the time when this blogger was virtually on the side hill eating grapes, this post from Edge brought back memories of a time when this blogger forwarded unsolicited e-mail (with attachments or enclosed links) to a collection of acquaintances in Amarillo, TX. In the case of Edge, the recipients were members of the (pardon the oxymoron) Amarillo intelligentsia. That was then and this is now — the Age of the Blog. So, dear reader, if you get this far, rest assured that you are a member of the Cyberspace Intelligentsia because you clicked on a link to this blog. If this is (fair & balanced) self-delusion, so be it.

[x Edge]
The Edge Annual Question — 2010
How Is The Internet Changing The Way You Think?
By John Brockman




Read any newspaper or magazine and you will notice the many flavors of the one big question that everyone is asking today. Or you can just stay on the page and read recent editions of Edge....

Playwright Richard Foreman asks about the replacement of complex inner density with a new kind of self, evolving under the pressure of information overload and the technology of the "instantly available." Is it a new self? Are we becoming Pancake People — spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button?

Technology analyst Nicholas Carr wrote the most notable of many magazine and newspaper pieces asking "Is Google Making Us Stupid?" Has the use of the Web made it impossible for us to read long pieces of writing?

Social software guru Clay Shirky notes that people are reading more than ever, but the return of reading has not brought about the return of the cultural icons we'd been emptily praising all these years. "What's so great about War and Peace?" he wonders. Having lost its actual centrality some time ago, the literary world is now losing its normative hold on culture as well. Is the enormity of the historical shift away from literary culture now finally becoming clear?

Science historian George Dyson asks, "What if the cost of machines that think is people who don't?" He wonders, "Will books end up back where they started, locked away in monasteries and read by a select few?"

Web 2.0 pioneer Tim O'Reilly ponders whether ideas themselves are the ultimate social software. Do they evolve via the conversations we have with each other, the artifacts we create, and the stories we tell to explain them?

Frank Schirrmacher, Feuilleton Editor and Co-Publisher of Frankfurter Allgemeine Zeitung, has noticed that we are apparently now in a situation where modern technology is changing the way people behave, people talk, people react, people think, and people remember. "Are we turning into a new species — informavores?" he asks.

W. Daniel Hillis goes a step further by asking whether the Internet will, in the long run, arrive at a much richer infrastructure, in which ideas can potentially evolve outside of human minds. In other words, can we change the way the Internet thinks?

What do you think?

This year's Question is "How is the Internet changing the way YOU think?" Not "How is the Internet changing the way WE think?" We spent a lot of time going back and forth on "YOU" vs. "WE" and decided to go with "YOU" because Edge is a conversation. "WE" responses tend to come across like expert papers, public pronouncements, or talks delivered from a stage.

We wanted people to think about the "Internet," which includes, but is a much bigger subject than, the Web (an application on the Internet) or search, browsing, and the other apps that run on the Web. Back in 1996, computer scientist and visionary Danny Hillis pointed out that when it comes to the Internet, "Many people sense this, but don't want to think about it because the change is too profound. Today, on the Internet the main event is the Web. A lot of people think that the Web is the Internet, and they're missing something. The Internet is a brand-new fertile ground where things can grow, and the Web is the first thing that grew there. But the stuff growing there is in a very primitive form. The Web is the old media incorporated into the new medium. It both adds something to the Internet and takes something away."

This year, I enlisted the aid of Hans Ulrich Obrist, Curator of the Serpentine Gallery in London, as well as the artist April Gornik, one of the early members of "The Reality Club" (the precursor to the online Edge), to help broaden the Edge conversation — or rather to bring it back to where it was in the late 80s/early 90s, when April gave a talk at a "Reality Club" meeting and discussed the influence of chaos theory on her work, and when Benoit Mandelbrot showed up to discuss fractal theory and every artist in NYC wanted to be there. What then happened was very interesting. The Reality Club went online as Edge in 1996; the scientists were all on email, the artists not. Thus did Edge surprisingly become a science site, even though my own background (beginning in 1965, when Jonas Mekas hired me to manage the Film-Makers' Cinematheque) was in the visual and performance arts.

To date, 167 essayists (an array of world-class scientists, artists, and creative thinkers) have created a 130,000-word document. Ω

[John Brockman studied at the Babson Institute of Business Administration (renamed Babson College in 1969) and Columbia University. He is the founder of the nonprofit Edge Foundation, Inc. and editor of Edge, the highly acclaimed website devoted to discussions of cutting edge science by many of the world's brilliant thinkers, the leaders of what he has termed "the third culture".]

Copyright © 2010 Edge Foundation, Inc.


Copyright © 2010 Sapper's (Fair & Balanced) Rants & Raves