Saturday, May 01, 2010

Quo Vadis, Europa?

In the dawn of time, when people lived in caves, this blogger studied political geography. Halford Mackinder (British) and Friedrich Ratzel and Karl Haushofer (both German) furthered the thrust of geopolitics in this blogger's understanding of the global dynamic. In the following essay, Walter Laqueur considers the plight and fate of the European Union in 2010. If this is (fair & balanced) consideration of the entire world as the appropriate sphere for a nation's influence, so be it.

[x AI]
Better Fifty Years Of Europe?
By Walter Laqueur

Europe used to be, within the living memory of many of us, the cockpit of world power, prosperity and prestige. Today it is raw material for a Ouija board. Predictions about Europe’s future range from its impending suicide to its emergence as a unified, leading economic and political superpower. Of late most predictions, especially those coming out of Europe, have been on the dour and pessimistic side. So it is refreshing to come across a book like Steven Hill’s Europe’s Promise, which reaffirms the earlier optimistic take: The European model is not only superior to the American in almost every possible way, but also, as its subtitle proclaims, the world’s “best hope in an insecure age.” According to Hill, Europe’s vastly superior stores of smart power will even allow it to solve the problem of the Iranian bomb.

Optimism can be refreshing, however, even when it is neither correct nor justified. Hill describes the main features of a European way of doing things: legal-institutional, multilateral but elite-managed, consensual and slow to change. He refers to these as “social capitalist” impulses rather than the more common term, “social democratic,” but he does so without really defining what a European model is. And it’s never clear which features prevail in which countries, or whether the “promise” applies to the whole—the European Union—which in some ways exists and in others does not (a state of institutional indecisiveness that Hill considers a virtue).

It is not, therefore, unfair to ask which pieces of Europe’s “promise” America and others should look to for guidance. To Spain’s nearly 20 percent unemployment rate? To Italy’s surreal political melodramas under Silvio Berlusconi? To near-bankrupt Greece, Portugal or Ireland? To the para-democratic Balkans or still-struggling Eastern Europe? Surely not to Britain, which does not belong to the Eurozone. That seems to leave us with perhaps France and Germany, but their present leaders wouldn’t recommend their own models, which still want far-reaching reforms. There remains Scandinavia, of course, but Sweden, the biggest northern country, has fallen back substantially on the prosperity index. Norway has been doing well; its per capita average income is now $53,000, and it has the lowest murder rate in the world. But there are problems with generalizing the Norwegian model. What works in a country of only 4.8 million inhabitants is not necessarily applicable to a country of 300 million. Besides, Norway has the special advantage of North Sea oil, and it isn’t even a member of the European Union. Maybe a more accurate subtitle for Mr. Hill’s book would be “Why the Danish and Norwegian ways are the best hope in a secure age.”

I am perhaps being unfair. It may be too easy to ridicule Euro-optimists these days, especially ones whose writing bears more resemblance to the prospectuses of travel agencies recommending luxury itineraries at cut-rate prices than to serious political description and analysis. The Europeanists have gotten themselves into a strange fix. They have expanded their Union to the point of decision-making paralysis but would consider expanding still further. They cannot deepen the Union, lest residual memories of democratic accountability roil Europe’s individual national souls. But the Union may have to be deepened, for, as the Belgian politician Leo Tindemans noted in a famous report on the future of Europe more than thirty years ago, a house half finished will not last. As Greece (among others) has shown, economic union without considerably more political union will not work. The European Union has established new central offices but dares not staff them adequately. It has created a common currency and a bank to manage it but not the political counterpart to steady it in rough weather. The liberal immigration protocols it has enacted are stimulating a widespread anti-immigrant backlash, yet the demographic collapse of the native populations demands immigration to keep economies from collapsing as well. In nearly every sense, then, the European model, and the European promise with it, is locked in a “crisis of wishing.” The further the Europeanists try to go forward, the harder it is for them to move anywhere at all.

And yet, while the European model, whatever exactly we decide it is, may be in grave trouble today, the spirit that created and sustained it over half a century deserves serious consideration. From this experience, if not from any model, plenty can be learned.

The postwar generation of European elites aimed to create more democratic societies. They wanted to reduce the extremes of wealth and poverty and provide essential social services in a way that prewar governments had not. They wanted to do all this not just because they believed it was morally right, but because they saw social equity as a way to temper the anger and frustrations that lead to violence and ultimately to war. They had had quite enough of war.

For several decades, many West European countries nearly achieved these aims, and they had every reason to be proud of that fact. The reaction to this accomplishment in some circles in the United States, gripped by a Cold War-induced fear of “socialism,” bordered on the hysterical. Otto von Bismarck, the godfather of the welfare state, was not, after all, an extreme socialist. The usual line in the United States in those days was that Americans did not want to levy the extravagantly high taxes necessary to pay for such achievements, or to imperil liberty by financing the activities of a more activist state.

I for one never understood why the United States “could not afford”, as it was often phrased, the benefits of the so-called European welfare state, since America, richer than Europe, could have financed them at a considerably lower tax rate. To some extent that is what eventually happened anyway in the late 1960s and 1970s, but it happened unevenly. The United States ended up spending some 17.3 percent of its GNP on its health services, yet could not deliver care as comprehensive and equal in quality as that of France, which, like most other European countries, spends only about half that percentage. Extravagance and inefficiency are, one has to admit, relative and even fungible terms. And did a somewhat less activist state better preserve American liberty than would otherwise have been the case? Did more activist European states stifle democracy? These are very hard cases to make.

True, during the past few decades the European welfare state has been under growing pressure. Services have had to be cut, and anxieties mounted as expenditures continue to rise and budgets to shrink. The political economy of the welfare state is based on the assumption of substantial economic growth—a Ponzi scheme of sorts, yes, but not an unreasonable one. What if growth dwindles, however, or ceases altogether? These issues are now widely discussed in Europe.

Even if the welfare state in its present form proves unsustainable, however, it is only one dimension of the European model. Economic problems or not, no one seriously worries that EU-member state foreign policies, or even state economic policies, will be radically re-nationalized, or that there will be another war in the heart of Europe. There has been a great deal of talk lately, mainly in the context of the Greek economic crisis and rescue, about Germany being less willing to play banker and economic engine for Europe’s less well-performing members that have overspent themselves into penury. But this is just Germany becoming normal, Germany acting logically in a context (a united Germany and a much larger European Union) that is quite different from the one it signed on to in the mid-1950s.

Europe’s problems, however, belong to America as well as to Europe. Its crisis of wishing, if it cannot be resolved, will harm American interests. America is passing through a crisis of its own, not just an economic or financial crisis, but a crisis of both confidence and governance. Recovery may take years, as it did in the 1860s and the 1930s. It is one thing to ask, as Lawrence Summers did, how long the world’s biggest borrower can remain the world’s strongest power. It is another to ask how America can provide global common goods when it can’t solve its own elemental national domestic problems. In such times the international scene wants for a strong European Union that shares democratic values with the United States.

According to the Euro-optimists and some of the declarations coming out of Brussels lately, Europe understands the situation and is prepared to step up. But more realistic voices argue that Europe will not be a partner world power with America; it will rather play a mediating, pacifying role in world politics. Put another way, Europe may well be inclined not to offset America’s decline but only to help Americans to manage it, as Europeans managed their own decline some decades earlier.

This may explain why, according to the polls—the Pew Global Attitudes project as well as others—many more countries believe that Europe will play a more positive role in world affairs than the United States. One suspects that European popularity rests precisely on the assumption that Europe is powerless to interfere in other nations’ affairs, exert pressure, or complain about violations of human rights and other such pesky matters. A wealthy region that punches below its weight can be attractive for any number of self-interested reasons.

Europe, for example, is the most important global donor to needy countries, contributing about €60 billion out of a total of €80 billion. America and the European Union also cover more than half of the UN operating budget. One would expect such massive soft power to translate into influence, but this has not been the case. European influence at the United Nations has been “hemorrhaging,” in the words of the Guardian, which is hardly a stalwart supporter of the West or a bitter enemy of the UN. Whether the issue has been Zimbabwe, Sudan or Burma, or some other place where blatant violations of human rights were taking place, the West invariably has been outvoted. This trend has been clearest in the Human Rights Council, where European representatives have been marginalized into despair and a pervasive sense of futility. That is no surprise, of course. The Council, after all, was constructed to protect, not pressure, human rights violators. It is unusual, however, when Europe is sidelined at events like the recent Copenhagen climate change conference; when the American President has to be reminded by his staff to mention Europe and NATO in major foreign policy pronouncements; and when China and even Russia have not been very respectful either.

All of this is self-evident; but how do we explain it? Several reasons come to mind. For one thing, Europe’s decline reflects the changing global balance of power. Europe’s prior source of strength, its economy, is no longer so vibrant. Europe will recover to a certain extent, but for demographic reasons if for no others it will not recover its former leading position. Europe’s weakness also stems from its energy dependence on Russia and the Middle East, and from social unrest linked to large numbers of unassimilated immigrants.

Immigration may be necessary to keep European economies going and its welfare states financed, but it causes political tensions. The decisive issue is not even whether European cities will have a Muslim majority thirty years from now, but whether the immigrants will be integrated, whether they will contribute to the culture, competitiveness and general strength of their adopted countries as earlier waves of immigrants did. Integration will take place in the long run; predictions of Eurabia are, I think, exaggerated. But it will take at least a few generations and the strain of mutual adaptation will affect European foreign policies in the meantime.

That’s foreign policies, plural. As has already been noted, the elusive promise of genuine unity lurks right at the center of Europe’s crisis of wishing. If Europe were serious about maintaining its status as a global power, its elite would hammer out common foreign, defense and energy policies, and build the institutions to sustain them. But the elites have barely been able to manage a common agricultural policy, for all that has been worth to Europe as a whole. To build genuine coordination, let alone genuine agreement, in political and security spheres has proved impossible, so the elites have been reduced to pretending. According to the Lisbon Treaty, national interests and national sovereignty will be subordinated to the resolutions of the institutions of the European Union. If this were to actually happen, it would be a revolutionary step toward actual European unity. But this resolution is not worth the paper on which it is written. It is unthinkable that France (or, indeed, any European country) will subordinate its own national interest to those of the European Union. The European governments do not want it, and European voters want it even less.

Europe hasn’t even been very good at pretending that it is serious about a common European foreign and defense policy. If it were, it would have chosen some politicians of international renown to give the new set-up the appearance of importance. Instead, the wizards behind the curtain chose two unknowns who lack both experience and reputation: the British Baroness Catherine Ashton, who began her political career with the Campaign for Nuclear Disarmament in 1977–83 (embarrassing questions arose during this period about the financial aid given to this group by the Soviet government); and Herman van Rompuy, a former Belgian Prime Minister who has left even most Belgians unimpressed. Their welcome has not been enthusiastic: One widely cited source called them “garden gnomes.” It would be churlish for me to comment further on their qualifications.

Meanwhile, what progress has been made in solving real problems such as Europe’s weaknesses in energy and defense? “Energy is what makes Europe tick” and “The time is ripe” are the official slogans for the former issue. Europe declared a “sustainable energy week” in March and set up a nuclear energy academy; but the dependence on Russia and the Middle East is effectively growing as North Sea oil reserves are shrinking.

The slogan for European defense is, “A secure Europe in a better world.” Some small forces have indeed been stationed in eastern Chad (a task more fit for the United Nations or the African Union). But the small rapid-reaction force that has been in the making for twenty years or more still needs sixty days to deploy—no sensible or literate person’s definition of rapid. In any case, none of these groups has ever been deployed, and there is the suspicion that they exist only on paper.

But it is not Europe’s economic, institutional and military weakness that is really the key to its troubles, and to the problem Europe’s weakness poses for the United States. At the core, the problem is conceptual. The Euro-elite thought it saw the shape of the future. It believed it was aligning itself with key global trends and would be in a position to advance those trends. The introduction of a common currency was such an epochal event, the elites believed, because, as the Lisbon conference of 2000 put it, a “quantum shift” would enable Europe “to become the most competitive and dynamic knowledge-based economy in the world.” In other words, military strength was outmoded. Power would now be measured in globalist economic terms. It was against this background that a new literature and a new ideology appeared: The 21st century was to be the century of Europe. Its values would become the values of the world: exemplary democracy, an unqualified respect for human rights, sustainable economic growth, stability-oriented monetary policy, social justice. With its enormous transformative power, Europe would run the 21st century.

Nice try; bad guess. And the bad guess has exposed the crisis of wishing. The Euro-elites were hardly ever serious about building a political union that would require far-reaching concessions concerning national sovereignty; they saw no need for such sacrifices in a world in which power politics no longer played a significant role. Now they find themselves in a world in which power politics still matters, and they are weaker and less prepared to engage in such politics than ever.

The idea that economics would trump politics supposed, implicitly for the most part, that morale could flow from affluence and social security alone. It does not seem to have worked out that way. Europe has been affluent and its population socially secure for the most part, but it has been suffering a subacute case of abulia—a psychological term first used in the 19th century to connote listlessness and apathy. No one has as yet provided a satisfactory explanation for this condition, either regarding individuals or societies. It has been connected, of course, with a decline in Europe’s self-confidence, but that only raises the question of why Europe’s self-confidence has been declining.

It seems to have nothing to do with economics and everything to do with beliefs—specifically, belief in the values for which the society stands. Many Europeans cannot figure out for sure what those values are, for the Euro-elites seem to have been struck dumb in this sphere as in no other. The sense of involvement in a great mission, of preaching the virtues of a better world, has vanished. The closest thing to a shared noble cause is now an anodyne, lowest-common-denominator environmentalism. It is hard to generate much enthusiasm for the commandment to separate green glass from brown. The European model has thus approached that of Latin America, whose countries have a common ancestral culture, generally live in peace with each other, and fail to cause the rest of the world much trouble.

What does the model promise? What will Europe be like ten or twenty years from now? With a little luck it will gradually recover from its present economic difficulties. With a little luck, too, the domestic transformation resulting from the changes in its ethnic composition will be gradual and relatively peaceful. Will Europe be a political and cultural center? The prospects are poor; it will have to soften its voice as a champion of human rights, as befits its reduced standing in the world.

Some will smile at Europe’s comeuppance. Oh, how the braggarts have been brought low, the insufferably smug do-gooders put in their place. But Schadenfreude would be unwarranted, especially coming from Americans. It is not as if there were no need for a world power that expresses European values and validates the European aspirations and achievements of the past half century. The hopeful assertions of Kishore Mahbubani and others about the loss of Western moral authority and the ascendancy of Eastern leadership seem a little premature, or we should in any event hope so. New Asia might be more efficient than old Europe for the time being, but as for moral values, Alfred, Lord Tennyson’s feelings, expressed some 150 years ago, still seem closer to reality: “Better fifty years of Europe than a cycle of Cathay.” Ω

[German-born Walter Laqueur attended the Hebrew University in Jerusalem in 1938-1939. Laqueur lived and worked in Israel, France, and Britain during and after WWII. In 1967 Laqueur moved to the United States, where he was appointed Professor of the History of Ideas at Brandeis University. In September 1977, Laqueur was appointed university professor in the Department of Government at Georgetown University. Concurrently, he was a faculty member of the Center for Strategic and International Studies in Washington, DC. Laqueur is the author most recently of The Last Days of Europe (2007) and Best of Times, Worst of Times: Memoirs of a Political Education (2009).]

Copyright © 2010 The American Interest

Copyright © 2010 Sapper's (Fair & Balanced) Rants & Raves

Presidential Ethics? — Or, Just Be Thankful Johnny Edwards (D-NC) Wasn't Elected POTUS!

Robert Dallek, presidential historian par excellence, offers a review of presidential duplicity (and other charming behavior tics) since the time of Woodrow Wilson (early 20th century). If this is (fair & balanced) praise of folly, so be it.

PS: This blogger offers a mea culpa. The links to the sources and the endnotes will require the poor visitor to scroll back in the text to that point of virtual departure.

[x PSQ]
Presidential Fitness And Presidential Lies: The Historical Record And A Proposal For Reform
By Robert Dallek

Is an ethical presidency—one that can be defined as lawful and honest in its public dealings—possible in the twenty-first century? Thomas Jefferson would have had his doubts: "Whenever a man has cast a longing eye on [power]," he wrote, "a rottenness begins in his conduct."

In the last century, numerous commentators on politics in general and on America's presidents in particular saw ethics and government as incompatible. George Orwell, for example, did not think that politicians and political parties were capable of truth telling. "Political language," he declared in 1946, "is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind." Alexander Solzhenitsyn (1974) said that in Soviet Russia, "the lie has become not just a moral category but a pillar of the State."

There are, of course, vast differences between the moral transgressions of an Adolf Hitler or a Josef Stalin declaring the sanctity of their crusades to justify their killing machines and the moral obtuseness or ethical lapses observed in American politics and displayed by American presidents. Yet several aspects of presidential conduct—past, present, and future—raise ethical issues that merit serious examination. While nothing like the evil afflicting Nazi Germany or Soviet Russia captured American politics in the twentieth century, the history of the American presidency over the last hundred years has been less than a model of moral purity—either by the individuals holding the office or by their administrations. Misleading the public, in particular, has been an all too common feature of the post–Civil War presidency. Presidents have misled the public through both omission and commission—through concealment and secrecy as well as lies and deceptions. This essay examines three types of such conduct: lying to the public about personal health problems; lying about public policies, especially in the areas of foreign relations and national security; and hiding wrongdoing within the executive branch.

Perhaps ironically, presidents are mindful of the public's insistence on believing that the nation's principal office holder, as the representative of its highest values, is a person of unquestionable morals. Yet presidents, and candidates, have repeatedly risked their ability both to win the White House and to govern by taking, sanctioning, or turning a blind eye to questionable actions. Why?

The answer, I believe, is that these ambitious politicians crossed ethical lines in the conviction that doing so was necessary to ensure their own victories, whether in gaining the Oval Office, remaining there through reelection, or implementing policies that they considered essential both to the national well-being and to their own reputations as effective or even great presidents.

America's "moral flabbiness," William James (1920) said, is an inevitable product of the country's obsession with "the bitch-goddess success." Given how driven someone must be to run for and then win the presidency, small wonder that such individuals play fast and loose with ethical standards that they fear might inhibit their freedom first to get to the White House and then to achieve big things. Moreover, however much sitting presidents deny a preoccupation with their historical standing, they are intensely concerned with their reputations. They are especially aware of the small number of presidents who have had the public appeal to win more than one term—only 7 out of 18 in the twentieth century—and how many of their predecessors faded into obscurity or have been ranked as undistinguished occupants of the Oval Office.

American involvement in international politics and the rise of the national security state in the 1940s made the need to protect America from foreign dangers an additional reason for presidents to see ethical abuses as acceptable. In 1948, General Omar Bradley described the world as having "achieved brilliance without wisdom, power without conscience." The postwar world, Bradley said, "is a world of nuclear giants and ethical infants" (1967, 410). President Richard M. Nixon later gave resonance to Bradley's complaint when he famously dismissed accusations of presidential wrongdoing in domestic and foreign affairs by asserting that "when the President does it, that means that it is not illegal" (Nixon 1977). Nixon was a highly intelligent man, of course, as well as a lawyer by training. This statement, however, suggests a rather stunted sense of political ethics as well as a faulty comprehension of constitutional law.

Nixon clearly went too far in his casual regard for ethical standards, paying the price when the Watergate scandal compelled him to become the only American president ever to resign, but he was hardly the only occupant of the White House to cut ethical corners at home and abroad. The catalogue of breaches is too lengthy to recount in detail, but some of the more glaring examples make the point. This essay examines such examples in three areas of presidential conduct. The first involves cases in which chief executives concealed or lied to the public about their personal medical conditions. The second involves episodes in which the president sought to hide some form of scandal or corruption within a presidential administration, sometimes involving the president directly, sometimes not. Finally, the third involves instances in which the president concealed or lied to the public about matters of state involving U.S. foreign relations and national security policy. These cases are arguably the most important and most complex. The modern presidency provides ample examples of each.

Health Conditions and Presidential Ethics

In 1888, when Benjamin Harrison credited the dictates of Providence for his narrow Electoral College victory over Grover Cleveland, who actually beat Harrison by 100,000 popular votes (American Presidency Project n.d.), Pennsylvania Republican boss Matthew Quay said, "Providence indeed! If only he knew how many men had to approach the gates of the penitentiary to put him in that office." Like Harrison, Cleveland was also tone-deaf to inappropriate behavior, most specifically in deceiving the public about his health, which casts a retrospective shadow over his presidency. During the 1892 campaign, Cleveland hid from the public a cancer of the jaw that might have denied him a second presidential term. After his reelection, surgeons secretly removed the cancer, and fortunately for Cleveland and the country, he was not incapacitated. But it was an act of deception that elevated his ambition for the presidency over the national well-being. The public did not learn of Cleveland's medical problem until 1917, years after his death, when his lead surgeon published an article in a popular magazine. Cleveland's long-held concealment provides the first of several examples of modern presidents hiding health problems from the public.

Woodrow Wilson, a great moral visionary who aimed to end war and make the world safe for democracy, committed an inexcusable ethical breach in 1919-20 when he refused to resign the presidency after suffering a major stroke. The White House, led by Mrs. Wilson, hid the seriousness of the president's condition from the public. In the words of one historian, Wilson's affliction shattered his "physical constitution, psyche, and sense of reality" (Link 2002, 365-88). He was incapable of dealing with a resurgence of American isolationism, a major recession, and the Red scare. At a critical time in the country's history, the executive branch of government came to a standstill.

Wilson was not the last president to deceive the country about personal medical problems that jeopardized the success of an administration. In 1944, Franklin D. Roosevelt was in failing health when he ran for a fourth term. Although his doctors did not tell him that he was suffering from an enlarged heart and severe hypertension that threatened his life, Roosevelt experienced significant weight loss, headaches, fatigue, and an inability to concentrate for sustained periods of time. Surely the president was aware of these symptoms. If so, he must have recognized that he was in declining health, might not be able to function effectively for another four years, and even risked dying in office. His gray, gaunt, slack-jawed appearance, dull eyes, and trembling hands were the signs of an aging man in sharp decline. At Yalta in February 1945, Lord Moran, Winston Churchill's physician, observed that Roosevelt was "a very sick man. He had all the symptoms of hardening of the arteries of the brain in an advanced stage." Moran gave the president "only a few months to live" (Dallek 1970, 1979).

In running for reelection under these circumstances, FDR committed a terrible ethical breach, as did the people close to him, and at a time when the country faced grave end-of-war and postwar problems. Just the fact that Roosevelt never informed Vice President Harry S. Truman of the development of the atomic bomb during the two and a half months of their administration is enough to make the ethical case against FDR for deciding to serve a fourth presidential term. One can only imagine the difficulties the country would have faced if increasing incapacity instead of a life-ending aneurysm had afflicted Roosevelt during the nearly four years remaining in his term.

The concealed health problems of John F. Kennedy and Ronald Reagan raise additional ethical questions about the public's right to know a presidential candidate's full health history. In 1960, JFK purposely denied the variety of physical troubles that had hospitalized him nine times in the late 1950s. Indeed, this medical history only came to light when I gained access to the Kennedy medical records in 2002. An honest acknowledgment of his medical condition during the 1960 campaign—spastic colitis, Addison's disease, severe back pain that drove him to take a variety of pain killers and sleep medications—would surely have jeopardized his bid for the White House, which he won by the narrowest of margins (Dallek 2003).

In 1984, as he ran for reelection, Ronald Reagan publicly showed no signs of any lingering effects of the March 1981 Hinckley shooting that had partly incapacitated him. For six months after he was shot, Reagan was unable to resume his full presidential duties. During that time, according to biographer Lou Cannon, Reagan would frequently become "exhausted to the point of incoherence" (1991, 410). Yet neither Reagan nor his cabinet acted to transfer the president's authority to the vice president, as provided for under the Twenty-fifth Amendment. Another biographer, Edmund Morris (1999), asserts that Reagan never fully recovered from the 1981 shooting. Morris believes that during the last two years of his presidency, Reagan's health problems caused him to lose control of his administration. We will not know the full extent of this development unless and until we gain access to White House records about Reagan's behavior.

Ironically, the public has often been in league with sitting presidents in hiding their medical problems. Public complicity in these cover-ups stems from a widespread reluctance to learn that a president is too unhealthy to attend to the country's business: an incapacitated president is a threat to the stability of the economy and to the government's ability to deal with foreign dangers. In the public culture, as in the individual psyche, such troubling thoughts are frequently repressed. As the constitutional scholar Sanford Levinson (2006) has asserted, in a book calling for a convention to reform what he sees as our outdated constitution, this is the sort of denial one would expect in a monarchy, where, whatever the sovereign's failings, the public needs to believe that the emperor is wearing clothes. Such denial undermines the effective functioning of any democracy.

Yet whatever the public's reluctance to deal directly with presidential health difficulties, aspirants for the office should be ethically bound to candidly address their physical and psychological limits to govern effectively. The national well-being obviously takes priority over any candidate's reluctance to acknowledge potential incapacity. True, since 1967, the Twenty-fifth Amendment allows a president to respond to this problem in rational, ethical ways. But the history of executive branch resistance to even acknowledging such health problems, let alone suspending a president's authority, does not generate much confidence in the usefulness of this constitutional remedy. The likelihood of a physical collapse should be a significant consideration in putting anyone in the Oval Office. New methods must be found to ensure that the health status of a president or a major candidate is taken into account.

Mental Health and Fitness for Office

And what about psychological disability? The frustration that Lyndon B. Johnson experienced over the Vietnam War, according to Bill Moyers and Richard Goodwin, two of his aides, made him paranoid and depressed. Moyers described it as "a pronounced, prolonged depression. He would just go within himself, just disappear—morose, self-pitying, angry.... a tormented man." Secretary of State Dean Rusk also worried about Johnson's stability, as did Mrs. Johnson and Senator J. William Fulbright. Moyers told me that he went to see Mrs. Johnson about the president's "paranoia." At one point, in a private meeting, Fulbright found Johnson more "insecure" and "frenetic" over Vietnam than he had ever seen him. He thought Johnson beyond "rational discussion" of the war. "The senator feared that while Johnson was in this mood he was capable of almost any recklessness, including the bombing of China" (Dallek 1998, 372).

Although Goodwin and Moyers considered consulting a psychiatrist about Johnson's state of mind, they never did. Nor did Rusk or anyone else in the administration consider doing anything about suspending the president's authority under the Twenty-fifth Amendment. The country survived the Johnson presidency without a disaster beyond Vietnam. But it seems highly imprudent, indeed unethical, to have left such a troubled man, apparently incapable of rational judgment, in this position of vast authority.

Richard Nixon's behavior during the 1973 Yom Kippur War was equally troubling. By October of that year, the Watergate scandal was threatening Nixon's impeachment and removal from office. Six days into the conflict, at 7:55 in the evening, when Brent Scowcroft told Henry Kissinger that the British prime minister wanted to speak to the president, Kissinger described Nixon as "loaded." They agreed that the conversation would have to wait until the next morning.[1]

Two weeks later, when the Soviets threatened to send a paratrooper brigade into the Sinai desert to rescue the Egyptian Third Army, which was surrounded by Israeli forces, Kissinger and Al Haig, Nixon's chief of staff, agreed that they needed to convene a meeting of national security officials to consider how to prevent Soviet action. In a telephone conversation at 9:50 in the evening, when Kissinger asked Haig where the president was, Haig reported that Nixon was asleep. Astonishingly, these two senior government officials agreed not to wake him. At a meeting later that night, seven national security officials agreed to raise the defense condition—the nuclear alert—from DEFCON IV to DEFCON III. Nixon remained asleep, or was so inebriated or sedated that he could not be consulted; the officials made this decision without direct presidential authority.

When the Soviets backed down the next day, Nixon congratulated Kissinger on having done "a hell of a job." Kissinger agreed to Nixon's request to create the public impression that it was the president who had managed the crisis, but when Nixon held a press conference to trumpet his "achievement," Kissinger complained to Haig that "the crazy bastard really made a mess with the Russians." He saw Nixon as having rubbed the Soviet retreat in their faces. He feared that Leonid Brezhnev "will not take this. This guy over there is a maniac also," suggesting that Nixon was, too.

How ethical was it for the Johnson and Nixon aides to hide presidential behavior that they believed was detrimental to the public good? Was there no way for them to have put the national well-being above the fiction of their president's competence in managing war crises? These aides were public servants. Some sense of moral obligation to the public and the polity ought to have compelled them to chart such a course. Unfortunately, as noted earlier, the Twenty-fifth Amendment provides at best a limited, faulty means for doing so. Even though that amendment allows for suspending a president's authority when he is unfit to wield it, this mechanism has never been used, despite several situations that warranted it. To resolve such ethical dilemmas in the future, new tools must be put into place.

A panel of independent medical experts might become part of the process. This panel could advise the public on whether a candidate is healthy enough to serve as president. This does not mean that the public will necessarily follow the panel's lead, but if it does not, and the panel proves to be more foresighted than the public, we should be able to recall a president. This would require a constitutional amendment allowing voters to recall a president in a referendum: 60 percent of the House and the Senate would need to approve a measure putting a recall referendum before the voters, and a majority of voters would be sufficient to remove the president. The use of this recall mechanism would not need to be limited to health problems.

Scandals, Corruption, and Presidential Ethics

If it were only hidden health problems that raised questions about presidential integrity, we could certainly imagine a future with men and women in the White House who meet the country's highest ethical standards. But the history of presidential wrongdoing reaches well beyond that. Major scandals involving financial corruption have tainted multiple presidencies. Teapot Dome under Warren G. Harding is one case in point. Another is Nixon's use of public monies for private advantage during his presidency: "I am not a crook," Nixon told a press conference in 1973. That phrase reminded people of his 1952 Checkers speech and why his critics had dubbed him "Tricky Dick." More disturbing aspects of the Nixon presidency, of course, were the abuses of power in the Watergate burglary and the White House cover-up that followed it.

One can search in vain for an administration in the twentieth century that did not have some examples of White House missteps by men and women appointed by, if not close to, the president. Roosevelt's New Deal gave birth to the term "boondoggles." Cries of cronyism marred Truman's time in office, with merchandise including deep freezes and mink coats changing hands. Dwight D. Eisenhower's chief of staff, Sherman Adams, was forced from office for influence peddling. In 1960, allegations of ballot box stuffing to elect John F. Kennedy in Illinois and Texas marred the election's outcome and JFK's elevation to the presidency. The resignations of Nixon vice president Spiro Agnew and Jimmy Carter budget director Burt Lance, both for earlier financial wrongdoing, are additional examples of dishonesty that sullied presidential reputations. Complaints about Bill Clinton's lying led to his impeachment. And however unwarranted the impeachment was by any constitutional standard, it has only deepened the public cynicism and distrust surrounding the modern presidency.

On December 30, 2007, the Washington Post ran a front-page story called "Sorting Truth from Campaign Fiction." The Post asserted that every one of the major candidates in both parties had fabricated stories to make themselves more appealing to voters. "All these claims," the paper concluded, "are demonstrably false."

None of the foregoing is to suggest that recent presidential history is solely the story of unethical chief executives abusing their power for personal aggrandizement, whether political, financial, or historical. Despite wrongdoing by officials in many of these administrations, presidents themselves generally were not directly involved. Even the hapless Warren G. Harding was more a victim than an architect of the corruption that beset his abbreviated presidential term, famously declaring, "It's not my enemies, it's my friends, my goddamn friends who have me walking the floor at night."

Harry Truman, whose administration was vulnerable to the famous K1C2—Korea, communism, and corruption—campaign attack in 1952, was the victim of his affinity for less than virtuous associates. In his long career dating from his ties to Kansas City's Pendergast machine, he never made a dishonest dollar, but some of the people around him during this time certainly did.

Lyndon Johnson entered the White House in 1963 with a reputation for ethical lapses.[2] In 1937, for example, when he first ran for a congressional seat in Texas, Mrs. Johnson urged her husband's campaign manager to avoid mudslinging. She wanted Lyndon to be a "gentleman," she explained. "Well," the manager responded, "Do you want him to be a gentleman or a congressman?" LBJ's disputed 87-vote Texas primary victory in 1948 won him the nickname "Landslide Lyndon," and his acquisition of lucrative radio and television properties opened him to accusations that he had used his House and Senate offices to exert improper influence. Upon becoming president, however, he took precautions to prevent corruption or improprieties from tainting his administration. He warned male associates that if any of them engaged in questionable conduct, he would cut off a vital part of their anatomy. When his longtime aide Walter Jenkins embarrassed Johnson by getting arrested for indecent sexual behavior in a downtown Washington YMCA men's room, Johnson insisted on his immediate resignation.

In the end, of course, Johnson could not escape himself. As the ancient Greeks said, fate is character. Johnson's machinations over Vietnam created public doubts about his trustworthiness. "How do you know when LBJ is telling the truth?" Johnson's critics joked. "When he pulls his ear lobe, rubs his chin, he is telling the truth. When he begins to move his lips, you know he's lying."

It seems essential to note that despite the ethical problems that have dogged so many administrations, U.S. history also holds numerous examples of wise, constructive presidential leadership serving the national well-being at home and abroad. These demonstrations of prudent, ethical conduct to advance the public good should serve to counter public cynicism about White House wrongdoing. Nevertheless, unethical actions are an integral part of this history and can only be ignored in a romanticized recounting that aims more to burnish the country's self-image than to yield a realistic assessment of its institutional strengths and weaknesses. Theodore Roosevelt was contemptuous of flag wavers who put loyalty to the president ahead of truth telling. He said that it would be "base and servile" and even "morally treasonable to the American public" not "to blame a president when he does wrong" (Roosevelt 1918). That statement is a useful guide in examining ethical issues in the presidency.

Presidential Secrecy and Deception in Foreign Policy and National Security

In its earlier discussion of presidential health problems and their concealment, this essay suggests a clear dividing line. Hiding a serious health issue, on the part of a U.S. president or a major presidential candidate, is a violation of the public trust and therefore unethical. And the preceding discussion of past presidential misdeeds and unlawful lapses by presidential aides and associates can stand as a cautionary tale against future ethical failings in this new century. In these two sets of cases, the line between ethical and unethical behavior is sharp and distinct. But this also brings forward a gray area of presidential line crossing that deserves the closest possible scrutiny. I think here of executive decisions to bend existing rules to defend the nation from destabilizing and hostile influences that might jeopardize the country's safety or national security.

American involvement in international affairs beginning with World War II has challenged presidents to make unwelcome decisions that violate traditional assumptions about appropriate executive behavior. By definition, these crucial decisions on matters of state forced the nation's chief executives to face ethical dilemmas of a sort their predecessors seldom contemplated. Spying on both foreign and alleged domestic enemies, covert operations aimed at overturning unfriendly foreign governments, and assassination plots have been a significant part of what presidents believed essential to guard against external and internal dangers.

Retrospectively, a number of these actions have seemed excessive and counterproductive. To be sure, the Federal Bureau of Investigation, Office of Strategic Services, and Central Intelligence Agency had their successes in countering hostile actions that may justify diminished regard for other nations' sovereign rights. But all of these operations, however necessary they seemed at the time, should eventually be revealed as a firewall against future misjudgments in the name of national security. The price of liberty, Thomas Jefferson observed, is constant vigilance. Examples of such ethically questionable policy choices span the past half century and more: the Eisenhower administration's covert operations to overthrow unfriendly governments in Iran and Guatemala; John F. Kennedy's Bay of Pigs attack on Fidel Castro's Cuba and subsequent plots to destabilize his government and even assassinate him; the Kennedy-sanctioned toppling of Ngo Dinh Diem in South Vietnam in 1963; Lyndon Johnson's Gulf of Tonkin Resolution in 1964 on dubious claims of North Vietnamese attacks on U.S. destroyers; Richard Nixon's complicity in overthrowing Salvador Allende's Chilean government; the Reagan administration's efforts to oust Nicaragua's Sandinistas; and the catalogue of wrongs alleged against the recent administration of George W. Bush, including, most notably, the sanctioning of torture and the invasion and occupation of Iraq justified by unfounded fears of weapons of mass destruction.

These are all examples of foreign policy actions that are not only questionable on ethical grounds but also resonate today as unwise initiatives that ill served the nation's security. Ongoing tensions with Iran can be traced back to the Eisenhower administration's toppling of Mohammed Mossadegh. Our enduring breach with Cuba's Castro regime is in part the result of Kennedy's unwise actions. The coup against Diem did more to destabilize South Vietnam and deepen U.S. involvement in the Vietnamese conflict than to promote Saigon's autonomy and our early exit from the fighting. The Tonkin Gulf resolution allowed Johnson to justify his escalation of U.S. involvement in Vietnam without additional congressional consultations about committing the nation to a large-scale war. The ousting of Allende inflicted a repressive Pinochet government on Chile and deepened antagonism to the United States across Latin America, as did our anti-Sandinista actions in Nicaragua. The Iraq War has cost significant numbers of American and Iraqi lives and undermined America's international standing and George W. Bush's presidency.

I am not categorically opposed to secret operations in the service of the nation's security. There are times when extraordinary actions may be necessary to defend the United States against external threats that could inflict serious losses in blood and treasure. In a dangerous world, "dirty hands" may sometimes be required to advance a greater good, such as protecting vulnerable populations or bringing to justice the perpetrators of egregious international crimes. Would anyone today question Roosevelt's judgment if he had directed the Office of Strategic Services to assassinate Adolf Hitler or any of the other high Nazi officials who drove the world into a war that cost 50 million lives and committed such heinous crimes against humanity? Who now can make a persuasive ethical case against Israel for having violated Argentina's sovereignty in bringing Adolf Eichmann to justice? Yet in principle, presidents should not sign on to coups against foreign governments, nor should they deceive the public about foreign policy, as they typically have. Presidential acts of deception in such matters may rightly be condemned as unethical violations of democratic standards.

Lyndon Johnson's expansion of the Vietnam War is a prime case in point (Dallek 1998). In 1966, he tried to hide or at least mute his commitment to a wider ground war in Southeast Asia. In January, after deciding to increase the number of ground forces in Vietnam by 120,000 men, he hid his decision from the press and the public by announcing 10,000 monthly troop deployments over the next 12 months. The deception denied the Congress, and the country more generally, the chance to have a full-scale debate on the wisdom of an expanded Vietnam conflict. It was an abuse of presidential power and went far to trap the United States in what turned out to be an unwinnable war that cost the country more than 58,000 lives, not to mention the millions of Vietnamese, Laotians, and Cambodians who also paid the ultimate price for Johnson's decision to expand the war. We would likely have a different take on Johnson's actions if he had preserved South Vietnam's autonomy in a more limited conflict that cost significantly fewer U.S. casualties. But the relative unimportance of Vietnam in the larger U.S. victory in the Cold War currently underscores the folly of having fought so costly and unproductive a conflict.

A similar judgment can be made about Nixon's four-year withdrawal policy from Vietnam.[3] From the start of his presidency in 1969, Nixon assumed that his reelection in 1972 depended in large part on ending American involvement in the war. He wished to ensure, however, that a withdrawal would not appear to be a defeat, which he believed would undermine U.S. credibility with adversaries and allies and jeopardize his reelection. In December 1970, when H. R. "Bob" Haldeman, Nixon's chief of staff, told Henry Kissinger that the president wanted to end U.S. involvement in Vietnam by the end of 1971, Kissinger warned that if an American withdrawal were followed in 1972 by Saigon's instability, it could have an adverse effect on the president's reelection. To fend off this possibility, Nixon delayed a full withdrawal until after the 1972 elections.

It was an act of political cynicism. Nixon and Kissinger had little hope that Saigon would be able to defend itself against the North whether they ended the war in 1971 or two years later, as was the case. Nothing was gained by waiting those two additional years, and much was lost. The U.S. paid a heavy price in thousands of additional lives and further damage to America's international credibility. The longer the United States stayed in the war, the greater the doubts about Nixon's honest commitment to a structure of peace that he described as his principal foreign policy goal. If Vietnamization or some other policy had been successful in staving off Saigon's defeat, Nixon could have made the case for the additional sacrifices. But South Vietnam's collapse in April 1975 gave the lie to Nixon's assertions in 1973 that his administration had achieved "peace with honor."

My point here is that what we see as ethical in foreign policy is often tied to a pragmatic judgment on whether it worked. A failed policy that cost lives and money and retrospectively was unnecessary in ensuring the country's long-term security, such as Vietnam, seems deserving of condemnation. This application of an ethical judgment decades after disputed events occurred may seem arbitrary. But as policy makers locked in disputes famously remark, "history will judge." And it does.

It is one thing to have misled the country in support of a mistaken war that cost many lives with little national gain; it is another thing entirely to deceive the country about a policy that provoked much contemporary opposition but ultimately benefited the nation. It is instructive in this regard to set the Johnson and Nixon examples alongside earlier foreign policy deceptions by FDR. Roosevelt's understanding of the country's national security needs differed from that of most Americans during World War II. His awareness of that difference led him to mislead the public about his postwar views of international affairs.

Despite U.S. involvement in two global conflicts in 25 years, a majority of Americans were not ready to embrace great power politics as the surest way to avert future wars. Instead, they hoped that the century's second global conflict would convince governments and peoples everywhere to abandon old-fashioned habits of selfish nationalism for collective security through a United Nations organization. The naïve illusion during the war was that nations would follow America's lead in adopting democracy at home and the rule of law abroad. The clearest statement of this desire was expressed in the 1943 best-selling book One World, by 1940 Republican presidential nominee Wendell Willkie. The greatest nonfiction best seller to that point in U.S. history, One World described a trend everywhere, especially in Russia and China, toward the Americanization of economic and political life. The message of Willkie's book was that inside every foreigner was an American waiting to emerge.

Franklin Roosevelt understood that if he was going to lead Americans through the war prepared to embrace internationalism, he needed to convince them that they would be at the forefront of a mass movement toward Wilsonian universalism. He consistently encouraged the belief that Soviet Russia and China were becoming democratic nations, eager to cooperate with the United States in working through a United Nations to ensure the future peace. When he returned from the Yalta Conference in February 1945, for example, FDR famously described the meeting as signaling the end of spheres of influence and balance of power arrangements that had failed to avert war in the past.

Roosevelt's public pronouncements catered to current popular sentiment and, to some extent, reflected his own idealism, but they were not the full measure of his outlook. He had limited hopes that the world was undergoing a shift away from power politics to universalism, and that friendly dealings would necessarily mark Soviet–American relations rather than rivalry and an arms race. The fuller expression of Roosevelt's thinking was revealed in the agreement that he reached with Churchill in 1944 to hold back the secret of the atomic bomb from Stalin. Roosevelt was uncertain about future friendship with so different a country and government as Soviet Russia. He saw the atomic bomb as a means to check post-1945 Soviet aggression if, as he believed possible, the United States and the Soviet Union came into conflict in the postwar world.

Roosevelt saw his deception as essential to bring the country through the war prepared to participate in international affairs. He correctly anticipated that should it become clear after 1945 that Moscow was more an adversary than an ally, Americans would not retreat into isolationism but rather take up their new role as the leading defender of free nations. His prescience makes his deception less an ethical breach than an act of realistic statesmanship. Like Thomas Jefferson, who believed the Louisiana Purchase unconstitutional but justified by what he called "the laws of necessity," or "of self-preservation," Roosevelt considered his idealized picture of postwar relations a small price to pay for persuading Americans to stand against future acts of aggression by tyrants who would jeopardize American national security. What FDR did was an example of what Machiavelli called "wise discretion." Unfortunately, "unwise discretion" that does not meet the test of "self-preservation" has more often been the case.

Roosevelt, Johnson, and Nixon all engaged in similar deceptions about foreign policy, but in Roosevelt's case, the outcomes were markedly different. And outcomes play a decisive role in the ethical judgments of history. This is something we have to live with in a dangerous world—presidents who sometimes act in secret for what they see as the nation's good. We judge presidents based on whether their hidden foreign policy actions succeeded or failed and on whether their goals were justified. These are moral judgments. LBJ and George W. Bush will justifiably receive poor marks from historians for having acted unwisely in what they mistakenly saw as serving the national interest in Vietnam and Iraq. FDR, by contrast, misled the public much as Johnson and Bush did. Yet he is generally, and justifiably, regarded as one of our greatest presidents—one who advanced the nation's well-being and preserved its fundamental values.

When thinking about presidential leadership in the twenty-first century, we would do well to consider how we can inhibit these unwise acts of the imperial presidency. The debate over the Iraq War and the Bush administration's use of presidential power to combat terrorist threats—torture, rendition, abuses of civil liberties through surveillance of suspected terrorists without regard for the Foreign Intelligence Surveillance Act, and signing statements that Bush used to ignore parts of laws he saw as conflicting with his duties as commander in chief—raised questions about "wise discretion." The legacy of these policies has loomed large over the new Obama administration during its first year in office.

Will George W. Bush be seen as a president who bent the rules in the necessary service of national defense, or as one who overreacted and abused the powers of his office? The argument asserting "abuse" seems compelling, but during Bush's two terms in office, the evidence was less than sufficient to make an effective case for the president's impeachment. Is there no way, then, to rid ourselves of a chief executive about whom substantial questions of ethical failings and incompetence have been raised if the charges fall short of what sitting members of Congress see as high crimes and misdemeanors?

Because the Constitution's impeachment apparatus is so difficult to use against a president—it has been invoked only three times in the republic's 218-year history, and only twice against an elected president—and because the Twenty-fifth Amendment has so far proved no deterrent to presidential missteps, we would do well to find some other way to oust an incumbent whose policies and ethical behavior have antagonized a majority of the people.

Recent experience suggests the need for a constitutional recall amendment. It should set a high bar for removal and include a process that would be the greatest possible expression of the popular will. Sixty percent of the House and Senate, or possibly two-thirds of both bodies, should be able to vote for a national referendum open to all citizens eligible to vote in state elections. The ballot would ask voters to say yes or no to removing from office the president and the vice president, who is almost always the president's alter ego. Should a majority vote to recall both incumbents, the Speaker of the House would succeed to the presidency, and the new president would select a vice president who, as mandated by the Twenty-fifth Amendment, would need confirmation by congressional majorities.

There seems little danger that the recall provision would be abused. Only two governors have been recalled in the last century, most recently Gray Davis in California, where his elected successor, Arnold Schwarzenegger, has given the recall a good name.

Even if the recall were little used, its availability could help pressure a president into avoiding actions within his administration that could turn the country against him. Would Presidents Ulysses S. Grant and Warren G. Harding have been so casual about wrongdoing in their governments if they had known that a remedy other than impeachment existed for ousting them? The country seems certain to face future struggles over questions of ethical presidential behavior. And while history will judge the persuasiveness of such complaints, it seems well to have a contemporary tool to help keep our all too flawed presidents and their administrations on the straight and narrow.

On December 31, 2007, the New York Times expressed hope that American voters in the coming year would "have the wisdom to grant the awesome powers of the presidency to someone who has the integrity, principle and decency to use them honorably." Abuses such as those committed by the Bush administration, the Times asserted, revealed contempt for "the Constitution, the rule of law and human decency." To many observers, the succession of a former professor of constitutional law to the presidency seemed an ideal antidote to these abuses of executive power. Less than a year into the presidency of Barack Obama, the early returns suggest that the Bush legacy in this area is easier to criticize as a candidate than to undo from the White House. Indeed, the Obama administration has shown an institutional desire to hold onto at least some of the expanded powers asserted by its predecessor.

Is an ethical presidency, then, possible in the twenty-first century? Despite past missteps, I would like to think so. True, the recent Bush administration did not inspire a lot of hope, and so far, the Obama administration has not fulfilled the hope that candidate Obama inspired in many. But Bush's secretiveness, which did more to undermine than to serve his presidency, may well encourage future chief executives to be more transparent with the public and more cautious about abusing their powers. Moreover, the rise of a more aggressive media, which nowadays doggedly tracks presidential misdeeds, may also make presidents more cautious about deceiving the public.

This essay has focused especially on two salient types of presidential secrecy and deception, the first concerning the health problems of a president or candidate, and the second concerning foreign policy decisions. I have argued that these two types of behavior are ethically distinct from one another, although the ethics of health information is clearly intertwined with the conduct of foreign policy.

Presidents and presidential candidates are not, in my analysis, entitled to hide their personal health problems. A crucial reason is that U.S. presidents have their finger on the nuclear trigger; both their physical and mental stability are always of vital moment. To help ensure ethical conduct in this area, a special medical panel should be empowered to monitor health information about any sitting president and all major candidates. Is there any situation in which a president is justified in hiding a medical problem that would not affect his performance but might deny him the office? I think not. In my view, a president or candidate who squares with the public on health issues will engender greater trust from the voters and actually raise his chances of being elected or reelected rather than turning voters away. Any run for the presidency is something of a roll of the dice. The question of health problems is simply another part of the process that makes up our democratic system.

The ethics of secrecy and deception concerning foreign policy is more complex, involving shades of gray. There may be instances in which the country is better served by a president dealing with a foreign policy issue out of public view. This is not to suggest a blanket endorsement for secrecy. Presidential deceptions, such as with Vietnam and Iraq, have caused us a lot of grief. The commitment of the nation's blood and treasure to policies abroad should always rest on a bedrock of democratic support.

To be sure, as the FDR example demonstrates, the conduct of foreign affairs does not have to be entirely above board to give it long-term ethical standing. Nevertheless, we can hope that a country that prides itself, as Abraham Lincoln said, on being the last best hope of earth would set a standard by its highest elected official that Americans could take pride in and peoples everywhere could wish to emulate. Ω

References

American Presidency Project. n.d. "1888 Presidential Election." Available online [accessed November 12, 2009].

Bradley, Omar. 1948. "Armistice Day Speech." In The Collected Writings of General Omar N. Bradley, vol. 1. Washington, DC, 1967.

Cannon, Lou. 1991. President Reagan: The Role of a Lifetime. New York: Simon & Schuster.

Dallek, Robert. 1970. The Roosevelt Diplomacy and World War II. New York: Holt, Rinehart and Winston.

Dallek, Robert. 1979. Franklin D. Roosevelt and American Foreign Policy, 1932-1945. New York: Oxford University Press.

Dallek, Robert. 1998. Flawed Giant: Lyndon Johnson and His Times, 1961-1973. New York: Oxford University Press.

Dallek, Robert. 2003. An Unfinished Life: John F. Kennedy, 1917-1963. Boston: Little, Brown.

Dallek, Robert. 2007. Nixon and Kissinger: Partners in Power. New York: HarperCollins.

James, Henry, ed. 1920. The Letters of William James. Boston: Atlantic Monthly Press.

Levinson, Sanford. 2006. Our Undemocratic Constitution: Where the Constitution Goes Wrong (And How We the People Can Correct It). New York: Oxford University Press.

Link, Arthur S. 2002. "Woodrow Wilson." In The Presidents: A Reference History, ed. Henry F. Graff. New York: Gale Group, 365-88.

Morris, Edmund. 1999. Dutch: A Memoir of Ronald Reagan. New York: Random House.

Nixon, Richard M. 1977. "Interview with David Frost." New York Times, May 20, p. A1.

Orwell, George. 1946. "Politics and the English Language." Horizon, no. 76 (April).

Roosevelt, Theodore. 1918. Op-ed. Kansas City Star, May 7.

Solzhenitsyn, Alexander. 1974. Quoted in The Observer, December 29.

Endnotes

[1] The following discussion draws heavily on Dallek (2007) supra.
[2] This discussion draws heavily on Dallek (1998) supra.
[3] The following discussion draws heavily on Dallek (2007) supra.

[Robert Dallek is the author of An Unfinished Life: John F. Kennedy, 1917-1963 and Nixon and Kissinger: Partners in Power. He is an elected fellow of the American Academy of Arts and Sciences and of the Society of American Historians, which he served as president in 2004-2005. Dallek attended the University of Illinois, graduating with a B.A. in history in June 1955. He then spent several years at Columbia University, earning an M.A. in February 1957, and a Ph.D. in June 1964. He retired as a professor of history at Boston University and previously taught at Columbia University, UCLA, and Oxford. He was a co-winner of the Bancroft Prize in 1980 for Franklin D. Roosevelt and American Foreign Policy, 1932–1945 (1979).]

Copyright © 2010 Presidential Studies Quarterly

Copyright © 2010 Sapper's (Fair & Balanced) Rants & Raves