OK, OK. Politics makes strange bedfellows. Molly Ivins one minute, a National Review editor the next. In the interest of fairness & balance (the watchwords of this blog), I will grant Garfinkle several of his points. We in the United States have a parochial view of the world. We in the United States do not deal well with complicated issues. What I like about the Garfinkle essay is his willingness to flay the Right and the Left equally. I disliked his defense of W, particularly this sentence: “An administration has a right to its own rhetoric, and with it a right to put some distance between that rhetoric and its actual conduct.” W’s rhetoric is babble. I do not think for a MINUTE that W differentiates between his rhetoric (“Bring ’em on!”) and his actual conduct (as long as it’s someone else taking ’em on). However, the point is well-taken. W and the rest of us don’t have a clue as to what is good policy v. bad policy. W will pay the price in the history books later this century. If this be (fair & balanced) sedition, make the most of it.
Foreign Policy Immaculately Conceived
By Adam Garfinkle
For most normal people most of the time, thinking about U.S. foreign policy is an Andy Warhol sort of experience, which is to say that for about 15 minutes every few months (or years, for some), a foreign policy matter becomes “famous” in their consciousness. When a talented but untutored journalistic mind focuses on a foreign policy issue, particularly one that editors will pay to have written about, an amazing thing sometimes happens: All of a sudden, crystalline truth rises from the clear flame of an obvious logic that, for some unexplained reason, all of the experts and practitioners thinking and working on the problem for years never saw. This is the immaculate conception theory of U.S. foreign policy at work.
The immaculate conception theory of U.S. foreign policy operates from three central premises. The first is that foreign policy decisions always involve one and only one major interest or principle at a time. The second is that it is always possible to know the direct and peripheral impact of crisis-driven decisions several months or years into the future. The third is that U.S. foreign policy decisions are always taken with all principals in agreement and are implemented down the line as those principals intend — in short, they are logically coherent.
Put this way, of course, no sentient adult would defend such a theory. Even those who have never read Isaiah Berlin intuit from their own experiences that tradeoffs among incommensurable interests or principles are inevitable. They recognize that the urgent and the imminent generally push out the important and the eventual in high-level decision making. They know that disagreement and dissension often affect how public policy is made and applied. More than that, any sober soul is capable of applying this elemental understanding to particular cases if he really puts his mind to it.
Oh, really? Not so for some, apparently, when grinding axes against a deadline. Prominent examples from the recent history of U.S. policy in Southwest Asia — examples that still bear on current matters, as it happens — show that the immaculate conception theory is alive and all too well. They also show, mindless aphorism notwithstanding, that hindsight is not necessarily 20/20.
How many times have we heard the clarion claim that the covert U.S. effort to aid the Afghan mujahedeen through the Pakistani regime during the 1980s was, in the end, a terrible mistake because it led first to a cruel Afghan civil war and then to the rise of the Taliban? I have lost count.
This argument is about as cogent as saying to a 79-year-old man — Ralph, let’s call him — that he should never have gotten married because one of his grandsons has turned out to be a schmuck. But a person does not consider marriage with the character of one of several theoretical grandchildren foremost in mind. It was not possible at the time of the nuptials for Ralph to have foreseen the personality quirks of a ne’er-do-well son-in-law not yet born; so, lo and behold, the fine upbringing that he bequeathed to his children somehow got mangled in translation to the next generation. These things happen.
Similarly, in 1980, when the initial decision was made (in the Carter administration, by the way) to establish links with the mujahedeen, the preeminent concern of American decision makers was not the future of Afghanistan, but the future of the Soviet Union and its position in Southwest Asia. Whatever the Politburo intended at the time, the consolidation of Soviet control in Afghanistan would have given future Soviet leaders options they would not otherwise have had. In light of the strategic realities of the day, the American concern was entirely reasonable: Any group of U.S. decision makers would have thought and done more or less the same thing, even if they could have foreseen the risks to which they might expose the country on other scores.
But, of course, such foresight was impossible. Who in 1980 or 1982 or 1985 could have foreseen the confluence of events that would bring al Qaeda into being, with a haven in Afghanistan? The Saudi policies that led to bin Laden’s exile and the Kuwait crisis that led to the placement of U.S. forces on Saudi soil had not yet happened — and neither could have been reasonably anticipated. The civil strife that followed the exit of the Red Army from Afghanistan, and which established the preconditions for the rise of the Taliban government, had not yet happened either. Of course, despite the policy’s overall success in undermining the Soviet position in Afghanistan, entrusting Pakistan’s Inter-Services Intelligence Directorate with the management of aid to the mujahedeen turned out to be problematic, but who of the immaculate conception set knows whether there were better alternatives available at the time? There weren’t; a tradeoff was involved, and it was a tradeoff known to carry certain risks.
True, the United States walked away too soon from Afghanistan after the Red Army departed in 1989, but the Berlin Wall was falling and it seemed that more important issues were at hand. (And they were more important.) Besides, pace the third premise of the theory of the immaculate conception, there was disagreement among administration experts as to what would happen in Afghanistan. One prominent insider, born in Afghanistan, was confident that things would not go sour. He was mistaken, but his assessment was not unreasonable. These things happen.
Another example of immaculate conception-style analysis, also very well worn, specifically concerns the shah of Iran; but this example has a generic character known as the “friendly tyrants” problem. The particular claim is almost endlessly made that it was a terrible mistake for the CIA to have overthrown Mohammed Mossadegh in 1953 to restore the shah to his Peacock Throne, for that, it is averred, is what brought the Ayatollah Khomeini to power and sired the disaster of 1978-79 (and, one could reasonably add, the disaster of 1979 to the present). The generic “friendly tyrants” argument is now applied widely if thoughtlessly to U.S. support over the years for undemocratic regimes in Egypt, Jordan, Saudi Arabia, Morocco, Pakistan, and other Muslim countries. The argument, such as it is, goes like this: The people in these countries hate the United States because the U.S. government is complicit in their being repressed by ugly and incompetent regimes. Just as in Iran, they predict, we’ll be sorry one day — “after the revolution comes” — for ever having helped the “bad guys.”
Now, the fall of the shah and the rise of the Islamic Republic is a complex case; one could write an entire book about it. Indeed, some have done just that — and all of the serious books on the subject show that the immaculate conception theory has it wrong. American interests in Iran in the early 1950s (the broader Western interest, too; the British had as much to do with the fall of Mossadegh as did the United States) had to do with Cold War geopolitics. Mossadegh was anti-Western by rhetoric and policy disposition. When he came to power in 1951, the Truman administration worried, particularly in light of Soviet behavior in northern Iran after World War II, that a populist regime of that sort would end up being allied with or suborned by the Soviet Union. The coming of the Eisenhower administration did not allay U.S. fears as Mossadegh’s policies became ever more worrisome, and so a plan devised mostly before Eisenhower took office went successfully forward with the result that Iran remained a bulwark of Western defenses in the Middle East and Southwest Asia for the critical next quarter-century.
It is easy now to dismiss this benefit as “mere,” but it was not mere at the time. The pro-Western orientation of Iran from a time before Suez until the Camp David Accords was of enormous value to U.S. and Western statecraft. Not only did it make military planning for Southwest Asia a far less onerous task for the United States than it would have been if Iran had been a Soviet client, but it helped balance regional geopolitics for important American allies, notably Turkey and Israel. The shah’s relatively moderate hand in the development of OPEC, at least until 1974, was also of immeasurable benefit to the postwar international economy. Playing back history in the counterfactual tense is a frustrating and often futile exercise, but as thought exercises go, trying to imagine how constrained U.S. policies might have been with Iran on the wrong side of the Cold War is sobering. Whether one takes the Suez crisis (which was botched enough from the American side as it was), the twin Middle East crises of 1958 over revolution in Iraq and near-civil war in Lebanon, or the 1967 and 1973 Arab-Israeli wars, U.S. options, military and diplomatic, would have been much impoverished. Indeed, the whole face of U.S. policy in the region, which was overall a very successful one, would have looked very different and likely much more dour.
Key to the immaculate conception case with regard to Iran is the rather flat portrait presented of the shah’s moral and political debility. The United States cannot be guilty by association with tyranny in the eyes of the tyrannized unless the protected ruler is indeed vile. But the shah was not vile, and he was not unpopular for most of his tenure. His repressive tendencies came fairly late, after he had lost several trusted and wise advisors and after he became ill with the cancer that eventually killed him. For most of his rule he tried to emulate his father, Reza Shah, as a modernizer — and with the White Revolution he made much progress in that regard. He dispossessed the clerical estates and tamed the power of the landed aristocracy. As serious students of modernization know (alas, that leaves out almost all write-from-the-hip journalists), land reform is absolutely essential to economic and eventually political modernization; by almost any standard the shah’s efforts in this regard were impressive. Iranian modernization as directed from the Peacock Throne probably went farther and was more sustainable than any that Mossadegh and his disputatious colleagues and successors could have achieved. Indeed, had Iran come under a form of even limited neo-imperial Soviet influence like that from which Egypt, Algeria, Iraq, Syria, and other countries have suffered so much, its “reforms” might have actually been retrogressive.
More than that, though the immaculate conceptionists tend not to know it, the shah granted the vote to women in 1964. It was this act that first galvanized clerical opposition to the regime and was the catalyst for the first occasion upon which Ruhollah Khomeini went out and got himself arrested. We know how the story turned sad in 1978, but the success of the shah’s reforms went so deep in Iranian society that the rule of the Islamic Republic will, in the end, not stick. Perhaps the best illustration of this is that the mullahs have not dared suggest that the vote be taken away from women, though this is precisely what their theology would mandate. The clerical regime’s reticence on this score defines a significant limit, a social red line, that leaves open a dynamic in which the empowerment of women may well drive Iranian society toward pluralism, the flowering of liberal constitutionalism, and eventually democracy.
Even that is not quite all. Immaculate conception theorists hold that once the shah was restored, his repressive misrule made the Ayatollah Khomeini inevitable. Not only is the shah’s repression distorted and exaggerated in their telling; it was also the bungling of the Carter administration that allowed the clerics to seize power. Illustrating the difference between an ignoramus and a fool, some of that administration’s cabinet members not merely believed — they actually said publicly — that Ayatollah Khomeini was a “saint” who would soon retire from politics. Worse, the administration actively dissuaded the Iranian military, via the infamous Huyser mission among other modalities, from preventing the mullahs from taking power. Supporting the shah was good policy. Failure to adjust when the shah’s touch slipped was unfortunate but not fatal. The mismanagement of the endgame was disastrous, but it was also entirely avoidable.
This is not the place to rehearse the larger “friendly tyrants” debate; suffice it to say that since countries such as Egypt and Saudi Arabia were undemocratic long before the United States ever began to support their governments, the argument that the United States is somehow responsible for their being undemocratic is a little hard to follow. However they came to be undemocratic, U.S. support does implicate us in their misanthropies — true enough. But once again, it is a mistake to think that one and only one set of interests is at play at one time. The proper question to ask is: What have been the interests and principles — plural — at issue, and what have been the available alternative policy choices to deal with them, particularly when a given action may advance one interest at the cost of retarding the achievement of another? Was the United States ever in a position merely to wave its hand and bring democracy to Egypt or Saudi Arabia? Would it have been responsible to try to do so in light of the other strategic interests we held in common with these countries — the utility of their anti-Soviet postures, their role in preserving the stability of the Egyptian-Israeli peace treaty, their contribution to moderating the price of oil and hence aiding the health of the international economy, and others besides? Not only were these no small matters, neither were they bereft of moral implications. It is not morality but moral posturing to wear human rights concerns on one’s sleeve, indicating as it usually does one’s favoritism for intentions over consequences, while simultaneously presuming that concern for the structural elements of international peace and prosperity is the domain of the cold, gray overseers of corporate and national equities. This verges on ethical illiteracy.
In any event, the answer to both aforementioned questions is plainly “no,” and any group of responsible American decision makers, sitting in the seats and seeing the world as those decision makers must have sat and seen, would have reached the same conclusions. Could we have attained a better balance between our strategic interests and our democratic principles in relations with such countries? No doubt we could have and should have. It is true, too, that a certain condescension toward Middle Easterners and their social and political proclivities was in quiet evidence. (President Reagan broke through this condescension once, in spectacular if limited fashion, when during the 1985 Achille Lauro tragedy he contended against Egyptian pleas that Americans “understand” the Arab cultural definition of a “white lie” with the assertion that Egyptians were perfectly capable of understanding the American view as well.) The point, of course, is this: In the pinch during the Cold War, when major decisions had to be made, U.S. decision makers (even in the Carter administration) never allowed democratic reform and human rights concerns to trump all other interests. This was particularly so in countries where democracy was not ingrained in local political culture and where U.S. efforts were therefore unlikely to bear fruit. Of course this was the right approach to take, for the Cold War was the preeminent moral stake at play in the world. Of what value would an accumulation of moralist gesticulations have been against the survival of Soviet power?
But that was then, and this is now. Cold War habits have died hard in some places; in more than a few, unfortunately, they are breathing still. What Pakistan represented to the United States during the Cold War, when India was a Soviet ally and China was a tacit ally of the United States also allied with Pakistan, was one thing. What Pakistan represented after 1991 was something else, but the U.S. policy establishment was slow to mark and act upon the change. That establishment has been even slower to reckon the meaning of the end of the Cold War for U.S. interests in Korea and Northeast Asia generally. It made sense to risk war and to pay other costs in Korea between 1953 and 1991, when that peninsula was tied to a larger stake; if it has made sense since 1991, the logic has not been demonstrated, to say the least. Should we have adjusted policy toward Arab and other “friendly tyrants” after the end of the Cold War, when the balance of our interests changed? Absolutely. Since September 11, 2001, this conclusion has finally begun to sink in, as has a sense of regret that the U.S. government did not reach it sooner. Mistakes have indeed been made, but not the ones the immaculate conceptionists cite.
These examples of the immaculate conception theory of U.S. foreign policy tend to come from the political left, where the attack on realism’s supposed interest-based bias in favor of a values-based one has long succumbed to the “mass” production techniques of modern journalism. (Just who really acts in the service of values is not all that clear, as suggested above, but never mind.) The theory of the immaculate conception is not limited to the left, however, as the mother of all examples — the end of the Gulf War in late winter 1991 — demonstrates.
It has become axiomatic in many right-of-center circles that the decision not to march to Baghdad and bring down the Baath Party in February 1991 was a terrible mistake. Some in the George H.W. Bush administration believed that at the time, and President Bush himself did subsequently acknowledge some misjudgments — though not the decision to abjure an occupation of Baghdad. Rather, the former President Bush admitted that the exact terms of the ceasefire and the failure of the United States to assist the Kurdish and Shia uprisings were connected mistakes. He has it exactly right.
There were good and not-so-good reasons for the decision not to march to Baghdad. Judging from what was known then — not three months or two years or ten years afterwards — it is both possible and still important to ponder them. Indeed, given America’s current undertaking in Iraq, it is, or ought to be, irresistible.
A reason given at the time to stay away from Baghdad was that the United States had U.N. Security Council authorization only to liberate Kuwait, not to invade Iraq. This was true, technically, and the United States does need to be mindful of how its actions affect the systemic contours of international law, an institution that benefits us more than it benefits most others. That said, this was not a strong argument. Had the administration viewed other considerations differently, the U.N. barrier alone would not have stopped it.
Among those other considerations was the concern that the United States could not count on a continuation of very low casualties among coalition forces once the battle was moved to Iraqi soil. That concern was predicated on the fear that the battle would not remain conventionally fought if the regime in Baghdad concluded that its days were literally numbered. The worry was that higher casualties would undermine the coalition and reduce public support for the war. We do not know whether the Hammurabi or the Medina divisions of the Iraqi Republican Guard would have fought well on their own soil in 1991. Nor is it certain that the Iraqis would have used chemical or biological weapons against U.S. forces. But the possibilities were not so far-fetched that responsible American decision makers would have failed to take them seriously.
A third argument, now all but forgotten, was that Iraq was a Soviet client, and to occupy Iraq would be to humiliate the Soviet Union, with whom we hoped to create a new world order. This, of course, was an argument that some people are glad is now forgotten, since it reveals the fact that some key American decision makers suffered a massive failure of imagination. They could not conceive of a world without the Soviet Union, which lasted only another 10 months. At the time, however, given the premise of a surviving USSR, this too was not an unreasonable consideration.
Yet another argument was that bringing down the Baath would splinter Iraq and, via its Shia population, provide a major advantage to Iran at Saudi expense. People disagreed then, and still disagree, about the plausibility of this scenario. No serious regional expert has ever credited it; all agree that Arab and Persian blood is thicker than sectarian Islamic water and that Iraqi Shia are not unionist-minded with Iran. The administration at the time took this danger seriously, however, not least because its policy was ultimately Saudi-centered, not Kuwaiti-centered — and the Saudis were adamant about this danger.
But why? Do the Saudis understand their own neighborhood less well than Western experts? Of course not. The Saudis were not worried by Iranian irredentism per se, but as good Muslims they follow Abu Bakr’s admonition to know one’s genealogies, not to be as the ignorant peasants who say they come from this or that place. It was not, and is not, borders that worry the Saudis, but sectarian loyalties. They very much prefer a Sunni-dominated authoritarian Iraq to a looser, more populist Shia Iraq. They know that Shia are the majority in Iraq, and they have watched in recent years as a similarly despised and downtrodden Shia Arab minority in Lebanon has risen to significant political status. They do not want to see such a thing repeated on their northern border, not just because the religion of the Wahhabis abhors Shia saint cults and theological apostasy, but because the majority population in Al-Hasa province, where most of Saudi Arabia’s oil lies, is probably Shia. They fear sectarian contagion from a Shia-dominated Iraq, not a split, territorially fractured Iraq. But they fed American leaders a line in 1991, and those leaders appear to have swallowed it whole.
Finally, some argued that an American-led occupation of an Arab capital would exhume all of the gruesome historical demons of the previous three centuries of Muslim-Western conflict from their netherworld haunts, making the United States the focus of enormous resentment and hatred throughout the Arab and Muslim worlds. We might think of ourselves as liberators, and the Iraqis might have welcomed us the morning after. But then how to set up a new regime and take our leave without undermining that regime’s nationalist credentials? (Hint: This is difficult.) How to create a government decentralized enough to accommodate Iraq’s ethnic and sectarian heterogeneity but not so decentralized as to tempt Iraq’s neighbors to turn the country into a souk for espionage, smuggling, and general mayhem? (Hint: This is even more difficult.) How to build a defense force after the fact that is capable of defending Iraq from large neighbors like Iran but not at the same time “too” large juxtaposed against smaller neighbors like Kuwait? (Hint: This is impossible.) So getting to Baghdad was never the problem; getting out of it without creating more trouble than we would have resolved was the hitch.
Alas, as should be all too clear, this is not a mere historical footnote. It is a problem still resonating a dozen years later, and none of the problems contemplated in 1991 have gotten easier. This, ultimately, is the best reason for not having marched to Baghdad, and that reason, conjoined to the belief of virtually every expert that Saddam would not survive six months after such a humiliating trouncing, explains why we did not then go there. It was a reasonable prediction, but it was wrong. These things happen.
It would not have been wrong, however, had the two mistakes to which President George H.W. Bush has pointed not been made. Had U.S. civilian authorities not ceded their decision-making power to General Norman Schwarzkopf, who at the ceasefire talks let the Iraqis keep flying their armed helicopters (the helicopters later used to crush the uprisings), and had the United States simultaneously supported the Kurdish and Shia rebellions, Saddam would not have survived in power. So yes, mistakes were made; but, again, not the ones most often raised by the theorists of the immaculate conception persuasion.
Now, why were those mistakes made? One reason was disagreement near the top, and President Bush père tried to have it both ways. He wanted Saddam gone, but he did not want to pay a price in American blood and entanglement if he could help it. He wanted Saddam gone, but not at the cost of provoking a crisis with Saudi Arabia or aiding the mullahs of Tehran. He split the differences among his advisors and hoped for the best, and this was by no means unreasonable. It happened not to have been successful, but in whose life does every wager pay off?
Had the president ordered the occupation of Baghdad in 1991, we would not have had to put up with Saddam for the past dozen years, and that would have been more than a small mercy. But for all anyone knows, American troops would have been there all along, for a dozen years, and who knows the larger consequences of that in the Arab and Muslim worlds — and of how those consequences might have redounded back on us? Who knows for certain that other, even more dangerous consequences than our having to live with Saddam these past 12 years would not have been set in motion? No one can possibly know this, which is why the popular condemnation of the Gulf War’s endgame often sounds far too cocksure against the available, but inevitably incomplete, evidence. It is perfectly true, as Paul Wolfowitz wrote back in 1994, that “by and large, wars are not constructive acts: they are better judged by what they prevent than by what they accomplish. But what a war has prevented is impossible ever to know with certainty, and for many observers is never a subject of serious reflection.” It is also true, however, that what wars prevent may sometimes end up benign while what they accomplish can evoke justifiable regret. It is, unfortunately, not an unreasonable fear that Gulf War II will illustrate that very point to our considerable disappointment.
American presidents, who have to make the truly big decisions of U.S. foreign policy, must come to a judgment with incomplete information, often under stress and merciless time constraints, and frequently with their closest advisors painting one another in shades of disagreement. The choices are never between obviously good and obviously bad, but between greater and lesser sets of risks, greater and lesser prospects of danger. Banal as it sounds, we do well to remind ourselves from time to time that things really are not so simple, even when one’s basic principles are clear and correct. When President George W. Bush strove, from September 12, 2001, onward, to make the moral and strategic stakes of the war on terrorism clear, he was immediately enshrouded by an inescapable fog of irrepressible fact: namely, that our two most critical tactical allies in the war on terrorism, Pakistan and Saudi Arabia, were the two governments whose policies had led most directly to 9/11. If that was not enough ambiguity with which to start the war on terrorism, the various sideswipes of the Israeli-Palestinian conflict soon provided more.
Does this mean that George Bush, with his Bush Doctrine, is now evincing a form of the immaculate conception theory of U.S. foreign policy? Is the administration simplifying to our peril that which cannot, or at any rate ought not, be simplified? Not necessarily. An administration has a right to its own rhetoric, and with it a right to put some distance between that rhetoric and its actual conduct. Just as it is true, as Henry Kissinger has said, that covert action should not be confused with charity work, so the art of public diplomacy is not and should not be tantamount to telling the truth, the whole truth, and nothing but the truth.
But the president does invite harm if he thinks himself free from having to make tradeoffs; if he thinks, for example, that by some sort of ex cathedra definition there can be no long-term political downside to a protracted U.S. occupation of Iraq. We do take risks that imperfect policy in the here and now will give rise to unknowable dangers in the there and then. As bad as Saddam and the Baath have been, they have not been Islamist in orientation. If we are not prepared to sit in Baghdad for half a century, who can guarantee that such a regime will not in due course follow the war? And if we do sit in Baghdad for a long time, who can guarantee that our doing so will not engender such tendencies in states nearby? And the president courts trouble if he somehow loses sight of the need to wend his way among advisors who do not always agree and underlings who do not always behave. Splitting the difference among differently minded advisors works, or at least doesn’t obviously fail, when incremental policy fits the task. When boldness is required, such splitting is liable to give rise to half-measures (and mis-measures) that only make things worse.
The president will also find no escape, even long after he leaves the White House, from the accusations of the immaculate conception school, whose students will not cease to pronounce the judgment of the sophomoric from now until thistles lose their barbs. One can only imagine what simplicities they will fabricate from the detritus of this war. Of this irritation, what can one say? These things happen.
Adam Garfinkle is a Senior Fellow at the Foreign Policy Research Institute.
Copyright © 2003 Policy Review