Wednesday, August 31, 2016

In Today's Essay, Andrew Bacevich Reconfigures The Past & It's Worth Your Time

Andrew Bacevich is a rare combination of scholar and warrior who writes with perception and cool detachment about our current military, diplomatic, and political disasters. If this is a (fair & balanced) illustration of true patriotism, so be it.

[x Harper's]
American Imperium
By Andrew J. Bacevich


Republicans and Democrats disagree today on many issues, but they are united in their resolve that the United States must remain the world’s greatest military power. This bipartisan commitment to maintaining American supremacy has become a political signature of our times. In its most benign form, the consensus finds expression in extravagant and unremitting displays of affection for those who wear the uniform. Considerably less benign is a pronounced enthusiasm for putting our soldiers to work “keeping America safe.” This tendency finds the United States more or less permanently engaged in hostilities abroad, even as presidents from both parties take turns reiterating the nation’s enduring commitment to peace.

To be sure, this penchant for military activism attracts its share of critics. Yet dissent does not imply influence. The trivializing din of what passes for news drowns out the antiwar critique. One consequence of remaining perpetually at war is that the political landscape in America does not include a peace party. Nor, during presidential-election cycles, does that landscape accommodate a peace candidate of voter consequence. The campaign now in progress has proved no exception. Candidates calculate that tough talk wins votes. They are no more likely to question the fundamentals of U.S. military policy than to express skepticism about the existence of a deity. Principled opposition to war ranks as a disqualifying condition, akin to having once belonged to the Communist Party or the KKK. The American political scene allows no room for the intellectual progeny of Jane Addams, Eugene V. Debs, Dorothy Day, or Martin Luther King Jr.

So, this November, voters will choose between rival species of hawks. Each of the finalists will insist that freedom’s survival hinges on having in the Oval Office a president ready and willing to employ force, even as each will dodge any substantive assessment of what acting on that impulse has produced of late. In this sense, the outcome of the general election has already been decided. As regards so-called national security, victory is ensured. The status quo will prevail, largely unexamined and almost entirely intact.

Citizens convinced that U.S. national-security policies are generally working well can therefore rest easy. Those not sharing that view, meanwhile, might wonder how it is that military policies that are manifestly defective — the ongoing accumulation of unwon wars providing but one measure — avoid serious scrutiny, with critics of those policies consigned to the political margins.

History provides at least a partial answer to this puzzle. The constructed image of the past to which most Americans habitually subscribe prevents them from seeing other possibilities, a condition for which historians themselves bear some responsibility. Far from encouraging Americans to think otherwise, these historians have effectively collaborated with those interests that are intent on suppressing any popular inclination toward critical reflection. This tunnel vision affirms certain propositions that are dear to American hearts, preeminently the conviction that history itself has summoned the United States to create a global order based on its own self-image. The resulting metanarrative unfolds as a drama in four acts: in the first, Americans respond to but then back away from history’s charge; in the second, they indulge in an interval of adolescent folly, with dire consequences; in the third, they reach maturity and shoulder their providentially assigned responsibilities; in the fourth, after briefly straying off course, they stage an extraordinary recovery. When the final curtain in this drama falls, somewhere around 1989, the United States is the last superpower standing.

For Americans, the events that established the twentieth century as their century occurred in the military realm: two misleadingly named “world wars” separated by an “interwar period” during which the United States ostensibly took a time-out, followed by a so-called Cold War that culminated in decisive victory despite being inexplicably marred by Vietnam. To believe in the lessons of this melodrama — which warn above all against the dangers of isolationism and appeasement — is to accept that the American Century should last in perpetuity. Among Washington insiders, this view enjoys a standing comparable to belief in the Second Coming among devout Christians.

Unfortunately, in the United States these lessons retain little relevance. Whatever the defects of current U.S. policy, isolationism and appeasement do not number among them. With its military active in more than 150 countries, the United States today finds itself, if anything, overextended. Our principal security challenges — the risks to the planet posed by climate change, the turmoil enveloping much of the Islamic world and now spilling into the West, China’s emergence as a potential rival to which Americans have mortgaged their prosperity — will not yield to any solution found in the standard Pentagon repertoire. Yet when it comes to conjuring up alternatives, the militarized history to which Americans look for instruction has little to offer.

Prospects for thinking otherwise require an altogether different historical frame. Shuffling the deck — reimagining our military past — just might produce lessons that speak more directly to our present predicament.

Consider an alternative take on the twentieth-century US military experience, with a post-9/11 codicil included for good measure. Like the established narrative, this one also consists of four episodes: a Hundred Years’ War for the Hemisphere, launched in 1898; a War for Pacific Dominion, also initiated in 1898, petering out in the 1970s but today showing signs of reviving; a War for the West, already under way when the United States entered it in 1917 and destined to continue for seven more decades; and a War for the Greater Middle East, dating from 1980 and ongoing still with no end in sight.

In contrast to the more familiar four-part narrative, these several military endeavors bear no more than an incidental relationship to one another. Even so, they resemble one another in this important sense: each found expression as an expansive yet geographically specific military enterprise destined to extend across several decades. Each involved the use (or threatened use) of violence against an identifiable adversary or set of adversaries.

Yet for historians inclined to think otherwise, the analytically pertinent question is not against whom U.S. forces fought but why. It’s what the United States was seeking to accomplish that matters most. Here, briefly, is a revised account of the wars defining the (extended) American Century, placing purpose or motive at the forefront.

In February 1898, the battleship USS Maine, at anchor in Havana Harbor, blew up and sank, killing 266 American sailors. Widely viewed at the time as an act of state-sponsored terrorism, this incident initiated what soon became a War for the Hemisphere.

Two months later, vowing to deliver Cubans from oppressive colonial rule, the United States Congress declared war on Spain. Within weeks, however, the enterprise evolved into something quite different. After ousting Cuba’s Spanish overseers, the United States disregarded the claims of nationalists calling for independence, subjected the island to several years of military rule, and then converted it into a protectorate that was allowed limited autonomy. Under the banner of anti-imperialism, a project aimed at creating an informal empire had commenced.

America’s intervention in Cuba triggered a bout of unprecedented expansionism. By the end of 1898, U.S. forces had also seized Puerto Rico, along with various properties in the Pacific. These actions lacked a coherent rationale until Theodore Roosevelt, elevated to the presidency in 1901, took it on himself to fill that void. An American-instigated faux revolution that culminated with a newly founded Republic of Panama signing over to the United States its patrimony — the route for a transisthmian canal — clarified the hierarchy of U.S. interests. Much as concern about Persian Gulf oil later induced the United States to assume responsibility for policing that region, so concern for securing the as yet unopened canal induced it to police the Caribbean.

In 1904, Roosevelt’s famous “corollary” to the Monroe Doctrine, claiming for the United States authority to exercise “international police power” in the face of “flagrant . . . wrongdoing or impotence,” provided a template for further action. Soon thereafter, US forces began to intervene at will throughout the Caribbean and Central America, typically under the guise of protecting American lives and property but in fact to position the United States as regional suzerain. Within a decade, Haiti, the Dominican Republic, and Nicaragua joined Cuba and Panama on the roster of American protectorates. Only in Mexico, too large to occupy and too much in the grip of revolutionary upheaval to tame, did US military efforts to impose order come up short.

“Yankee imperialism” incurred costs, however, not least of all by undermining America’s preferred self-image as benevolent and peace-loving, and therefore unlike any other great power in history. To reduce those costs, beginning in the 1920s successive administrations sought to lower the American military profile in the Caribbean basin. The United States was now content to allow local elites to govern so long as they respected parameters established in Washington. Here was a workable formula for exercising indirect authority, one that prioritized order over democracy, social justice, and the rule of law.

By 1933, when Franklin Roosevelt inaugurated his Good Neighbor policy with the announcement that “the definite policy of the United States from now on is one opposed to armed intervention,” the War for the Hemisphere seemed largely won. Yet neighborliness did not mean that US military forces were leaving the scene. As insurance against backsliding, Roosevelt left intact the U.S. bases in Cuba and Puerto Rico, and continued to garrison Panama.

So rather than ending, the Hundred Years’ War for the Hemisphere had merely gone on hiatus. In the 1950s, the conflict resumed and even intensified, with Washington now defining threats to its authority in ideological terms. Leftist radicals rather than feckless caudillos posed the problem. During President Dwight D. Eisenhower’s first term, a CIA-engineered coup in Guatemala tacitly revoked FDR’s nonintervention pledge and appeared to offer a novel way to enforce regional discipline without actually committing US troops. Under President John F. Kennedy, the CIA tried again, in Cuba. That was just for starters.

Between 1964 and 1994, U.S. forces intervened in the Dominican Republic, Grenada, Panama, and Haiti, in most cases for the second or third time. Nicaragua and El Salvador also received sustained American attention. In the former, Washington employed methods that were indistinguishable from terrorism to undermine a regime it viewed as illegitimate. In the latter, it supported an ugly counterinsurgency campaign to prevent leftist guerrillas from overthrowing right-wing oligarchs. Only in the mid-1990s did the Hundred Years’ War for the Hemisphere once more begin to subside. With the United States having forfeited its claim to the Panama Canal and with US–Cuban relations now normalized, it may have ended for good.

Today the United States enjoys unquestioned regional primacy, gained at a total cost of fewer than a thousand U.S. combat fatalities, even counting the luckless sailors who went down with the Maine. More difficult to say with certainty is whether a century of interventionism facilitated or complicated US efforts to assert primacy in its “own back yard.” Was coercion necessary? Or might patience have produced a similar outcome? Still, in the end, Washington got what it wanted. Given the gaping imbalance of power between the Colossus of the North and its neighbors, we may wonder whether the final outcome was ever in doubt.

During its outward thrust of 1898, the United States seized the entire Philippine archipelago, along with smaller bits of territory such as Guam, Wake, and the Hawaiian Islands. By annexing the Philippines, U.S. authorities enlisted in a high-stakes competition to determine the fate of the Western Pacific, with all parties involved viewing China as the ultimate prize. Along with traditional heavyweights such as France, Great Britain, and Russia, the ranks of the competitors included two emerging powers. One was the United States, the other imperial Japan. Within two decades, thanks in large part to the preliminary round of the War for the West, the roster had thinned considerably, putting the two recent arrivals on the path for a showdown.

The War for Pacific Dominion confronted the US military with important preliminary tasks. Obliging Filipinos to submit to a new set of colonial masters entailed years of bitter fighting. More American soldiers died pacifying the Philippines between 1899 and 1902 than were to lose their lives during the entire Hundred Years’ War for the Hemisphere. Yet even as US forces were struggling in the Philippines, orders from Washington sent them venturing more deeply into Asia. In 1900, several thousand American troops deployed to China to join a broad coalition (including Japan) assembled to put down the so-called Boxer Rebellion. Although the expedition had a nominally humanitarian purpose — Boxers were murdering Chinese Christians while laying siege to legations in Peking’s diplomatic quarter — its real aim was to preserve the privileged status accorded foreigners in China. In that regard, it succeeded, thereby giving a victory to imperialism.

Through its participation in this brief campaign, the United States signaled its own interest in China. A pair of diplomatic communiqués known as the Open Door Notes codified Washington’s position by specifying two non-negotiable demands: first, to preserve China’s territorial integrity; and second, to guarantee equal opportunity for all the foreign powers engaged in exploiting that country. Both of these demands would eventually put the United States and Japan at cross-purposes. To substantiate its claims, the United States established a modest military presence in China. At Tientsin, two days’ march from Peking, the US Army stationed an infantry regiment. The US Navy ramped up its patrols on the Yangtze River between Shanghai and Chungking — more or less the equivalent of Chinese gunboats today traversing the Mississippi River between New Orleans and Minneapolis.

US and Japanese interests in China proved to be irreconcilable. In hindsight, a violent collision between these two rising powers appears almost unavoidable. As wide as the Pacific might be, it was not wide enough to accommodate the ambitions of both countries. Although a set of arms-limiting treaties negotiated at the Washington Naval Conference of 1921–22 put a momentary brake on the rush toward war, that pause could not withstand the crisis of the Great Depression. Once Japanese forces invaded Manchuria in 1931 and established the puppet state of Manchukuo, the options available to the United States had reduced to two: either allow the Japanese a free hand in China or muster sufficient power to prevent them from having their way. By the 1930s, the War for Pacific Dominion had become a zero-sum game.

To recurring acts of Japanese aggression in China Washington responded with condemnation and, eventually, punishing economic sanctions. What the United States did not do, however, was reinforce its Pacific outposts to the point where they could withstand serious assault. Indeed, the Navy and War departments all but conceded that the Philippines, impulsively absorbed back in the heady days of 1898, were essentially indefensible.

At odds with Washington over China, Japanese leaders concluded that the survival of their empire hinged on defeating the United States in a direct military confrontation. They could see no alternative to the sword. Nor, barring an unexpected Japanese capitulation to its demands, could the United States. So the December 7, 1941, attack on Pearl Harbor came as a surprise only in the narrow sense that US commanders underestimated the prowess of Japan’s aviators.

That said, the ensuing conflict was from the outset a huge mismatch. Only in willingness to die for their country did the Japanese prove equal to the Americans. By every other measure — military-age population, raw materials, industrial capacity, access to technology — they trailed badly. Allies exacerbated the disparity, since Japan fought virtually alone. Once FDR persuaded his countrymen to go all out to win — after Pearl Harbor, not a difficult sell — the war’s eventual outcome was not in doubt. When the incineration of Hiroshima and Nagasaki ended the fighting, the issue of Pacific dominion appeared settled. Having brought their principal foe to its knees, the Americans were now in a position to reap the rewards.

In the event, things were to prove more complicated. Although the United States had thwarted Japan’s efforts to control China, developments within China itself soon dashed American expectations of enjoying an advantageous position there. The United States “lost” it to communist revolutionaries who ousted the regime that Washington had supported against the Japanese. In an instant, China went from ally to antagonist.

So US forces remained in Japan, first as occupiers and then as guarantors of Japanese security (and as a check on any Japanese temptation to rearm). That possible threats to Japan were more than theoretical became evident in the summer of 1950, when war erupted on the nearby Korean peninsula. A mere five years after the War for Pacific Dominion had seemingly ended, GIs embarked on a new round of fighting.

The experience proved an unhappy one. Egregious errors of judgment by the Americans drew China into the hostilities, making the war longer and more costly than it might otherwise have been. When the end finally came, it did so in the form of a painfully unsatisfactory draw. Yet with the defense of South Korea now added to Washington’s list of obligations, US forces stayed on there as well.

In the eyes of US policymakers, Red China now stood as America’s principal antagonist in the Asia–Pacific region. Viewing the region through red-tinted glasses, Washington saw communism everywhere on the march. So in American eyes a doomed campaign by France to retain its colonies in Indochina became part of a much larger crusade against communism on behalf of freedom. When France pulled the plug in Vietnam, in 1954, the United States effectively stepped into its role. An effort extending across several administrations to erect in Southeast Asia a bulwark of anticommunism aligned with the United States exacted a terrible toll on all parties involved and produced only one thing of value: machinations undertaken by President Richard Nixon to extricate the United States from a mess of its own making persuaded him to reclassify China not as an ideological antagonist but as a geopolitical collaborator.

As a consequence, the rationale for waging war in Vietnam in the first place — resisting the onslaught of the Red hordes — also faded. With it, so too did any further impetus for US military action in the region. The War for Pacific Dominion quieted down appreciably, though it didn’t quite end. With China now pouring its energies into internal development, Americans found plentiful opportunities to invest and indulge their insatiable appetite for consumption. True, a possible renewal of fighting in Korea remained a perpetual concern. But when your biggest worry is a small, impoverished nation-state that is unable even to feed itself, you’re doing pretty well.

As far as the Pacific is concerned, Americans may end up viewing the last two decades of the twentieth century and the first decade of the twenty-first as a sort of golden interlude. The end of that period may now be approaching. Uncertainty about China’s intentions as a bona fide superpower is spooking other nearby nations, not least of all Japan. That another round of competition for the Pacific now looms qualifies at the very least as a real possibility.

For the United States, the War for the West began in 1917, when President Woodrow Wilson persuaded Congress to enter a stalemated European conflict that had been under way since 1914. The proximate cause of the US decision to intervene was the resumption of German U-boat attacks on American shipping. To that point, US policy had been one of formal neutrality, a posture that had not prevented the United States from providing Germany’s enemies, principally Great Britain and France, with substantial assistance, both material and financial. The Germans had reason to be miffed.

For the war’s European participants, the issue at hand was as stark as it was straightforward. Through force of arms, Germany was bidding for continental primacy; through force of arms, Great Britain, France, and Russia were intent on thwarting that bid. To the extent that ideals figured among the stated war aims, they served as mere window dressing. Calculations related to Machtpolitik overrode all other considerations.

President Wilson purported to believe that America’s entry into the war, ensuring Germany’s defeat, would vanquish war itself, with the world made safe for democracy — an argument that he advanced with greater passion and eloquence than logic. Here was the cause for which Americans sent their young men to fight in Europe: the New World was going to redeem the Old.

It didn’t work out that way. The doughboys made it to the fight, but belatedly. Even with 116,000 dead, their contribution to the final outcome fell short of being decisive. When the Germans eventually quit, they appealed for a Wilsonian “peace without victory.” The Allies had other ideas. Their conception of peace was to render Germany too weak to pose any further danger. Meanwhile, Great Britain and France wasted little time claiming the spoils, most notably by carving up the Ottoman Empire and thereby laying the groundwork for what would eventually become the War for the Greater Middle East.

When Wilson’s grandiose expectations of a world transformed came to naught, Americans concluded — not without cause — that throwing in with the Allies had been a huge mistake. What observers today mischaracterize as “isolationism” was a conviction, firmly held by many Americans during the 1920s and 1930s, that the United States should never again repeat that mistake.

According to myth, that conviction itself produced an even more terrible conflagration, the European conflict of 1939–45, which occurred (at least in part) because Americans had second thoughts about their participation in the war of 1914–18 and thereby shirked their duty to intervene. Yet this is the equivalent of blaming a drunken brawl between rival street gangs on members of Alcoholics Anonymous meeting in a nearby church basement.

Although the second European war of the twentieth century differed from its predecessor in many ways, it remained at root a contest to decide the balance of power. Once again, Germany, now governed by nihilistic criminals, was making a bid for primacy. This time around, the Allies had a weaker hand, and during the war’s opening stages they played it poorly. Fortunately, Adolf Hitler came to their rescue by committing two unforced errors. Even though Joseph Stalin was earnestly seeking to avoid a military confrontation with Germany, Hitler removed that option by invading the Soviet Union in June 1941. Franklin Roosevelt had by then come to view the elimination of the Nazi menace as a necessity, but only when Hitler obligingly declared war on the United States, days after Pearl Harbor, did the American public rally behind that proposition.

In terms of the war’s actual conduct, only the United States was in a position to exercise any meaningful choice, whereas Great Britain and the Soviet Union responded to the dictates of circumstance. Exercising that choice, the Americans left the Red Army to bear the burden of fighting. In a decision that qualifies as shrewd or perfidious depending on your point of view, the United States waited until the German army was already on the ropes in the east before opening up a real second front.

The upshot was that the Americans (with Anglo-Canadian and French assistance) liberated the western half of Europe while conceding the eastern half to Soviet control. In effect, the prerogative of determining Europe’s fate thereby passed into non-European hands. Although out of courtesy US officials continued to indulge the pretense that London and Paris remained centers of global power, this was no longer actually the case. By 1945 the decisions that mattered were made in Washington and Moscow.

So rather than ending with Germany’s second defeat, the War for the West simply entered a new phase. Within months, the Grand Alliance collapsed and the prospect of renewed hostilities loomed, with the United States and the Soviet Union each determined to exclude the other from Europe. During the decades-long armed standoff that ensued, both sides engaged in bluff and bluster, accumulated vast arsenals that included tens of thousands of nuclear weapons, and mounted impressive displays of military might, all for the professed purpose of preventing a “cold” war from turning “hot.”

Germany remained a source of potential instability, because that divided country represented such a coveted (or feared) prize. Only after 1961 did a semblance of stability emerge, as the erection of the Berlin Wall reduced the urgency of the crisis by emphasizing that it was not going to end anytime soon. All parties concerned concluded that a Germany split in two was something they could live with.

By the 1960s, armed conflict (other than through gross miscalculation) appeared increasingly improbable. Each side devoted itself to consolidating its holdings while attempting to undermine the other side’s hold on its allies, puppets, satellites, and fraternal partners. For national-security elites, managing this competition held the promise of a bountiful source of permanent employment. When Mikhail Gorbachev decided, in the late 1980s, to call the whole thing off, President Ronald Reagan numbered among the few people in Washington willing to take the offer seriously. Still, in 1989 the Soviet–American rivalry ended. So, too, if less remarked on, did the larger struggle dating from 1914 within which the so-called Cold War had formed the final chapter.

In what seemed, misleadingly, to be the defining event of the age, the United States had prevailed. The West was now ours.

Among the bequests that Europeans handed off to the United States as they wearied of exercising power, none can surpass the Greater Middle East in its problematic consequences. After the European war of 1939–45, the imperial overlords of the Islamic world, above all Great Britain, retreated. In a naïve act of monumental folly, the United States filled the vacuum left by their departure.

For Americans, the War for the Greater Middle East kicked off in 1980, when President Jimmy Carter designated the Persian Gulf a vital U.S. national-security interest. The Carter Doctrine, as the president’s declaration came to be known, initiated the militarizing of America’s Middle East policy, with next to no appreciation for what might follow.

During the successive “oil shocks” of the previous decade, Americans had made clear their unwillingness to tolerate any disruption to their oil-dependent lifestyle, and, in an immediate sense, the purpose of the War for the Greater Middle East was to prevent the recurrence of such disagreeable events. Yet in its actual implementation, the ensuing military project became much more than simply a war for oil.

In the decades since Carter promulgated his eponymous doctrine, the list of countries in the Islamic world that US forces have invaded, occupied, garrisoned, bombed, or raided, or where American soldiers have killed or been killed, has grown very long indeed. Since 1980, that list has included Iraq and Afghanistan, of course, but also Iran, Lebanon, Libya, Turkey, Kuwait, Saudi Arabia, Qatar, Bahrain, the United Arab Emirates, Jordan, Bosnia, Kosovo, Yemen, Sudan, Somalia, Pakistan, and Syria. Of late, several West African nations with very large or predominantly Muslim populations have come in for attention. At times, US objectives in the region have been specific and concrete. At other times, they have been broad and preposterously gauzy. Overall, however, Washington has found reasons aplenty to keep the troops busy. They arrived variously promising to keep the peace, punish evildoers, liberate the oppressed, shield the innocent, feed the starving, avert genocide or ethnic cleansing, spread democracy, and advance the cause of women’s rights. Rarely have the results met announced expectations.

In sharp contrast with the Hundred Years’ War for the Hemisphere, US military efforts in the Greater Middle East have not contributed to regional stability. If anything, the reverse is true. Hopes of achieving primacy comparable to what the United States gained by 1945 in its War for Pacific Dominion remain unfulfilled and appear increasingly unrealistic. As for “winning,” in the sense that the United States ultimately prevailed in the War for the West, the absence of evident progress in the theaters that have received the most U.S. military attention gives little cause for optimism.

To be fair, US troops have labored under handicaps. Among the most severe has been the absence of common agreement regarding the mission. Apart from the brief period of 2002–2006 when George W. Bush fancied that what ailed the Greater Middle East was the absence of liberal democracy (with his Freedom Agenda the needed antidote), policymakers have struggled to define the mission that American troops are expected to fulfill. The recurring inclination to define the core issue as “terrorism,” with expectations that killing “terrorists” in sufficient numbers should put things right, exemplifies this difficulty. Reliance on such generic terms amounts to a de facto admission of ignorance.

When contemplating the world beyond their own borders, many Americans — especially those in the midst of campaigning for high office — reflexively adhere to a dichotomous teleology of good versus evil and us versus them. The very “otherness” of the Greater Middle East itself qualifies the region in the eyes of most Americans as historically and culturally alien. US military policy there has been inconsistent, episodic, and almost entirely reactive, with Washington cobbling together a response to whatever happens to be the crisis of the moment. Expediency and opportunism have seldom translated into effectiveness.

Consider America’s involvement in four successive Gulf Wars over the past thirty-five years. In Gulf War I, which began in 1980, when Iraq invaded Iran, and lasted until 1988, the United States provided both covert and overt support to Saddam Hussein, even while secretly supplying arms to Iran. In Gulf War II, which began in 1990, when Iraq invaded Kuwait, the United States turned on Saddam. Although the campaign to oust his forces from Kuwait ended in apparent victory, Washington decided to keep US troops in the region to “contain” Iraq. Without attracting serious public attention, Gulf War II thereby continued through the 1990s. In Gulf War III, the events of 9/11 having rendered Saddam’s continued survival intolerable, the United States in 2003 finished him off and set about creating a new political order more to Washington’s liking. US forces then spent years vainly trying to curb the anarchy created by the invasion and subsequent occupation of Iraq.

Unfortunately, the eventual withdrawal of US troops at the end of 2011 marked little more than a brief pause. Within three years, Gulf War IV had commenced. To prop up a weak Iraqi state now besieged by a new enemy, one whose very existence was a direct result of previous US intervention, the armed forces of the United States once more returned to the fight. Although the specifics varied, US military actions since 1980 in Islamic countries as far afield as Afghanistan, Lebanon, Libya, and Somalia have produced similar results — at best they have been ambiguous, more commonly disastrous.

As for the current crop of presidential candidates vowing to “smash the would-be caliphate” (Hillary Clinton), “carpet bomb them into oblivion” (Ted Cruz), and “bomb the hell out of the oilfields” (Donald Trump), Americans would do well to view such promises with skepticism. If US military power offers a solution to all that ails the Greater Middle East, then why hasn’t the problem long since been solved?

Lessons drawn from this alternative narrative of twentieth-century US military history have no small relevance to the present day. Among other things, the narrative demonstrates that the bugaboos of isolationism and appeasement are pure inventions.

If isolationism defined US foreign policy during the 1920s and 1930s, someone forgot to let the American officer corps in on the secret. In 1924, for example, Brigadier General Douglas MacArthur was commanding US troops in the Philippines. Lieutenant Colonel George C. Marshall was serving in China as the commander of the 15th Infantry. Major George S. Patton was preparing to set sail for Hawaii and begin a stint as a staff officer at Schofield Barracks. Dwight D. Eisenhower’s assignment in the Pacific still lay in the future; in 1924, Major Eisenhower’s duty station was Panama. The indifference of the American people may have allowed that army to stagnate intellectually and materially. But those who served had by no means turned their backs on the world.

As for appeasement, hang that tag on Neville Chamberlain and Édouard Daladier, if you like. But as a description of US military policy over the past century, it does not apply. Since 1898, apart from taking an occasional breather, the United States has shown a strong and consistent preference for activism over restraint and for projecting power abroad rather than husbanding it for self-defense. Only on rare occasions have American soldiers and sailors had reason to complain of being underemployed. So although the British may have acquired their empire “in a fit of absence of mind,” as apologists once claimed, the same cannot be said of Americans in the twentieth century. Not only in the Western Hemisphere but also in the Pacific and Europe, the United States achieved preeminence because it sought preeminence.

In the Greater Middle East, the site of our most recent war, a similar quest for preeminence has now foundered, with the time for acknowledging the improbability of it ever succeeding now at hand. Such an admission just might enable Americans to see how much the global landscape has changed since the United States made its dramatic leap into the ranks of great powers more than a century ago, as well as to extract insights of greater relevance than hoary old warnings about isolationism and appeasement.

The first insight pertains to military hegemony, which turns out to be less than a panacea. In the Western Hemisphere, for example, the undoubted military supremacy enjoyed by the United States is today largely beside the point. The prospect of hostile outside powers intruding in the Americas, which US policymakers once cited as a justification for armed intervention, has all but disappeared.

Yet when it comes to actually existing security concerns, conventional military power possesses limited utility. Whatever the merits of gunboat diplomacy as practiced by Teddy Roosevelt and Wilson or by Eisenhower and JFK, such methods won’t stem the flow of drugs, weapons, dirty money, and desperate migrants passing back and forth across porous borders. Even ordinary Americans have begun to notice that the existing paradigm for managing hemispheric relations isn’t working — hence the popular appeal of Donald Trump’s promise to “build a wall” that would remove a host of problems with a single stroke. However bizarre and impractical, Trump’s proposal implicitly acknowledges that with the Hundred Years’ War for the Hemisphere now a thing of the past, fresh thinking is in order. The management of hemispheric relations requires a new paradigm, in which security is defined chiefly in economic rather than in military terms and policing is assigned to the purview of police agencies rather than to conventional armed forces. In short, it requires the radical demilitarization of US policy. In the Western Hemisphere, apart from protecting the United States itself from armed attack, the Pentagon needs to stand down.

The second insight is that before signing up to fight for something, we ought to make sure that something is worth fighting for. When the United States has disregarded this axiom, it has paid dearly. In this regard, the annexation of the Philippines, acquired in a fever of imperial enthusiasm at the very outset of the War for Pacific Dominion, was a blunder of the first order. When the fever broke, the United States found itself saddled with a distant overseas possession for which it had little use and which it could not properly defend. Americans may, if they wish, enshrine the ensuing saga of Bataan and Corregidor as glorious chapters in US military history. But pointless sacrifice comes closer to the truth.

By committing itself to the survival of South Vietnam, the United States replicated the error of its Philippine commitment. The fate of the Vietnamese south of the 17th parallel did not constitute a vital interest of the United States. Yet once we entered the war, a reluctance to admit error convinced successive administrations that there was no choice but to press on. A debacle of epic proportions ensued.

Jingoists keen to insert the United States today into minor territorial disputes between China and its neighbors should take note. Leave it to the likes of John Bolton, a senior official during the George W. Bush Administration, to advocate “risky brinkmanship” as the way to put China in its place. Others will ask how much value the United States should assign to the question of what flag flies over tiny island chains such as the Paracels and Spratlys. The answer, measured in American blood, amounts to milliliters.

During the twentieth century, achieving even transitory dominion in the Pacific came at a very high price. In three big fights, the United States came away with one win, one draw, and one defeat. Seeing that one win as a template for the future would be a serious mistake. Few if any of the advantages that enabled the United States to defeat Japan seventy years ago will pertain to a potential confrontation with China today. So unless Washington is prepared to pay an even higher price to maintain Pacific dominion, it may be time to define US objectives there in more modest terms.

A third insight encourages terminating obligations that have become redundant. Here the War for the West is particularly instructive. When that war abruptly ended in 1989, what had the United States won? As it turned out, less than met the eye. Although the war’s conclusion found Europe “whole and free,” as US officials incessantly proclaimed, the epicenter of global politics had by then moved elsewhere. The prize for which the United States had paid so dearly had in the interim lost much of its value.

Americans drawn to the allure of European culture, food, and fashion have yet to figure this out. Hence the far greater attention given to the occasional terrorist attack in Paris than to comparably deadly and more frequent incidents in places such as Nigeria or Egypt or Pakistan. Yet events in those countries are likely to have as much bearing, if not more, on the fate of the planet than anything occurring in the tenth or eleventh arrondissement.

Furthermore, “whole and free” has not translated into “reliable and effective.” Visions of a United States of Europe partnering with the United States of America to advance common interests and common values have proved illusory. The European Union actually resembles a loose confederation, with little of the cohesion that the word “union” implies. Especially in matters related to security, the EU combines ineptitude with irresolution, a point made abundantly clear during the Balkan crises of the 1990s and reiterated since.

Granted, Americans rightly prefer a pacified Europe to a totalitarian one. Yet rather than an asset, Europe today has become a net liability, with NATO having evolved into a mechanism for indulging European dependency. The Western alliance that was forged to deal with the old Soviet threat has survived and indeed expanded ever eastward, having increased from sixteen members in 1990 to twenty-eight today. As the alliance enlarges, however, it sheds capability. Allowing their own armies to waste away, Europeans count on the United States to pick up the slack. In effect, NATO provides European nations an excuse to dodge their most fundamental responsibility: self-defense.

Nearly a century after Americans hailed the kaiser’s abdication, more than seventy years after they celebrated Hitler’s suicide, and almost thirty years after they cheered the fall of the Berlin Wall, a thoroughly pacified Europe cannot muster the wherewithal to deal even with modest threats such as post-Soviet Russia. For the United States to indulge this European inclination to outsource its own security might make sense if Europe itself still mattered as much as it did when the War for the West began. But it does not. Indeed, having on three occasions over the course of eight decades helped prevent Europe from being dominated by a single hostile power, the United States has more than fulfilled its obligation to defend Western civilization. Europe’s problems need no longer be America’s.

Finally, there is this old lesson, evident in each of the four wars that make up our alternative narrative but acutely present in the ongoing War for the Greater Middle East. That is the danger of allowing moral self-delusion to compromise political judgment. Americans have a notable penchant for seeing US troops as agents of all that is good and holy pitted against the forces of evil. On rare occasions, and even then only loosely, the depiction has fit. Far more frequently, this inclination has obscured both the moral implications of American actions and the political complexities underlying the conflict to which the United States has made itself a party.

Indulging the notion that we live in a black-and-white world inevitably produces military policies that are both misguided and morally dubious. In the Greater Middle East, the notion has done just that, exacting costs that continue to mount daily as the United States embroils itself more deeply in problems to which our military power cannot provide an antidote. Perseverance is not the answer; it’s the definition of insanity. Thinking otherwise would restore a measure of sanity, and reconfiguring the past so as to better decipher its meaning offers a first step toward doing just that. Ω

[Andrew J. Bacevich is professor emeritus of history and international relations at Boston University. Bacevich received a BS (history) from the United States Military Academy as well as a PhD (history) from Princeton University. He retired from Army active duty as a colonel in a career that spanned Vietnam to the Persian Gulf. Bacevich is the author of Breach of Trust: How Americans Failed Their Soldiers and Their Country (2013), among other works. His newest book is America’s War for the Greater Middle East (2016). See all books by Andrew Bacevich here.]

Copyright © 2016 Harper’s Magazine Foundation



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 Sapper's (Fair & Balanced) Rants & Raves

Tuesday, August 30, 2016

Today's Choice: One Damned Election After Another Or Thoughtful Political History

The following essay about the demise of political history in US college and university classrooms did not mention the best practitioner of 20th-century US political history: Richard Hofstadter (1916-1970), known in this blog as St. Richard. This blogger first encountered St. Richard through his second book, The American Political Tradition and the Men Who Made It (1948, 1973, 1976, 1989). As far as this blogger is concerned, St. Richard is the gold standard for US political history. If this is (fair & balanced) Cliocentric appreciation, so be it.

[x NY Fishwrap]
Why Did We Stop Teaching Political History?
By Fredrik Logevall and Kenneth Osgood


American political history, it would seem, is everywhere. Hardly a day passes without some columnist comparing Donald J. Trump to Huey Long, Father Coughlin or George Wallace. “All the Way,” a play about Lyndon B. Johnson, won a slew of awards and was turned into an HBO film.

But the public’s love for political stories belies a crisis in the profession. American political history as a field of study has cratered. Fewer scholars build careers on studying the political process, in part because few universities make space for them. Fewer courses are available, and fewer students are exposed to it. What was once a central part of the historical profession, a vital part of this country’s continuing democratic discussion, is disappearing.

This wasn’t always the case. Political history — a specialization in elections and elected officials, policy and policy making, parties and party politics — was once a dominant, if not the dominant, pursuit of American historians. Many of them, in turn, made vital contributions to the political process itself, whether it was Arthur Schlesinger Jr.’s role in the Kennedy White House or C. Vann Woodward’s The Strange Career of Jim Crow (1955, 2002), which the Rev. Dr. Martin Luther King Jr. called the “bible of the civil rights movement.”

But somewhere along the way, such work fell out of favor with history departments. According to the American Historical Association’s listing of academic departments, three-quarters of colleges and universities now lack full-time researchers and teachers in the subject.

There appears to be little effort to fill the void. A search of the leading website advertising academic jobs in history, H-Net, yielded just 15 advertisements in the last 10 years specifically seeking a tenure-track, junior historian specializing in American political history. That’s right: just 15 new jobs in the last decade.

As a result, the study of America’s political past is being marginalized. Many college catalogs list precious few specialized courses on the subject, and survey courses often give scant attention to political topics. The pipelines for new PhDs in the subject, and therefore new faculty, are drying up, and in many graduate programs one can earn a doctorate in American history with little exposure to politics.

How did it come to this? The trend began in the 1960s. America’s misadventure in Vietnam led to broad questioning of elite decision making and conventional politics, and by extension those historical narratives that merely recounted the doings of powerful men. Likewise, the movements of the 1960s and 1970s by African-Americans, Latinos, women, homosexuals and environmental activists brought a new emphasis on history from the bottom up, spotlighting the role of social movements in shaping the nation’s past.

The long overdue diversification of the academy also fostered changing perspectives. As a field once dominated by middle-class white males opened its doors to women, minorities and people from working-class backgrounds, recovering the lost experiences of these groups understandably became priority No. 1.

These transformations enriched the national story. But they also carried costs. Perceived “traditional” types of history that examined the doings of governing elites fell into disfavor, and political history suffered the effects (as did its cousins, diplomatic and military history).

The ramifications extend well beyond higher education. The drying up of scholarly expertise affects universities’ ability to educate teachers — as well as aspiring lawyers, politicians, journalists and business leaders — who will enter their professions having learned too little about the nation’s political history. Not least, in this age of extreme partisanship, they’ll be insufficiently aware of the importance that compromise has played in America’s past, of the vital role of mutual give-and-take in the democratic process.

Change will not be easy, and will not come from history departments facing tight budgets and competing demands. What is needed, to begin with, is for university administrators to identify political history as a priority, for students and families to lobby their schools, for benefactors to endow professorships and graduate fellowships and for lawmakers and school boards to enact policies that bolster its teaching — and without politicizing the enterprise.

This matters. Knowledge of our political past is important because it can serve as an antidote to the misuse of history by our leaders and save us from being bamboozled by analogies, by the easy “lessons of the past.” It can make us less egocentric by showing us how other politicians and governments in other times have responded to division and challenge. And it can help us better understand the likely effects of our actions, a vital step in the acquisition of insight and maturity.

Judging by the state of our political discourse during this dismal campaign season, the change can’t come soon enough. Ω

[Fredrik Logevall is the Laurence D. Belfer Professor of International Affairs at the Harvard Kennedy School and also Professor of History in the Department of History of Harvard University. His most important book to date is Embers of War: The Fall of an Empire and the Making of America’s Vietnam (2012), which won the 2013 Pulitzer Prize for History and the 2013 Francis Parkman Prize, as well as the 2013 American Library in Paris Book Award and the 2013 Arthur Ross Book Award from the Council on Foreign Relations. Logevall received a BA (political science) from Simon Fraser University, an MA (history) from the University of Oregon, and a PhD (foreign relations history) from Yale University.

Kenneth Osgood is Professor of History and Director of the McBride Honors Program in Public Affairs at the Colorado School of Mines. He also is the associate editor of the journal Diplomatic History, and a series editor of Palgrave’s History of the Media series. He has written Total Cold War: Eisenhower’s Secret Propaganda Battle at Home and Abroad (2006), winner of the Herbert Hoover Book Award for best book on any aspect of US history during the early 20th century. Osgood received a BA (history, magna cum laude) from the University of Notre Dame, and both an MA and PhD (history) from the University of California at Santa Barbara.]

Copyright © 2016 The New York Times Company



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 Sapper's (Fair & Balanced) Rants & Raves

Monday, August 29, 2016

Today, Another First For This Blog — A 'Toon Daily Double

In today's 'toon double-helping, Tom Tomorrow (Dan Perkins) leads off by mixing some onomatopoeia with repetitive evocations of Donald T. (for "The") Chump's actual surname to provide the sound of mob-feet in the streets, then goes with an alt-universe dream world for Sparky the Wonder Penguin (complete with alt-Inuit-style goggles) and Sparky's sidekick, Blinky the Dog, who morphs into a Chump-dog. A more plausible explanation for the sound of mob-footfalls is then provided by Ben Sargent: there is an actual mob of Chump-followers, but that mob is a replacement for the Teabaggers, who are receding into obscurity. If this is a (fair & balanced) graphic depiction of 2016 politics, so be it.

[Vannevar Bush Hyperlink/Bracketed Numerics Directory]
[1] Point — This Modern World (Tom Tomorrow/Dan Perkins)
[2] Counterpoint — Loon Star State (Ben Sargent)


[1] Back To Directory
[x This Modern World]
TRUMP TRUMP TRUMP TRUMP
By Tom Tomorrow (Dan Perkins)

Tom Tomorrow/Dan Perkins

[Dan Perkins is an editorial cartoonist better known by the pen name "Tom Tomorrow". His weekly comic strip, "This Modern World," which comments on current events from a strong liberal perspective, appears regularly in approximately 150 papers across the U.S., as well as on Daily Kos. The strip debuted in 1990 in SF Weekly. Perkins, a longtime resident of Brooklyn, New York, currently lives in Connecticut. He received the Robert F. Kennedy Award for Excellence in Journalism in both 1998 and 2002. When he is not working on projects related to his comic strip, Perkins writes a daily political blog, also entitled "This Modern World," which he began in December 2001. More recently, Dan Perkins, pen name Tom Tomorrow, was named the winner of the 2013 Herblock Prize for editorial cartooning. Even more recently, Dan Perkins was a runner-up for the 2015 Pulitzer Prize for Editorial Cartooning.]

Copyright © 2016 Tom Tomorrow (Dan Perkins)



[2] Back To Directory
[x TO/Loon Star State]
Old Tea Partiers' Home
By Ben Sargent

[Ben Sargent received the Pulitzer Prize in 1982 in editorial cartooning while at the Austin American-Statesman. He began his journalism career in his hometown of Amarillo, TX, while a student at Amarillo College, where he was the cartoonist for the campus newspaper, The Ranger. After transferring to The University of Texas at Austin, Sargent also was the cartoonist for the campus newspaper, The Daily Texan. Sargent was the Austin American-Statesman’s political cartoonist from 1974 until his retirement in 2009, and since then he has been the cartoonist for The Texas Observer. Ben Sargent received an AA (English/journalism) from Amarillo College as well as a BJ (print media) from The University of Texas at Austin.]

Copyright © 2016 The Texas Observer/Ben Sargent



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 Sapper's (Fair & Balanced) Rants & Raves

Sunday, August 28, 2016

If The Slickster Was Bad, Today's (Im)Moral Stupids Are W-O-R-S-E

It would seem that Faux News has been hoist by its own petard: having ridden the impeachment of The Slickster (POTUS 42) over a sex scandal to the top of the cable heap, the network has now fallen victim to a sex scandal of its very own. The disgusting Roger Ailes rose from The Trickster's squad of Dirty Tricksters to take Faux News to the top of the cable ratings with 24/7 drumming for the impeachment of POTUS 42. As Jane Mayer reports today, that impeachment campaign tracked alongside Ailes' sexual harassment of a female employee between 1988 and 2011. The sleazy affair ended in 2011 with the payment of a $3.15M settlement in return for avoiding a lawsuit. Gag this blogger with a spoon. If this is a (fair & balanced) tale of political and moral hypocrisy, so be it.

[x New Yorker]
Roger Ailes, The Clintons, And The Scandals Of The Scandalmongers
By Jane Mayer


This election year, the big question was supposed to be whether Hillary Clinton would shatter the glass ceiling. Instead, it has become the year in which one of the country’s most towering glass houses has shattered. Few people may remember it now, but Fox News, which Rupert Murdoch’s News Corporation launched in 1996, became a ratings leader largely because of its gleefully censorious coverage of Bill Clinton’s sex scandals. Now the network is mired in its own scandal. Last month, Roger Ailes resigned as Fox News’s chairman and C.E.O. in the face of multiple allegations of sexual harassment, including a lawsuit filed against him by the former anchor Gretchen Carlson. (Ailes has denied Carlson’s allegations.) The unfolding embarrassment at the network poses a host of questions—not the least of which is how the network’s executives justified their Javert-like pursuit of Clinton’s extramarital affairs, given their boss’s own repeated sexual misconduct. If you go back and look carefully at the chronology, some of Ailes’s most egregious alleged harassment of women was taking place at the same time that Fox News was suggesting that Clinton deserved to be impeached. Sexual harassment is a serious issue, and it merits serious coverage, but it’s hard to believe that the suits at Fox were motivated by genuine concern, given their own corporate culture.

Gabriel Sherman, in his book The Loudest Voice in the Room (2014), describes how brilliantly and relentlessly Ailes exploited Clinton’s scandalous affair with the White House intern Monica Lewinsky in order to build Fox News’s brand. Sherman writes, “Whatever else it was, the scandal was a media bonanza, and no medium benefited from it more than cable news—and no cable channel more than Fox News.” Within hours of the Lewinsky story breaking, in January, 1998, Ailes inaugurated a new nightly show devoted to the melodrama, and assigned five producers and correspondents to cover it. No detail was too sordid for Fox to cover. With Ailes, a former Republican political operative, at the helm, Fox covered the affair as a criminal act, and rode the story straight up the cable-ratings charts. “Monica was a news channel’s dream come true,” John Moody, Fox’s executive editor, once admitted.

Fox News has devoted considerably less attention to its own sex scandal. When the network announced Ailes’s departure, his alleged improprieties were not mentioned. Carlson’s attorneys told the Guardian that at least twenty women have accused Ailes of sexually harassing them throughout his career. Carlson and the anchor Megyn Kelly, who has also reportedly alleged that she was harassed by Ailes, are the best known among these women, but the story of Laurie Luhn, the former head of booking for Fox News, is especially damning.

Luhn’s account, if true, suggests that, at precisely the same time Ailes was leading Fox’s breathless coverage of the Clinton-impeachment proceedings, Ailes, who was married, was paying Luhn—who was single, broke, and decades younger—to service him sexually. In a recent blockbuster interview with Sherman, in New York, Luhn said that she met Ailes in 1988. Soon afterward, Ailes began paying her a monthly retainer, for sex and for private research on his competitors. When he helped launch Fox, in 1996, Luhn said, Ailes offered her a staff job in “guest relations.” Over time, her job descriptions at Fox changed, but Ailes, whom Luhn described as a “predator,” did not. She told Sherman that her twenty-year involvement with Ailes had been “psychological torture.” As she grew increasingly unhappy, she said, Ailes grew more controlling, insisting that she tell no one of their sexual relations. Luhn told Sherman that Ailes kept an incriminating videotape of her in a safe-deposit vault, as a form of insurance. By 2011, however, Luhn said, she had informed Fox’s general counsel that Ailes had sexually harassed her for decades. All of this might sound hard to believe, and Luhn has acknowledged a history of psychological difficulties. But Ailes and his lawyers declined an invitation from Sherman to rebut Luhn’s story. Moreover, in 2011, Fox agreed to pay Luhn an astounding $3.15 million under a severance agreement that included nondisclosure clauses. It looks a lot like hush money, paid for with corporate funds and handled by multiple Fox executives. Yet, if silencing Luhn was the aim, it hasn’t worked. Luhn was reportedly among the first women to contact investigators hired by Fox, in the wake of Carlson’s lawsuit, to straighten out the twisted truth about sexual harassment at the company.

Fox viewers were, of course, left in the dark about Ailes’s personal life as the network relentlessly exposed Clinton’s private life. The campaign was nearly successful. On December 19, 1998, the Republican-ruled House of Representatives voted to impeach Clinton on two articles, for perjury and obstruction of justice, contending that he had lied under oath about his extramarital affair with Lewinsky. The Senate eventually acquitted Clinton, after a highly partisan trial.

Here, too, hindsight has revealed more hypocrisy. The drive to impeach Clinton was led by three successive House Republican leaders. As ThinkProgress noted last year, each of these self-styled moral authorities was subsequently tarnished in his own extramarital sex scandal. Newt Gingrich, the first Speaker to whip his members into an impeachment frenzy, has since acknowledged that during the same period he was engaged in an extramarital affair with a congressional aide, who was then in her twenties. Gingrich subsequently got divorced and married the aide, Callista Bisek, who became his third wife. (His second wife has also said that he began an affair with her while still married to his first, who, at the time, was recovering from cancer. Gingrich has never specifically admitted to that affair.) All the while, he was publicly castigating Democrats as the party of moral degeneration. For example, while the Democratic Party was nominating Clinton, in 1992, Gingrich introduced George H. W. Bush at a campaign stop by declaring that Woody Allen’s “non-family” was one that “fits the Democratic platform perfectly,” because Allen was “having non-incest with a non-daughter to whom he was a non-father.”

Gingrich resigned from the House Speakership in November, 1998, at which point the Republican House members unanimously voted to pass the gavel to Bob Livingston, a congressman from Louisiana. Less than two months later, on the same day that the House was scheduled to vote on Clinton’s impeachment, Livingston announced that he would not assume the Speakership. Larry Flynt, the publisher of Hustler, had revealed that he had gathered evidence that Livingston had been involved with at least four women during the previous decade.

After Livingston stepped down, the Republican majority in the House voted to replace him with Dennis Hastert, an amiable-seeming congressman from Illinois. Hastert spoke of how his “conscience” had led him to “the solemn conclusion” that Clinton had “abused and violated the public trust” and therefore needed to be impeached. In 2015, Hastert was revealed to have sexually abused a boy during his years as a wrestling coach in Illinois, between the mid-sixties and early eighties; according to prosecutors at his later trial, the number of known victims climbed to five. This past October, Hastert pleaded guilty to a federal banking violation, having concealed an effort to buy the silence of his victims through payments that amounted to more than three million dollars, and in April he admitted to the abuse at his sentencing hearing. The presiding judge called him “a serial child molester” before sentencing him to serve fifteen months in federal prison. (He was not charged for the abuse because the statute of limitations had passed.)

Murdoch, the chairman of News Corp., has stepped in to temporarily fill Ailes’s shoes at Fox News, but he has his own baggage. According to the Wall Street Journal and Vanity Fair, Murdoch’s second marriage was upended, in 1999, by an extramarital affair with Wendi Deng, a young émigré from China and an intern at Murdoch’s Star TV. The affair likely overlapped with the Clinton impeachment proceedings. Murdoch’s marriage to Deng ended fourteen years later, again amid sensational rumors of infidelity.

The Clintons, by contrast, have remained married. It’s impossible for anyone on the outside to judge whether that is a marital triumph or a Faustian bargain. Will past Clinton scandals become a focus of the fall campaign? It’s possible. Trump, who has admitted to his own sexual infidelities, is reportedly now being advised by Ailes. Trump has called Bill Clinton “the worst abuser of women in the history of politics,” and said that his sex scandals are “fair game.” Voters may or may not be swayed by the exhumation of such arguments, this time around. But the unending scandals of the scandalmongers have made one thing clear: neither party has a lock on virtue. Ω

[Jane Mayer has been a staff writer for The New Yorker magazine since 1995. Mayer is a graduate of Yale University (BA, history), where she was a stringer for Time magazine. Mayer has also contributed to the New York Review of Books and American Prospect and co-authored two books—Strange Justice: The Selling of Clarence Thomas (1994, written with Jill Abramson), a study of the controversy-laden nomination and appointment of Clarence Thomas to the US Supreme Court, and Landslide: The Unmaking of the President, 1984–1988 (1989, written with Doyle McManus), an account of Ronald Reagan's second term in the White House. Mayer's The Dark Side (2008) — addressing the origins, legal justifications, and possible war crimes liability of the use of interrogation techniques to break down detainees' resistance and the subsequent deaths of detainees under such interrogation as applied by the CIA — was a finalist for the National Book Award. Her most recent book is Dark Money: The Hidden History of the Billionaires Behind the Rise of the Radical Right (2016). Mayer is the granddaughter of the late historian and biographer Allan Nevins.]

Copyright © 2016 The New Yorker/Condé Nast Digital



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 Sapper's (Fair & Balanced) Rants & Raves

Saturday, August 27, 2016

In Praise of Eags (Timothy Egan)

In 2006, Mike Judge, one of the creators of "Beavis and Butt-Head," released "Idiocracy" — a film that portrayed this country five hundred years in the future (2505). As Eags (Timothy Egan) reports in 2016, we have already arrived at an equivalent of "Idiocracy." Thanks to Donald T. (for "The") Chump and his knuckle-dragging, drooling supporters, we will witness the possible triumph of stupidity as our dominant national attribute. If this is a (fair & balanced) realization that we live in a present-day "praise of folly," so be it.

[x NY Fishwrap]
The Dumbed Down Democracy
By Eags (Timothy Egan)

[TagCrowd cloud of the following piece of writing, created at TagCrowd.com]

Are you smarter than an immigrant? Can you name, say, all three branches of government or a single Supreme Court justice? Most Americans, those born here, those about to make the most momentous decision in civic life this November, cannot. And most cannot pass the simple test aced by 90 percent of new citizens.

Well, then: Who controlled the Senate during the 2014 election, when control of the upper chamber was at stake? If you answered Dunno at the time, you were with a majority of Americans in the clueless category.

But surely now, when election news saturation is thicker than the humidity around Lady Liberty’s lip, we’ve become a bit more clue-full. I give you Texas. A recent survey of Donald Trump supporters there found that 40 percent of them believe that Acorn will steal the upcoming election.

Acorn? News flash: That community-organizing group has been out of existence for six years. Acorn is gone, disbanded, dead. It can no more steal an election than Donald Trump can pole vault over his Mexican wall.

We know that at least 30 million American adults cannot read. But the current presidential election may yet prove that an even bigger part of the citizenry is politically illiterate — and functional. Which is to say, they will vote despite being unable to accept basic facts needed to process this American life.

“There’s got to be a reckoning on all this,” said Charlie Sykes, the influential conservative radio host, in a soul-searching interview with Business Insider. “We’ve created this monster.”

Trump, who says he doesn’t read much at all, is both a product of the epidemic of ignorance and a main producer of it. He can litter the campaign trail with hundreds of easily debunked falsehoods because conservative media has spent more than two decades tearing down the idea of objective fact.

If Trump supporters knew that illegal immigration peaked in 2007, or that violent crime has been on a steady downward spiral nationwide for more than 20 years, they would scoff when Trump says Mexican rapists are surging across the border and crime is out of control.

If more than 16 percent of Americans could locate Ukraine on a map, it would have been a Really Big Deal when Trump said that Russia was not going to invade it — two years after it had, in fact, invaded it.

If basic civics was still taught, and required, for high school graduation, Trump could not claim that judges “sign bills.”

The dumbing down of this democracy has been gradual, and then — this year — all at once. The Princeton Review found that the Lincoln-Douglas debates of 1858 were engaged at roughly a high school senior level. A century later, the presidential debate of 1960 was a notch below, at a 10th grade level. By the year 2000, the two contenders were speaking like sixth graders. And in the upcoming debates — “Crooked Hillary” against “Don the Con” — we’ll be lucky to get beyond preschool potty talk.

How did this happen, when the populace was so much less educated in the days when most families didn’t even have an indoor potty to talk about? You can look at one calculated loop of misinformation over the last two weeks to find some of the answer.

A big political lie often starts on the Drudge Report, home of Obama-as-Muslim stories. Matt Drudge jump-started a recent smear with pictures of Hillary Clinton losing her balance — proof that something was very wrong with her. Fox News then went big with it, using the Trump adviser and free-media enabler Sean Hannity as the village gossip. Then Rudy Giuliani, the internet diagnostician, urged people to Google “Hillary Clinton illness” for evidence of her malady. This forced Clinton to prove her stamina, in an appearance on Jimmy Kimmel, by opening a jar of pickles.

The only good thing to come out of this is that now, when you Google “Hillary Clinton illness” what pops up are scathing stories about a skeletal-faced rumormonger named Rudy Giuliani, and a terrific Stephen Colbert takedown of this awful man.

But what you don’t know really can hurt you. Last year was the hottest on record. And the July just passed was earth’s warmest month in the modern era. Still, Gallup found that 45 percent of Republicans don’t believe the temperature. We’re not talking about doubt over whether the latest spike was human-caused — they don’t accept the numbers, from all those lying meteorologists.

Of late, almost half of Floridians have done something to protect themselves from the Zika virus, heeding government warnings. But the other half cannot wish it away, as the anti-vaccine crowd on the far left does for serious and preventable illnesses.

I’m sorry that my once-surging Seattle Mariners dropped two out of three games to the Yankees this week. I just prefer not to believe it. And look — now my guys are in first place, no matter what the skewed “standings” show. In my own universe, surrounded by junk fact and junk conclusions, I feel better already. Ω

[Timothy Egan writes "Outposts," a column at the NY Fishwrap online. Egan — winner of a Pulitzer Prize in 2001, as a member of the team of reporters who wrote the series "How Race Is Lived in America," and of a National Book Award for The Worst Hard Time (2006) — graduated from the University of Washington with a degree in journalism, and was awarded an honorary doctorate of humane letters by Whitman College in 2000 for his environmental writings. Egan's most recent book is The Big Burn: Teddy Roosevelt and the Fire that Saved America (2009).]

Copyright © 2016 The New York Times Company



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 Sapper's (Fair & Balanced) Rants & Raves