Thursday, April 30, 2009

As We Really Are

Yesterday, this little ol' blog provided a jeremiad from Kishore Mahbubani, Dean of the Lee Kuan Yew School of Public Policy at the National University of Singapore. As if the national wardrobe in hair shirts were still lacking, today's post to the little ol' blog comes from Professor Andrew Bacevich of Boston University. In 1945, the visionary Vannevar Bush wrote "As We May Think," and now, in 2009, the visionary Andrew Bacevich has written (in effect), "As We Really Are." If this is the (fair & balanced) recognition of reality, so be it.

[x Salon]
Farewell To The American Century
By Andrew Bacevich


In a recent column, the Washington Post's Richard Cohen wrote, "What Henry Luce called 'the American Century' is over." Cohen is right. All that remains is to drive a stake through the heart of Luce's pernicious creation, lest it come back to life. This promises to take some doing.

When the Time-Life publisher coined his famous phrase, his intent was to prod his fellow citizens into action. Appearing in the Feb. 17, 1941, issue of Life, his essay, "The American Century," hit the newsstands at a moment when the world was in the throes of a vast crisis. A war in Europe had gone disastrously awry. A second almost equally dangerous conflict was unfolding in the Far East. Aggressors were on the march.

With the fate of democracy hanging in the balance, Americans diddled. Luce urged them to get off the dime. More than that, he summoned them to "accept wholeheartedly our duty and our opportunity as the most powerful and vital nation in the world ... to exert upon the world the full impact of our influence, for such purposes as we see fit and by such means as we see fit."

Read today, Luce's essay, with its strange mix of chauvinism, religiosity and bombast ("We must now undertake to be the Good Samaritan to the entire world ..."), does not stand up well. Yet the phrase "American Century" stuck and has enjoyed a remarkable run. It stands in relation to the contemporary era much as "Victorian Age" does to the 19th century. In one pithy phrase, it captures (or at least seems to capture) the essence of some defining truth: America as alpha and omega, source of salvation and sustenance, vanguard of history, guiding spirit and inspiration for all humankind.

In its classic formulation, the central theme of the American Century has been one of righteousness overcoming evil. The United States (above all the U.S. military) made that triumph possible. When, having been given a final nudge on Dec. 7, 1941, Americans finally accepted their duty to lead, they saved the world from successive diabolical totalitarianisms. In doing so, the U.S. not only preserved the possibility of human freedom but modeled what freedom ought to look like.

Thank you, comrades

So goes the preferred narrative of the American Century, as recounted by its celebrants.

The problems with this account are twofold. First, it claims for the United States excessive credit. Second, it excludes, ignores or trivializes matters at odds with the triumphal story line.

The net effect is to perpetuate an array of illusions that, whatever their value in prior decades, have long since outlived their usefulness. In short, the persistence of this self-congratulatory account deprives Americans of self-awareness, hindering our efforts to navigate the treacherous waters in which the country finds itself at present. Bluntly, we are perpetuating a mythic version of the past that never even approximated reality and today has become downright malignant. Although Richard Cohen may be right in declaring the American Century over, the American people — and especially the American political class — still remain in its thrall.

Constructing a past usable to the present requires a willingness to include much that the American Century leaves out.

For example, to the extent that the demolition of totalitarianism deserves to be seen as a prominent theme of contemporary history (and it does), the primary credit for that achievement surely belongs to the Soviet Union. When it came to defeating the Third Reich, the Soviets bore by far the preponderant burden, sustaining 65 percent of all Allied deaths in World War II.

By comparison, the United States suffered 2 percent of those losses, for which any American whose father or grandfather served in and survived that war should be saying: Thank you, Comrade Stalin.

For the United States to claim credit for destroying the Wehrmacht is the equivalent of Toyota claiming credit for inventing the automobile. We entered the game late and then shrewdly scooped up more than our fair share of the winnings. The true "Greatest Generation" is the one that willingly expended millions of their fellow Russians while killing millions of German soldiers.

Hard on the heels of World War II came the Cold War, during which erstwhile allies became rivals. Once again, after a decades-long struggle, the United States came out on top.

Yet in determining that outcome, the brilliance of American statesmen was far less important than the ineptitude of those who presided over the Kremlin. Ham-handed Soviet leaders so mismanaged their empire that it eventually imploded, permanently discrediting Marxism-Leninism as a plausible alternative to liberal democratic capitalism. The Soviet dragon managed to slay itself. So thank you, Comrades Malenkov, Khrushchev, Brezhnev, Andropov, Chernenko and Gorbachev.

Screwing the pooch

What flag-wavers tend to leave out of their account of the American Century is not only the contributions of others, but the various missteps perpetrated by the United States — missteps, it should be noted, that spawned many of the problems bedeviling us today.

The instances of folly and criminality bearing the label "made in Washington" may not rank up there with the Armenian genocide, the Bolshevik Revolution, the appeasement of Adolf Hitler, or the Holocaust, but they sure don't qualify as small change. To give them their due is necessarily to render the standard account of the American Century untenable.

Here are several examples, each one familiar, even if its implications for the problems we face today are studiously ignored:

Cuba. In 1898, the United States went to war with Spain for the proclaimed purpose of liberating the so-called Pearl of the Antilles. When that brief war ended, Washington reneged on its promise. If there actually has been an American Century, it begins here, with the U.S. government breaking a solemn commitment, while baldly insisting otherwise. By converting Cuba into a protectorate, the United States set in motion a long train of events leading eventually to the rise of Fidel Castro, the Bay of Pigs, Operation Mongoose, the Cuban Missile Crisis, and even today's Guantánamo Bay prison camp. The line connecting these various developments may not be a straight one, given the many twists and turns along the way, but the dots do connect.

The Bomb. Nuclear weapons imperil our existence. Used on a large scale, they could destroy civilization itself. Even now, the prospect of a lesser power like North Korea or Iran acquiring nukes sends jitters around the world. American presidents — Barack Obama is only the latest in a long line — declare the abolition of these weapons to be an imperative. What they are less inclined to acknowledge is the role the United States played in afflicting humankind with this scourge.

The United States invented the bomb. The United States — alone among members of the nuclear club — actually employed it as a weapon of war. The U.S. led the way in defining nuclear-strike capacity as the benchmark of power in the postwar world, leaving other powers like the Soviet Union, Great Britain, France and China scrambling to catch up. Today, the U.S. still maintains an enormous nuclear arsenal at the ready and adamantly refuses to commit itself to a no-first-use policy, even as it professes its horror at the prospect of some other nation doing as the United States itself has done.

Iran. Extending his hand to Tehran, President Obama has invited those who govern the Islamic republic to "unclench their fists." Yet to a considerable degree, those clenched fists are of our own making. For most Americans, the discovery of Iran dates from the time of the notorious hostage crisis of 1979-1981 when Iranian students occupied the U.S. embassy in Tehran, detained several dozen U.S. diplomats and military officers and subjected the administration of Jimmy Carter to a 444-day-long lesson in abject humiliation.

For most Iranians, the story of U.S.-Iranian relations begins somewhat earlier. It starts in 1953, when CIA agents collaborated with their British counterparts to overthrow the democratically elected government of Mohammed Mossadegh and return the Shah of Iran to his throne. The plot succeeded. The Shah regained power. The Americans got oil, along with a lucrative market for exporting arms. The people of Iran pretty much got screwed. Freedom and democracy did not prosper. The antagonism that expressed itself in November 1979 with the takeover of the U.S. embassy in Tehran was not entirely without cause.

Afghanistan. President Obama has wasted little time in making the Afghanistan War his own. Like his predecessor he vows to defeat the Taliban. Also like his predecessor he has yet to confront the role played by the United States in creating the Taliban in the first place. Washington once took pride in the success it enjoyed funneling arms and assistance to fundamentalist Afghans waging jihad against foreign occupiers. During the administrations of Jimmy Carter and Ronald Reagan, this was considered to represent the very acme of clever statecraft. U.S. support for the Afghan mujahideen caused the Soviets fits. Yet it also fed a cancer that, in time, exacted a most grievous toll on Americans themselves — and has U.S. forces today bogged down in a seemingly endless war.

Act of contrition

Had the United States acted otherwise, would Cuba have evolved into a stable and prosperous democracy, a beacon of hope for the rest of Latin America? Would the world have avoided the blight of nuclear weapons? Would Iran today be an ally of the United States, a beacon of liberalism in the Islamic world, rather than a charter member of the "axis of evil?" Would Afghanistan be a quiet, pastoral land at peace with its neighbors? No one, of course, can say what might have been. All we know for sure is that policies concocted in Washington by reputedly savvy statesmen now look exceedingly ill-advised.

What are we to make of these blunders? The temptation may be to avert our gaze, thereby preserving the reassuring tale of the American Century. We should avoid that temptation and take the opposite course, acknowledging openly, freely and unabashedly where we have gone wrong. We should carve such acknowledgments into the face of a new monument smack in the middle of the Mall in Washington: We blew it. We screwed the pooch. We caught a case of the stupids. We got it ass-backwards.

Only through the exercise of candor might we avoid replicating such mistakes.

Indeed, we ought to apologize. When it comes to avoiding the repetition of sin, nothing works like abject contrition. We should, therefore, tell the people of Cuba that we are sorry for having made such a hash of U.S.-Cuban relations for so long. President Obama should speak on our behalf in asking the people of Hiroshima and Nagasaki for forgiveness. He should express our deep collective regret to Iranians and Afghans for what past U.S. interventionism has wrought.

The United States should do these things without any expectations of reciprocity. Regardless of what U.S. officials may say or do, Castro won't fess up to having made his own share of mistakes. The Japanese won't liken Hiroshima to Pearl Harbor and call it a wash. Iran's mullahs and Afghanistan's jihadists won't offer a chastened Washington the chance to let bygones be bygones.

No, we apologize to them, but for our own good — to free ourselves from the accumulated conceits of the American Century and to acknowledge that the United States participated fully in the barbarism, folly and tragedy that defines our time. For those sins, we must hold ourselves accountable.

To solve our problems requires that we see ourselves as we really are. And that requires shedding, once and for all, the illusions embodied in the American Century. ♥

[Andrew J. Bacevich graduated from West Point in 1969 and served in the U.S. Army during the Vietnam War, with a tour in Vietnam from the summer of 1970 to the summer of 1971. Afterwards he held posts in Germany, the United States, and the Persian Gulf until his retirement from the service with the rank of Colonel in the early 1990s. He holds a Ph.D. in American Diplomatic History from Princeton University, and taught at West Point and Johns Hopkins University prior to joining the faculty at Boston University in 1998 as a professor of international relations and director of its Center for International Relations (from 1998 to 2005). Bacevich is the author of several books, including American Empire: The Realities and Consequences of US Diplomacy (2002) and The New American Militarism: How Americans Are Seduced by War (2005). He has been "a persistent, vocal critic of the US occupation of Iraq, calling the conflict a catastrophic failure." In March 2007, he described George W. Bush's endorsement of such "preventive wars" as "immoral, illicit, and imprudent."

On May 13, 2007, Bacevich's son, also named Andrew J. Bacevich, was killed in action in Iraq by a suicide bomber south of Samarra in Salah ad-Din Province. The younger Bacevich, 27, was a First Lieutenant assigned to the 3rd Battalion, 8th U.S. Cavalry Regiment, 1st Cavalry Division.]

Copyright © 2009 Salon Media Group, Inc.

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

Wednesday, April 29, 2009

Quo Vadis?

Listening to a live talk show on NPR from Kokomo, IN is a frightening experience. The Dumbos have descended upon the place and it's yada yada yada about government oppression and black helicopters. Listening to this reminds this blogger of Walt Kelly's Pogo Possum: "We have met the enemy and he is us." If this is (fair & balanced) 100th-day despair, so be it.


[x WQ]
Can America Fail?
by Kishore Mahbubani


In 1981, Singapore’s long-ruling People’s Action Party was shocked when it suffered its first defeat at the polls in many years, even though the contest was in a single constituency. I asked Dr. Goh Keng Swee, one of Singapore’s three founding fathers and the architect of its economic miracle, why the PAP lost. He replied, “Kishore, we failed because we did not even conceive of the possibility of failure.”

The simple thesis of this essay is that American society could also fail if it does not force itself to conceive of failure. The massive crises that American society is experiencing now are partly the product of just such a blindness to potential catastrophe. That is not a diagnosis I deliver with rancor. Nations, like individuals, languish when they have only uncritical lovers or unloving critics. I consider myself a loving critic of the United States, a critic who wants American society to succeed. As I wrote in 2005 in Beyond the Age of Innocence: Rebuilding Trust Between America and the World, America “has done more good for the rest of the world than any other society.” If the United States fails, the world will suffer too.

The first systemic failure America has suffered is groupthink. Looking back at the origins of the current financial crisis, it is amazing that American society accepted the incredible assumptions of economic gurus such as Alan Greenspan and Robert Rubin that unregulated financial markets would naturally deliver economic growth and serve the public good. In 2003, Greenspan posed this question: “The vast increase in the size of the over-the-counter derivatives markets is the result of the market finding them a very useful vehicle. And the question is, should these be regulated?” His own answer was that the state should not go beyond regular banking regulation because “these derivative transactions are transactions among professionals.” In short, the financial players would regulate themselves.

This is manifest nonsense. The goal of these financial professionals was always to enhance their personal wealth, not to serve the public interest. So why was Greenspan’s nonsense accepted by American society? The simple and amazing answer is that most Americans assumed that their country has a rich and vibrant “marketplace of ideas” in which all ideas are challenged. Certainly, America has the freest media in the world. No subject is taboo. No sacred cow is immune from criticism. But the paradox here is that the belief that American society allows every idea to be challenged has led Americans to assume that every idea is challenged. They have failed to notice when their minds have been enveloped in groupthink. Again, failure occurs when you do not conceive of failure.

The second systemic failure has been the erosion of the notion of individual responsibility. Here, too, an illusion is at work. Because they so firmly believe that their society rests on a culture of individual responsibility — rather than a culture of entitlement, like the social welfare states of Europe — Americans cannot see how their individual actions have undermined, rather than strengthened, their society. In their heart of hearts, many Americans believe that they are living up to the famous challenge of President John F. Kennedy, “Ask not what your country can do for you — ask what you can do for your country.” They believe that they give more than they take back from their own society.

There is a simple empirical test to see whether this is true: Do Americans pay more in taxes to the government than they receive in government services? The answer is clear. Apart from a few years during the Clinton administration, the United States has had many more federal budget deficits than surpluses — and the ostensibly more fiscally responsible Republicans are even guiltier of deficit financing than the Democrats.

The recently departed Bush administration left America with a national debt of more than $10 trillion, compared with the $5.7 trillion left by the Clinton administration. Because of this large debt burden, President Barack Obama has fewer bullets to fire as he faces the biggest national economic crisis in almost a century. The American population has taken away the ammunition he could have used, and left its leaders to pray that China and Japan will continue to buy U.S. Treasury bonds.

How did this happen? Americans have justified the erosion of individual responsibility by demonizing taxes. Every candidate for political office in America runs against taxes. No American politician — including President Obama — dares to tell the truth: that no modern society can function without significant taxes. In some cases, taxes do a lot of good. If Americans were to impose a $1 per gallon tax on gasoline (which they could easily afford), they would begin to solve many of their problems, reducing greenhouse-gas emissions, dependence on Middle East oil, and the production of fuel-inefficient cars and trucks.

The way Americans have dealt with the tax question shows that there is a sharp contradiction between their belief that their society rests on a culture of individual responsibility and the reality that it has been engulfed by a culture of individual irresponsibility. But beliefs are hard to change. Many American myths come from the Wild West era, when lone cowboys struggled and survived supposedly through individual ingenuity alone, without the help of the state. Americans continue to believe that they do not benefit from state support. The reality is that many do.

The third systemic failure of American society is its failure to see how the abuse of American power has created many of the problems the United States now confronts abroad. The best example is 9/11. Americans believe they were innocent victims of an evil attack by Osama bin Laden and Al Qaeda. And there can be no doubt that the victims of 9/11 were innocent. Yet Americans tend to forget the fact that Osama bin Laden and Al Qaeda were essentially created by U.S. policies. In short, a force launched by the United States came back to bite it.

During the Cold War, the United States was looking for a powerful weapon to destabilize the Soviet Union. It found it when it created a pan-Islamic force of mujahideen fighters, drawn from countries as diverse as Algeria and Indonesia, to roll back the Soviet invasion of Afghanistan after 1979. For a time, American interests and the interests of the Islamic world converged, and the fighters drove the Soviets out and contributed to the collapse of the Soviet Union. At the same time, however, America also awakened the sleeping dragon of Islamic solidarity.

Yet when the Cold War ended, America thoughtlessly disengaged from Afghanistan and the powerful Islamic forces it had supported there. To make matters worse, it switched its Middle East policy from a relatively evenhanded one on the Israel-Palestine issue to one heavily weighted toward the Israelis. Aaron David Miller, a longtime U.S. Middle East negotiator who served under both the Clinton and George W. Bush administrations (and is now a public-policy scholar at the Woodrow Wilson Center), wrote recently that both administrations “scrupulously” road-tested every idea and proposal with Israel before bringing it to the Palestinians.

Americans seem only barely aware of the pain and suffering of the Palestinian people, and the sympathy their plight stirs in the world’s 1.2 billion Muslims, who hold America responsible for the Palestinians’ condition. And tragically, in the long run, a conflict between six million Israelis and 1.2 billion Muslims would bring grief to Israel. Hence, Americans should seriously review their Middle East policies.

The Middle East is only one of many areas in which American policies have harmed the world. From U.S. cotton subsidies, which have hurt poor African farmers, to the invasion of Iraq; from Washington’s double standard on nuclear proliferation — calling on non-nuclear states to abide by the Nuclear Non-Proliferation Treaty while ignoring its own obligations — to its decision to walk away from the Kyoto Protocol without providing an alternate approach to global warming, many American policies have injured the 6.5 billion other people who inhabit the world.

Why aren’t Americans aware of this? The reason is that virtually all analysis by American intellectuals rests on the assumption that problems come from outside America and America provides only solutions. Yet the rest of the world can see clearly that American power has created many of the world’s major problems. American thinkers and policymakers cannot see this because they are engaged in an incestuous, self-referential, and self-congratulatory discourse. They have lost the ability to listen to other voices on the planet because they cannot conceive of the possibility that they are not already listening. But until they begin to open their ears, America’s problems with the world will continue.

It will not be easy for America to change course, because many of its problems have deep structural causes. To an outsider, it is plain to see that structural failures have developed in America’s governance, in its social contract, and in its response to globalization. Many Americans still cannot see this.

When Americans are asked to identify what makes them proudest of their society, they inevitably point to its democratic character. And there can be no doubt that America has the most successful democracy in the world. Yet it may also have some of the most corrupt governance in the world. The reason more Americans are not aware of this is that most of the corruption is legal.

In democracies, the role of government is to serve the public interest. Americans believe that they have a government “of the people, by the people, and for the people.” The reality is more complex. It looks more like a government “of the people, by special-interest groups, and for special-interest groups.” In the theory of democracy, corrupt and ineffective politicians are thrown out by elections. Yet the fact that more than 90 percent of incumbents who seek reelection to the U.S. House of Representatives are reelected provides a clear warning that all is not well. In The Audacity of Hope (2006), Barack Obama himself describes the corruption of the political system and the public’s low regard for politicians. “All of which leads to the conclusion that if we want anything to change in Washington, we’ll need to throw the rascals out. And yet year after year we keep the rascals right where they are, with the reelection rate for House members hovering at around 96 percent,” Obama writes. Why? “These days, almost every congressional district is drawn by the ruling party with computer-driven precision to ensure that a clear majority of Democrats or Republicans reside within its borders. Indeed, it’s not a stretch to say that most voters no longer choose their representatives; instead, representatives choose their voters.”

The net effect of this corruption is that American governmental institutions and processes are now designed to protect special interests rather than public interests. As the financial crisis has revealed with startling clarity, regulatory agencies such as the Securities and Exchange Commission and the Commodity Futures Trading Commission have been captured by the industries they are supposed to regulate. And when Congress opens the government’s purse, the benefits flow to special interests rather than the public interest. Few Americans are aware how severely special interests undermine their own national interests, both at home and abroad. The latest two world trade negotiating rounds (including the present Doha Round), for example, have been held hostage by the American agricultural lobbies. To protect 25,000 rich American cotton farmers, the United States has jeopardized the interests of the rest of the 6.8 billion people in the world.

Normally, a crisis provides a great opportunity to change course. Yet the current crisis has elicited tremendous delay, obfuscation, and pandering to special interests. From afar, America’s myopia is astounding and incomprehensible. When the stimulus packages of the Chinese and U.S. governments emerged at about the same time, I scanned American publications in search of attempts to compare the two measures. I could not find any. This confirmed my suspicion that American intellectuals and policymakers could not even conceive of the possibility that the Chinese effort may be smarter or better designed than the American one.

An even bigger structural failure that American society may face is the collapse of its social contract. The general assumption in the United States is that American society remains strong and cohesive because every citizen has an equal chance to succeed. Because most Americans believe they have had the same opportunity, there is little resentment when a Bill Gates or a Sergey Brin amasses a great fortune.

This ideal of equal opportunity is a useful national myth. But when the gap between myth and reality becomes too wide, the myth cannot be sustained. Today, research shows that social mobility in the United States has declined significantly. In the 2008 report The Measure of America, a research group, the American Human Development Project, notes that “the average income of the top fifth of U.S. households in 2006 was almost 15 times that of those in the lowest fifth — or $168,170 versus $11,352.” The researchers also observe that “social mobility is now less fluid in the United States than in other affluent nations. Indeed, a poor child born in Germany, France, Canada, or one of the Nordic countries has a better chance to join the middle class in adulthood than an American child born into similar circumstances.”

Behind these statistics are some harsh realities. Nearly one in five American children lives in poverty, and more than one in 13 lives in extreme poverty. African-American babies are more than twice as likely as white or Latino babies to die before reaching their first birthday. People in more than half a million households experience hunger, data from the U.S. Department of Agriculture indicate. The education system is both inegalitarian and ineffective. In a recent international assessment of subject-matter literacy in 57 countries, America’s 15-year-olds ranked 24th in mathematics and 17th in science. It should come as no surprise that though the United States ranks second among 177 countries in per capita income, it ranks only 12th in terms of human development.

More dangerously, many of those who have grown wealthy in the past few decades have added little of real economic value to society. Instead, they have created “financial weapons of mass destruction,” and now they continue to expect rich bonuses even after they delivered staggering losses. Their behavior demonstrates a remarkable decline of American values and, more important, the deterioration of the implicit social contract between the wealthy and the rest of society. It would be fatal for America if the wealthy classes were to lose the trust and confidence of the broader American body politic. But many of America’s wealthy cannot even conceive of this possibility. This explains why so few of the Richard Fulds and John Thains have apologized with any sincerity for the damage they have done.

America’s latest responses to globalization also reveal symptoms of a structural failure. Hitherto, Americans have been champions of globalization because they have believed that their own economy, the most competitive in the world, would naturally triumph as countries lowered their trade and tariff barriers. This belief has been an important force driving the world trading system toward greater openness.

Today, in a sign of great danger for the United States and for the world, the American people are losing confidence in their ability to compete with Chinese and Indian workers. More and more American politicians are jumping on the protectionist bandwagon (although almost all of them dishonestly claim they are not protectionists). Even the American intelligentsia is retreating from its once stout defense of free trade. Paul Krugman of Princeton and The New York Times, who won the Nobel Prize for Economics in 2008, showed which way the wind was blowing when he wrote, “It’s hard to avoid the conclusion that growing U.S. trade with Third World countries reduces the real wages of many and perhaps most workers in this country. And that reality makes the politics of trade very difficult.”

At the moment of their country’s greatest economic vulnerability in many decades, few Americans dare to speak the truth and say that the United States cannot retreat from globalization. Both the American people and the world would be worse off. However, as globalization and global capitalism create new forces of “creative destruction,” America will have to restructure its economy and society in order to compete. It will need to confront its enormously wasteful and inefficient health care policies and the deteriorating standards of its public education system. It must finally confront its economic failures as well, and stop rewarding them. If General Motors, Chrysler, and Ford cannot compete, it will be futile to protect them. They, too, have failed because they could not conceive of failure.

Every problem has a solution. This has always been the optimistic American view. It is just as true in bad times as in good times. But painful problems do not often have painless solutions. This is equally true of the current economic crisis. To deal with it, American leaders must add an important word when they speak the truth to the American people. The word is sacrifice. There can be no solution to America’s problems without sacrifice.

One paradox of the human condition is that the most logical point at which to undertake painful reform is in good times. The pain will be less then. But virtually no society, and especially no democratic society, can administer significant pain in good times. It takes a crisis to make change possible. Hence, there is a lot of wisdom in the principle, “never waste a crisis.”

Let me suggest for purely illustrative purposes three painful reforms the United States should consider now. The goal of these suggestions is to trigger a serious discussion of reform in American discourse.

First, there is a silver bullet that can dispel some of the doom and gloom enveloping the world and admit a little hope. And hope is what we need to get the economic wheels turning in the right direction. As Amartya Sen, another Nobel laureate in economics, said recently, “Once an economy is in the grip of pessimism, you cannot change it just by changing the objective circumstance, because the lack of confidence in people makes the economy almost unrescuable. You have to address the confidence thing, and that requires a different type of agenda than we have.” The completion of the Doha Round of world trade talks would go a long way toward restoring that confidence. The good news is that the deal is almost 95 percent cooked. But the last five percent is the most difficult.

One of the key obstacles to the completion of the Doha Round is the resistance of those 25,000 rich American cotton farmers. Millions of their poor West African counterparts will not accept a Doha Round agreement without a removal of the U.S. cotton subsidies that unfairly render their own crops uncompetitive. In both moral and rational terms, the decision should be obvious. The interests of the 6.8 billion people who will benefit from a successful Doha Round are more important than the interests of 25,000 American farmers. This handful of individuals should not be allowed to veto a global trade deal.

America’s rich cotton farmers are also in the best position to make a sacrifice. Collectively, they have received more than $3 billion a year in subsidies over the last eight years, a total of about $1 million each. If they cannot make a sacrifice, who in America can? Where is the American politician with the courage to say this?
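A back-of-the-envelope check of that figure, using only the numbers already given above ($3 billion a year, eight years, 25,000 farmers):

\[
\frac{\$3\ \text{billion/year} \times 8\ \text{years}}{25{,}000\ \text{farmers}} = \frac{\$24\ \text{billion}}{25{,}000} \approx \$960{,}000\ \text{per farmer},
\]

which is the rough $1 million apiece cited above.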

America has a second silver bullet it can use: a $1 per gallon tax on gasoline. To prevent the diversion of the resulting revenues into pork barrel projects, the money should be firewalled and used only to promote energy efficiency and address the challenge of climate change. Last year, the United States consumed more than 142 billion gallons of gas. Hence, even allowing for a change in consumption, a gas tax could easily raise more than $100 billion per year to address energy challenges.
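The revenue estimate can be sanity-checked the same way; the 25 percent drop in consumption assumed below is purely illustrative, not a figure from the essay:

\[
\$1/\text{gallon} \times 142\ \text{billion gallons} \approx \$142\ \text{billion per year}, \qquad 0.75 \times \$142\ \text{billion} \approx \$106\ \text{billion},
\]

so even a steep fall in driving would leave the tax raising well over $100 billion a year.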

This sounds like a painful sacrifice, one that America’s leaders can hardly conceive of asking, yet it is surprising that Americans did not complain when they effectively paid a tax of well over $1 per gallon to Saudi Arabia and other oil producers when oil prices surged last year. Then, the price at the pump was more than $4 a gallon. Today, with world oil prices hovering around only $40 a barrel, the price per gallon is around half its peak price. A $1 tax would still leave gas relatively cheap.

This brings me to the third silver bullet: Every American politician should declare that the long-term interests of the country are more important than his or her personal political career. As leaders, they should be prepared to make the ultimate political sacrifice in order to speak the truth: The time has come for Americans to spend less and work harder. This would be an extraordinary commitment for politicians anywhere in the world, but it is precisely politics as usual that led the United States to today’s debacle.

The latest budget presented to Congress by President Obama offers a great opportunity for change. Instead of tearing the budget apart in pursuit of narrow interests and larding it with provisions for special interests, Congress has the opportunity to help craft a rational plan to help people at the bottom, promote universal health care, and create incentives to enhance American competitiveness.

I know that such a rational budget is almost totally inconceivable to the American body politic. The American political system has become so badly clogged with special interests that it resembles a diseased heart. When an individual develops coronary blockages, he or she knows that the choices are massive surgery or a massive heart attack. The fact that the American body politic cannot conceive of the possibility that its clogged political arteries could lead to a catastrophic heart attack is an indication that American society cannot conceive of failure. And if you cannot conceive of failure, failure comes. ♥

[Kishore Mahbubani, dean of the Lee Kuan Yew School of Public Policy at the National University of Singapore, is the author most recently of The New Asian Hemisphere: The Irresistible Shift of Global Power to the East (2008). Mahbubani served as president of the United Nations Security Council (2001-2002). He received an undergraduate degree from the University of Singapore and a master's degree in philosophy from Dalhousie University. In addition, he was a fellow at the Center for International Affairs at Harvard University in 1991-92.]

Copyright © 2009 Wilson Quarterly

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

Tuesday, April 28, 2009

Whee! Waterboard Me Again, Dude!

Tom Tomorrow dissects the doublespeak and dissembling that fill the airwaves and the Op-Ed columns. The remains are disgusting and are a stain upon the national fabric. Send The Dubster, The Dickster, and all of their obedient subordinates to the gallows. If hanging was good enough for that torturer, Saddam Hussein, it is good enough for this lot. The reason for their torture program was to link Saddam to 9/11 by any means necessary and thereby justify the war with Iraq. Hanging the Bushies as Saddam was hanged, by the neck until dead, would close the circle and end this long national nightmare. If this is a (fair & balanced) call for the rope, so be it.

[x Salon]
TMW — Talking About Torture
By Tom Tomorrow (Dan Perkins)

Click on image to enlarge. ♥

Tom Tomorrow/Dan Perkins

[Dan Perkins is an editorial cartoonist better known by the pen name "Tom Tomorrow". His weekly comic strip, "This Modern World," which comments on current events from a strong liberal perspective, appears regularly in approximately 150 papers across the U.S., as well as on Salon and Working for Change. The strip debuted in 1990 in SF Weekly.

Perkins, a longtime resident of Brooklyn, New York, currently lives in Connecticut. He received the Robert F. Kennedy Award for Excellence in Journalism in both 1998 and 2002.

When he is not working on projects related to his comic strip, Perkins writes a daily political weblog, also entitled "This Modern World," which he began in December 2001.]

Copyright © 2009 Salon Media Group, Inc.

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

Oink! Redux

Wash your hands! The last Great Swine Flu Epidemic was a flop. This time...? If this is (fair & balanced) hysteria, so be it.

[x Salon]
The Last Great Swine Flu Epidemic
By Patrick Di Justo


There is evidence there will be a major flu epidemic this coming fall. The indication is that we will see a return of the 1918 flu virus that is the most virulent form of the flu. In 1918 a half million Americans died. The projections are that this virus will kill one million Americans in 1976.

— F. David Mathews, Secretary of Health, Education, and Welfare (February 1976)

In January 1976, 19-year old U.S. Army Private David Lewis, stationed at Fort Dix, joined his platoon on a 50-mile hike through the New Jersey snow. Lewis didn't have to go; he was suffering from flu and had been confined to his quarters by his unit's medical officer. Thirteen miles into the hike, Lewis collapsed and died a short time later of pneumonia caused by influenza. Because Lewis was young, generally healthy and should not have succumbed to the common flu, his death set off a cascade of uncertainty that confused the scientists, panicked the government and eventually embittered a public made distrustful of authority by Vietnam and Watergate.

This past Sunday, Homeland Security Secretary Janet Napolitano left open the possibility of a mass immunization program for the current outbreak of swine flu. If that happens, the Obama administration has a lot to learn from the debacle set in motion by Private Lewis' ill-fated hike.

Lewis was a victim of swine flu, a form of influenza endemic to pig populations. Influenza is caused by a virus, a microorganism that is mostly dead and partially alive. The virus' genetic code, held inside a protein sheath, consists of several helices of RNA. The virus injects its RNA into a healthy cell, which causes the cell to stop its usual work and make more copies of the virus. RNA genes mutate easily; for this reason, each new flu season brings a slightly different form of the disease into the population. Most year-to-year mutations bring little change to the virus, but for some still unknown reason, influenza seems to undergo a significant genetic change every ten years or so.

This major mutation results in a radically new strain of flu, one that races through a population because few people are immune to it. The dangerous influenza epidemics of 1938, 1947, 1957 (60,000 dead in the U.S.) and 1968 (the dreaded Hong Kong flu) fit this pattern. It was believed that swine flu, a particularly deadly form of the virus, had a 60-year mutation cycle that brought on worldwide pandemics, killing millions of people. Both the 10- and 60-year cycles were due to converge in the mid 1970s; Lewis' death in 1976 was thought to be the first instance of a new, incredibly lethal type of flu.

Doctors from the Centers for Disease Control tested Private Lewis' blood, and determined that his immune system had developed antibodies to a strain of flu similar to the Spanish influenza of 1918. That particular strain of swine flu produced the worst human pandemic of the 20th century: 1 billion sick in every country of the world, at least 22 million dead in the space of a few months. If Lewis had been exposed to something like the 1918 flu virus, the world could be in for an extensive and lethal outbreak. CDC doctors, charged with protecting the U.S. from epidemics, began to worry.

By the end of January, 155 soldiers at Fort Dix had tested positive for swine flu antibodies. None of the soldiers' families or co-workers, however, had been exposed to the virus; all of the reported swine flu cases had been limited to the soldiers in Private Lewis' camp. The virus wasn't spreading. For some reason this information did not mollify the doctors, and on February 14, 1976, the CDC issued a notice to all U.S. hospitals to be on the lookout for any cases of swine flu.

By March, the normal end of flu season, worldwide cases of all types of flu had diminished, and not one case of swine flu had been reported outside of Fort Dix. For some reason this news did not placate the doctors either, and on March 13, 1976, the director of the CDC asked Congress for money to develop and test enough swine flu vaccine to immunize at least 80 percent of the population of the United States, believed to be the minimum needed to avoid an epidemic.

1976 was the year of the U.S. Bicentennial. 1976 was a presidential election year. 1976 was two years after Watergate caused Nixon's resignation, and one year after the fall of Saigon. The U.S. government, both Republicans and Democrats, had never been held in such low esteem. Practically every elected official felt an overwhelming itch that patriotic year to do something to get the public thinking of them as good guys again. A swine flu pandemic was an opportunity on a plate. What better way to get into the good graces of the voters than to save them from a plague?

Between March 13 and March 24, the U.S. government dealt with the perceived flu emergency at fever pitch. The vaccine request went from the CDC to the secretary of HEW (Department of Health, Education and Welfare, the forerunner of today's Department of Health and Human Services), and reached the president's desk in less than a week. On March 24, the day after he lost the North Carolina primary to Ronald Reagan, President Gerald Ford welcomed the top virologists in the nation to a meeting in the White House and asked them if the nation was facing a swine flu epidemic. Would mass vaccinations be necessary? The doctors all said yes.

After the meeting, President Ford held a press conference with Jonas Salk and Albert Sabin, developers of the polio vaccine. The president heralded the impending flu plague and asked Congress for $135 million to investigate the development of a swine flu vaccine, with the goal of vaccinating the citizenry. This was probably the first time that most of the nation had heard of swine flu.

Congress, with few exceptions, raced to support the bill. Knowing the Republican president would not, could not veto a bill he requested, the Democratically controlled House attached $1.8 billion in welfare and environmental spending to the flu bill. President Ford signed the bill on April 15, 1976, and incorrectly remarked to the press that the Fort Dix swine flu was identical to the deadly 1918 variety. He announced the immunization program would begin in October.

The scientists began to come to their senses. By July, they were pretty much agreed that a flu pandemic in 1976 would not lead to 1 million U.S. dead. The flu strain extracted from Private Lewis, they learned, was much less virulent than the 1918 strain, and modern medicine could handle an outbreak far better than the World War I doctors could. The World Health Organization ordered hospitals to keep a global lookout for swine flu, but it did not request mass immunization of the population.

But the U.S. government was unstoppable. Congress began to pressure the drug companies to work faster toward development of a swine flu vaccine. The drug companies insisted that proper vaccine development required years of experimentation and clinical trials, and they were reluctant to develop and distribute an untested drug. The drug companies suggested that they could work faster if they were given immunity from lawsuits in the event something went wrong with the vaccine. Congress refused. The issue of legal liability remained at an impasse until August 2, 1976.

On that day, two members of the American Legion died of a strange respiratory disease they acquired at the Legion's convention in Philadelphia. Congress collectively freaked. Panicky news reports out of Philadelphia hinted that the deaths were the beginning of the Great Swine Flu Epidemic of 1976. On August 3, Congress agreed to completely indemnify the drug companies against any and all lawsuits they might incur as a result of the distribution of swine flu vaccine. The drug companies got to work.

On the same day, the CDC Disease Etiology Team sprang into action, and it had never performed better. On August 5, the head of the CDC was able to testify before Congress and announce conclusively that the Legionnaires had died of a new disease, a type of pneumonia that was definitely not swine flu. When Congress was informed that the dreaded epidemic had not started, they canceled their indemnification agreement with the drug companies. The drug companies announced that they would immediately cease development of swine flu vaccines. They also began to hint that even if they were to be re-indemnified, they now wanted Congress to guarantee them reasonable profits from the development of the vaccines.

President Ford went on television that night and delivered a speech to the nation, telling Americans that Congress would be to blame for their deaths when the flu season began in October. Congress caved in, and on August 15, President Ford signed the National Influenza Immunization Program (NIIP). This set as a goal the immunization of at least 80 percent of the U.S. population, indemnified the drug companies and left vague the government's power to limit the drug companies' profit. The drug companies got to work.

By September, the swine flu scaffolding came crashing down. Pollsters reported that while 93 percent of the population had heard of swine flu and knew it could cause a million U.S. deaths, only 52 percent planned to get immunized. The press was claiming that Congress had not done a good job of educating the public. Congress members blamed the failure on the CDC. The CDC was busy looking into the deaths of the Legionnaires; while they were able to say that the Legionnaires had not died of swine flu, they were unable to pin down exactly what had killed the men. The American Legion thought the whole thing was a Communist plot. Congressman John Murphy of Staten Island claimed the CDC was stalling on identifying Legionnaires' disease to panic people into fearing swine flu. Murphy demanded an investigation into the CDC and the indemnification deal made with the drug companies. The heroic miracle that was supposed to overhaul the government's image was rendered futile before it had started.

On October 1, 1976, the immunization program began. By October 11, approximately 40 million people had received swine flu immunizations, mostly through the new compressed air vaccination guns. That evening, in Pittsburgh, came the first blow to the immunization program: Three senior citizens died soon after receiving their swine flu shots. The media outcry, linking the deaths to the immunizations without any proof, was so loud it drew an on-air rebuke from CBS news anchor Walter Cronkite, who warned his colleagues of the dangers of post hoc ergo propter hoc ("after this, therefore, because of this") thinking. But it was too late. The government had long feared mass panic about swine flu — now they feared mass panic about the swine flu vaccinations.

The deaths in Pittsburgh, though proved not to be related to the vaccine, were a strong setback to the program. The death blow came a few weeks later when reports appeared of Guillain-Barré syndrome, a paralyzing neuromuscular disorder, among some people who had received swine flu immunizations. The public refused to trust a government-operated health program that killed old people and crippled young people; as a result, less than 33 percent of the population had been immunized by the end of 1976. The National Influenza Immunization Program was effectively halted on December 16.

Gerald Ford's attempt to gain credit for keeping America safe was busted. He lost the presidential election to Jimmy Carter that November. The 1976-1977 flu season was the most flu-free since records had been kept, a condition that was apparently unrelated to the vaccination program. The Great Swine Flu Epidemic of 1976 never took place. ♥

[Patrick Di Justo is a contributing editor at Wired magazine; he is also a freelance writer for New York magazine, Scientific American, and Worldchanging: A User's Guide for the 21st Century. Di Justo received a BA in journalism from the College of Mount Saint Vincent and an MS in Multimedia Design from Columbia University.]

Copyright © 2008 Salon Media Group, Inc.

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

Monday, April 27, 2009

My Favorite Butcher Was A Real Cutup!

The Texas barbeque aficionado of our day is Wyatt McSpadden, a talented photographer, born in Amarillo and transplanted to Austin. In his recent book, Texas BBQ, McSpadden proclaims that his favorite butcher was Harold Hines, who worked in the McSpadden family grocery store in Amarillo in the 1960s. Hines prepared barbeque in the meat department on a few Saturdays each month, and McSpadden was smitten. In his own bedraggled past, this blogger marveled at a butcher who worked in the grocery store owned by the blogger's maternal grandparents. The old fellow's name was Otto Osler and he was old-school. My managerial grandmother constantly snarled about Otto's less-than-spotless apron and the chewing tobacco in his cheek. For Wyatt McSpadden, the spirit of Harold Hines lives in Texas barbeque. For this blogger, the spirit of Otto Osler lives there, too. If this is (fair & balanced) nostalgia, so be it.

[x Texas Monthly]
Holy Smoke
By Wyatt McSpadden


I grew up in a grocery store in Amarillo. My dad and his brother took over Central Grocery, the family business, from their father when they returned from World War II. In 1962, when I was ten years old, I started going to work with Dad on Saturdays. I carried around a milk crate to stand on so I could work produce or bag groceries, my apron rolled up so I wouldn’t trip on it. The store was a marvelous place for a little kid, but the best part, the heart of it, was the meat market. Central Grocery was known around town for its fine meats, and the star of the operation was the butcher. The butcher was special: He didn’t sack groceries, run the register, trim the lettuce, or stock the shelves. The meat market was off-limits to me—its floors were slick, the knives were sharp, and the butcher was not to be disturbed.

One butcher stands out in my memory. I was in my early teens when he came to us. His name was Harold Hines, and everyone said he was a really good butcher; I can’t say if that was true or not, but I do know he was friendlier with the customers and with me than most of his predecessors had been. What really set him apart, though, was that once or twice a month, on a Saturday, Harold would make barbecue. That’s what he called it, although it was something he prepared in a big electric slow cooker in the meat market. I loved it—the smell that filled the store, the little cardboard bowls he’d give me to sample from, the soda crackers and longhorn cheese. The aroma drew folks to the market. Harold was a star.

For the past fifteen years, as I’ve traveled around Texas looking for barbecue places to photograph, my inner sack boy has been reawakened. Some of the places I visited weren’t so different from Dad’s store: Prause Meat Market, in La Grange, with the beautiful Friedrich refrigerated cases made in 1952, the year I was born; Gonzales Food Market, where co-owner Rene Lopez Garza still wears an apron and paper hat from a bread company; Dozier’s Grocery, in Fulshear, with a modest display of canned goods and paper products standing between the entrance and one of the most eye-popping selections of meats I’ve ever seen. What these places have that Central Grocery didn’t is real wood-smoked barbecue, made out back in brick pits fueled by post oak, pecan, and mesquite.

Many of the places I visited started out as meat markets. Kreuz Market, in Lockhart, and City Market, in Luling, still display vestiges of their butcher-shop heritage, even though these days the house specialty is smoked meats. Over time the process of creating pit barbecue has transformed such modest spots into magical places. The smoke and heat have penetrated the walls and the people who toil within them. Part of the magic is in the food; part is the fact that I was always made to feel welcome. Whether I called in advance or dropped in unannounced, I was, without exception, free to shoot whatever interested me. These pictures are my thank-you to all the wonderful folks, all the Harolds, who let me inside their lives and who took the time to stand for me or open a pit or stoke up a flame. ♥

[From Texas BBQ, by Wyatt McSpadden. Copyright © 2009 The University of Texas Press. McSpadden narrates a slide show of images of some of the state’s best barbecue joints, pitmasters, and finger-lickin’ meats from his new book, Texas BBQ, at this Texas Monthly link: Smokin'.]

Copyright © 2009 Emmis Publishing

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

Shed No Tiers For Me

The latest version of Texas envy is grounded in the fact that California (sneer) has 9 Tier I universities and New York (double-sneer) has 7 such super-schools. Texas has only 3: UT-Austin, Texas A&M University, and Rice University.

The United States has 51 Tier I universities, ranging from Harvard University to the University of California at Berkeley. This year, in the Lone Star State, the cry has gone up for more Tier I universities because the two public super-schools in Texas are not located in the major urban centers; Houston, San Antonio, and Dallas want public Tier I institutions to call their own. Of course, the invidious comparisons of Texas to California and New York prompt the Texas urge to surpass those inferior states. A few years ago, the University of Texas sought to wrest the contract to manage the Los Alamos National Laboratory (NM) from the University of California. The contract remained with the University of California. Ouch. More recently, the nanotechnology research center that had been founded at UT-Austin was snatched away by the State University of New York at Albany. Double-Ouch. It's enough to bring tiers to a Texan's eyes.

Interestingly, an academic at Columbia University (one of the 7 New York Tier I schools) offers a radical alternative: end the university as we know it. That would render the Tier I issue DOA. If this is (fair & balanced) academic heresy, so be it.

[x NY Fishwrap]
End The University As We Know It
By Mark C. Taylor

Graduate education is the Detroit of higher learning. Most graduate programs in American universities produce a product for which there is no market (candidates for teaching positions that do not exist) and develop skills for which there is diminishing demand (research in subfields within subfields and publication in journals read by no one other than a few like-minded colleagues), all at a rapidly rising cost (sometimes well over $100,000 in student loans).

Widespread hiring freezes and layoffs have brought these problems into sharp relief now. But our graduate system has been in crisis for decades, and the seeds of this crisis go as far back as the formation of modern universities. Kant, in his 1798 work The Conflict of the Faculties, wrote that universities should “handle the entire content of learning by mass production, so to speak, by a division of labor, so that for every branch of the sciences there would be a public teacher or professor appointed as its trustee.”

Unfortunately this mass-production university model has led to separation where there ought to be collaboration and to ever-increasing specialization. In my own religion department, for example, we have ten faculty members, working in eight subfields, with little overlap. And as departments fragment, research and publication become more and more about less and less. Each academic becomes the trustee not of a branch of the sciences, but of limited knowledge that all too often is irrelevant for genuinely important problems. A colleague recently boasted to me that his best student was doing his dissertation on how the medieval theologian Duns Scotus used citations.

The emphasis on narrow scholarship also encourages an educational system that has become a process of cloning. Faculty members cultivate those students whose futures they envision as identical to their own pasts, even though their tenures will stand in the way of these students having futures as full professors.

The dirty secret of higher education is that without underpaid graduate students to help in laboratories and with teaching, universities couldn’t conduct research or even instruct their growing undergraduate populations. That’s one of the main reasons we still encourage people to enroll in doctoral programs. It is simply cheaper to provide graduate students with modest stipends and adjuncts with as little as $5,000 a course — with no benefits — than it is to hire full-time professors.

In other words, young people enroll in graduate programs, work hard for subsistence pay and assume huge debt burdens, all because of the illusory promise of faculty appointments. But their economical presence, coupled with the intransigence of tenure, ensures that there will always be too many candidates for too few openings.

The other obstacle to change is that colleges and universities are self-regulating or, in academic parlance, governed by peer review. While trustees and administrations theoretically have some oversight responsibility, in practice, departments operate independently. To complicate matters further, once a faculty member has been granted tenure he is functionally autonomous. Many academics who cry out for the regulation of financial markets vehemently oppose it in their own departments.

If American higher education is to thrive in the 21st century, colleges and universities, like Wall Street and Detroit, must be rigorously regulated and completely restructured. The long process to make higher learning more agile, adaptive and imaginative can begin with six major steps:

1. Restructure the curriculum, beginning with graduate programs and proceeding as quickly as possible to undergraduate programs. The division-of-labor model of separate departments is obsolete and must be replaced with a curriculum structured like a web or complex adaptive network. Responsible teaching and scholarship must become cross-disciplinary and cross-cultural.

Just a few weeks ago, I attended a meeting of political scientists who had gathered to discuss why international relations theory had never considered the role of religion in society. Given the state of the world today, this is a significant oversight. There can be no adequate understanding of the most important issues we face when disciplines are cloistered from one another and operate on their own premises.

It would be far more effective to bring together people working on questions of religion, politics, history, economics, anthropology, sociology, literature, art and philosophy to engage in comparative analysis of common problems. As the curriculum is restructured, fields of inquiry and methods of investigation will be transformed.

2. Abolish permanent departments, even for undergraduate education, and create problem-focused programs. These constantly evolving programs would have sunset clauses, and every seven years each one should be evaluated and either abolished, continued or significantly changed. It is possible to imagine a broad range of topics around which such zones of inquiry could be organized: Mind, Body, Law, Information, Networks, Language, Space, Time, Media, Money, Life and Water.

Consider, for example, a Water program. In the coming decades, water will become a more pressing problem than oil, and the quantity, quality and distribution of water will pose significant scientific, technological and ecological difficulties as well as serious political and economic challenges. These vexing practical problems cannot be adequately addressed without also considering important philosophical, religious and ethical issues. After all, beliefs shape practices as much as practices shape beliefs.

A Water program would bring together people in the humanities, arts, social and natural sciences with representatives from professional schools like medicine, law, business, engineering, social work, theology and architecture. Through the intersection of multiple perspectives and approaches, new theoretical insights will develop and unexpected practical solutions will emerge.

3. Increase collaboration among institutions. Not all institutions need to do all things, and technology makes it possible for schools to form partnerships to share students and faculty. Institutions will be able to expand while contracting. Let one college have a strong department in French, for example, and the other a strong department in German; through teleconferencing and the Internet both subjects can be taught at both places with half the staff. With these tools, I have already team-taught semester-long seminars in real time at the Universities of Helsinki and Melbourne.

4. Transform the traditional dissertation. In the arts and humanities, where looming cutbacks will be most devastating, there is no longer a market for books modeled on the medieval dissertation, with more footnotes than text. As financial pressures on university presses continue to mount, publication of dissertations, and with it scholarly certification, is almost impossible. (The average university press print run of a dissertation that has been converted into a book is less than 500, and sales are usually considerably lower.) For many years, I have taught undergraduate courses in which students do not write traditional papers but develop analytic treatments in formats from hypertext and Web sites to films and video games. Graduate students should likewise be encouraged to produce “theses” in alternative formats.

5. Expand the range of professional options for graduate students. Most graduate students will never hold the kind of job for which they are being trained. It is, therefore, necessary to help them prepare for work in fields other than higher education. The exposure to new approaches and different cultures and the consideration of real-life issues will prepare students for jobs at businesses and nonprofit organizations. Moreover, the knowledge and skills they will cultivate in the new universities will enable them to adapt to a constantly changing world.

6. Impose mandatory retirement and abolish tenure. Initially intended to protect academic freedom, tenure has resulted in institutions with little turnover and professors impervious to change. After all, once tenure has been granted, there is no leverage to encourage a professor to continue to develop professionally or to require him or her to assume responsibilities like administration and student advising. Tenure should be replaced with seven-year contracts, which, like the programs in which faculty teach, can be terminated or renewed. This policy would enable colleges and universities to reward researchers, scholars and teachers who continue to evolve and remain productive while also making room for young people with new ideas and skills.

For many years, I have told students, “Do not do what I do; rather, take whatever I have to offer and do with it what I could never imagine doing and then come back and tell me about it.” My hope is that colleges and universities will be shaken out of their complacency and will open academia to a future we cannot conceive. ♥

[Mark C. Taylor, the chairman of the religion department at Columbia, is the author of the forthcoming Field Notes From Elsewhere: Reflections on Dying and Living. Taylor received a BA from Wesleyan University and a PhD in religion from Harvard University, and he began teaching at Williams College in 1973. In 1981 he became the first foreigner to be awarded the doktorgrad in philosophy in the 500-year history of the University of Copenhagen. In 2007 he moved from Williams College to Columbia University.]


Copyright © 2009 The New York Times Company

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves