Monday, February 02, 2009

Once Upon A Time In The West....

On November 20, 2008, this blog quoted the late, great Richard Hofstadter:

"By myth..., I do not mean an idea that is simply false, but rather one that so effectively embodies men's values that it profoundly influences their way of perceiving reality and hence their behavior. In this sense, myths may have varying degrees of fiction or reality...."

Saint Richard was talking about the "agrarian myth" of the sturdy yeoman farmer, but he easily could have been talking about the "frontier myth" of the incorruptible loner who rides off into the sunset before the temptations of the town can compromise him. "Come back, Shane!" cried the boy in the 1953 Western film of the same name. Of all of our myths, the "frontier myth" may be the most powerful. In the past presidential election, the unsuccessful candidates labeled themselves mavericks. The appeal was to the "frontier myth" of unconventional sorts of candidates (from the "frontier states" of Arizona and Alaska) who did not run with the crowd. If this is a (fair & balanced) traditional story accepted as history, so be it.

[x The American Scholar]
The Future Of The American Frontier
By John Tirman


The presidential campaign of 2008 will be recalled for many firsts: the first African-American presidential nominee, the near-miss campaigns of Hillary Clinton and Sarah Palin, the record spending and record turnout. But what was not new was its reliance on a very old standard of American political culture, the frontier myth. Perhaps no other set of ideas about America is more powerful politically, and the two autumn campaigns were reverential in their implicit bow to, or explicit exploitation of, the dense complex of frontier images and values attached to the American experience.

The limitless possibilities of the American dream, the expansion of American values, the national effort to tame faraway places, the promise of a bounty just over the horizon, and the essential virtue of the American people who explore and settle these frontiers—all of these tropes fortified the hopes of the campaigns to situate their candidate in the company of legendary pioneers. It is a testament to the power of this myth that it grips us still—its self-gratifying qualities having ensured its long lineage—even as the actual frontier of American action is swiftly closing. A century ago, the closure of the continental frontier obsessed politicians and intellectuals alike. Today, when the global frontier is closing, our political leaders have little sense of its significance.

Instead, the run for the White House recycled the frontier myth with scarcely a nod to its growing irrelevance. The Republican ticket, representing Western frontier states, was exemplary in this regard. John McCain’s credentials as a genuine hero were much in play. In frontier mythology, the hero is central to how we understand the tasks of taming the wilderness and extracting its bounty, and from Andrew Jackson to George Armstrong Custer to Jimmy Doolittle, the American hero has often been a warrior. That burnishing fact of McCain’s career was front and center in the political campaign. His self-description as a “maverick” glosses the hero status neatly, because the hero in our national narrative is typically the loner seeking justice. He repeatedly called himself a Teddy Roosevelt Republican, invoking one of the icons of the frontier myth, a self-made hero if there ever was one. And in his campaign he recycled one of the sacred phrases political leaders like to use to underscore their commitment to America’s unique greatness—John Winthrop’s line from Matthew that we are “as a City upon a Hill,” an exemplar for all the world.

The maverick hero was joined on the ticket around Labor Day by Alaska Governor Sarah Palin, who was introduced as yet another maverick and a frontier mother who hunts and can “field dress” a moose. Much was made of this, both sarcastically and triumphantly, but the direct embrace of the frontier myth was unmistakable and instantly popular. “The gun-toting Sarah Palin is like Annie Oakley,” exulted Camille Paglia, “a brash ambassador from America’s pioneer past.” One conservative blogger called her “a Western frontier version of Thatcher.” In viewing the giddy Palin debut, one reference that came to mind was historian David W. Noble’s depiction of “timeless space” as a treasured American perspective—the absence of confining histories, cultures, or mores, combined with the limitless American landscape. Alaska self-consciously conveys those qualities, considering itself a residual frontier, and the many exciting possibilities of that frontier were rejuvenated in the person of Alaska’s governor.

The Democratic ticket’s claim on frontier values was less obvious. Barack Obama invoked John F. Kennedy, Harry Truman, and Franklin D. Roosevelt as paragons of a global leadership that must be renewed, implicitly assuming that the whole world is our rightful domain of action. In this, he is in the internationalist tradition that seeks to promote American values, missionary-like, to a grateful world. As an Illinois lawyer-politician and as an African American, he is readily associated with Lincoln as frontier hero and liberator of the slaves. In his manner and education, he has often been compared with Kennedy, the new frontiersman. Obama’s intriguing personal journey is that of a lone truth seeker on a quest (common to all heroes), in this world but somehow always elevated above the mundane, an American Odysseus. His rapid rise to national prominence has been built on the irrational hope of his supporters that he can singlehandedly transform politics and the world, and indeed he was lampooned on the right as a Christ poseur.

What is striking about these candidates is the authenticity of their credentials. McCain’s heroism is evident in his gruesome captivity narrative, replete with cycles of courage and weakness. Obama scaled heights never before ascended by a black American, overcoming obstinate racism and xenophobia as the Herculean labors of a new epic. (Compare these two with the would-be cowboys Ronald Reagan and George W. Bush clearing brush from their ranches.) These truly heroic images are among the reasons why the campaign was fought so fiercely.

As the 2008 election shows, we can’t escape the frontier, even if the frontier has escaped us.

Why is the frontier myth losing its relevance? When the continental frontier closed—when the last indigenous tribes were subdued and the land taken—it created a sense of crisis in American politics. The answer to that crisis was to look outward, across oceans, to imagine frontiers to conquer abroad. Much of the ensuing century has involved America on such global frontiers. But now that frontier is also closing, as our capacity to treat the world like a virgin terrain diminishes, and the question it stirs is What next? What frontier, if any?

The cultural theorist Richard Slotkin describes the myth of the frontier as “the conquest of the wilderness and the subjugation or displacement of the Native Americans . . . the means to our achievement of a national identity, a democratic polity, an ever-expanding economy, and a phenomenally dynamic and ‘progressive’ civilization.” This conquest, he explains, was not only pursued for its own tangible rewards—security, land, and riches—but for and by a morally cleansing series of “savage wars” that conveyed upon the pioneers a “regeneration through violence.” It was at the frontier, where civilization confronted wilderness, that American values were forged. The frontier provided abundance for those courageous enough to seize it, in contrast to the scarcity and squalor and discontent common in cities in the East. The frontier myth braced and was braced by individualism, Social Darwinism, Manifest Destiny, and similar traditions of American ideology, and has been endlessly replayed and elaborated through the cultural power of novels, films, and journalism.

While not always recognized for what it is, it informs our foreign policy, our sense of place, and our purpose on this planet.

The world as an American frontier was a new idea when Theodore Roosevelt, Woodrow Wilson, and a few other intellectuals assayed the closing of the continental frontier. Roosevelt was a central figure in this realization. His lament about the closing frontier drew on an essentially racialist notion of how Americans—or Americans of a certain heroic class—subdued the savages and thereby burnished their own virile qualities and moral capacity to lead. The historian Frederick Jackson Turner promoted the more palatable idea that democratic self-reliance was a consequence of the American frontier experience, and that the closing of the frontier (which the Census Bureau proclaimed in 1891) was a threat to American democratic virtue. The frontier had also provided the United States a safety valve for development, unlike Europe, where socialism and class antagonism marred the political landscape. The economic stagnation America was experiencing in the 1890s, after a heady period of economic expansion, was one alarm ringing through all the thinking about the frontier and its legacy.

If the end of the North American frontier was a crisis for democratic and manly virtue, Roosevelt and Turner had an answer: extend the frontier elsewhere. Long before the USS Maine was blown up in Havana harbor, Roosevelt advocated war with Spain, which brought the Philippines into the new American empire and provided Roosevelt with the "savage war" and Asian foothold that were meant as an antidote to the frontier's demise in North America.

Woodrow Wilson was less bombastic but no less committed to the extension of the American idea. “The spaces of their own continent were occupied and reduced to the uses of civilization; they had no frontiers wherewith ‘to satisfy the feet of the young men,’” he wrote in A History of the American People. “These new frontiers in the Indies and on the Far Pacific came to them as if out of the very necessity of the new career before them.” In the White House, from which Roosevelt suppressed the Philippines rebellion and built the Panama Canal, both with a high human toll, Wilson invaded Mexico, Nicaragua, the Dominican Republic, and Haiti before entering World War I. All of these actions undertaken on behalf of democratic ideals prefaced his attempts to make the world safe for democracy. While he was, in contrast to Roosevelt, increasingly anti-imperialist, he was no less expansionist—in one historian’s words, the “very model of Turner’s crusading democrat.”

The myth has been remarkably resilient. Not only did it inform American expansion globally during the presidencies of FDR and Truman, but the uncertainties posed by the Cold War (which used cowboys-and-Indians iconography time and again), the nuclear arms race, and subsequent crises of confidence (particularly urban crime, oil price explosions, the 1979 hostage taking in Tehran, and the 9/11 attacks) led to the embrace in popular culture and politics of the comforting narrative of civilization versus savages. The myth remains vibrant, but the frontier itself is disappearing again.

The end of the Cold War was the first sign that the global frontier was closing. The superpower standoff formed much of the United States’ identity in that phase of our global involvement, and its power explains our failure to construct a successor to that form of engagement. The “twilight struggle” with Soviet communism still shapes how we structure foreign relations, institutions, military doctrine, public diplomacy, and our sense of self-worth. It was a colossal, Manichaean contest, much like the one the pioneers experienced as they cleared and settled the continent. The anti-communist campaigns, which began internally as long ago as Wilson’s intervention against the Bolsheviks from 1918 to 1920, resulted in dozens of military interventions, CIA covert operations, and lavish support for anti-communist regimes. This pattern was nourished by the depiction of communists as a threat to civilization. The conclusion of the U.S.-Soviet rivalry nearly 20 years ago thereby drained American globalism of a paramount ideology—a way of seeing ourselves in the world—and the supposed vitality that came with the waging of “savage wars” in Africa, Latin America, and Asia. It is with difficulty that we let go. That the war on terrorism closely followed, and invoked this warrior myth—the fight for Western values against barely human and wholly alien “hostiles”—should come as no surprise, since it evinces a purpose built by the Puritans and renewed throughout our history.

In the aftermath of the attacks of September 11, 2001, America instinctively reverted to the old category of a battle for civilization’s soul. Susan Faludi, in The Terror Dream: Fear and Fantasy in Post-9/11 America, incisively applies Slotkin’s framework to this rapid mobilization for a “war on terrorism,” especially the regeneration through violence for the heroic men of America. This battle intoxicated the nation for a time, but the scale, threat, and results look paltry in the shadow of previous warrior epics.

So while the ennobling and rewarding savage wars of the anti-communist frontier are diminishing, that pattern of mobilization and intervention has simply been imitated, with relatively little retooling, in the war against small and scattered gangs of Muslim extremists. This mimicry is likely to fail. The menace of would-be shoe bombers and a few restive Muslims in faraway and desolate places pales before the thousands of nuclear weapons that were aimed at us by the Soviets, the millions killed in Korea and Vietnam, and the totalitarianism of Stalin or Mao. The relentless invocation of every soldier or firefighter as a hero dilutes the essential mythic heroism once reserved for a Boone or a Crockett or a Lindbergh. As in Vietnam, moreover, the “Indians” are not so easily subdued, and the costly setbacks of the anti-terrorism campaigns are stirring a growing distaste for savage wars.

The end of the global frontier is also evident in its diminishing bounty. A primary cause of the imperialistic urge of the 1890s was the perceived need to export American products to sustain or increase production domestically and to relieve labor agitation. Such a boom in exports followed, enabled by natural resources and agricultural production. But the U.S. trade situation turned sour in the 1970s and has continued to deteriorate ever since. The decline is precipitous. In 1992, the trade deficit was $50 billion. In 2007, in constant dollars, it was $730 billion. As a percentage of all economic output, exports did not exceed the levels of 1900 until the 1990s, and by then imports were outpacing exports.

At the same time, income has stagnated for three decades for all but the wealthy in America—a direct slap at one of the tenets of the frontier myth, that expansion would lessen unequal distribution in the American economy. “The bonanza frontier offers the prospect of immediate and impressive economic benefit for a relatively low capital outlay,” Slotkin writes in Gunfighter Nation (1992), and “bonanza profits derive from the opportunity to acquire or produce at low cost some commodity that has a high commercial value.” In the 19th century, the bonanza was gold and land; in the 20th-century global frontier, it was oil and other minerals, financial products, and cheap goods from abroad.

The dismal performance of the global economic empire is often attributed to the nationalization of oil assets in OPEC countries, but even when oil prices were low in the 1980s and ’90s, the U.S. trade balance and personal income statistics were deteriorating. The declines have come during the period of insistence on free markets in the developing world (another modern-day equivalent of bonanza economics), a doctrine that proved ineffective if not disastrous for those countries over the last quarter century. The free market is attractive in theory, but when pitting transnational corporations against small developing countries it becomes an arena of economic predation. At the same time, rivals for economic dominance, including the European Union, Japan, China, India, Russia, and others, are crowding out U.S. control of markets and resources, a trend that is accelerating. The expansion on this continent was made possible by pushing out the British, French, Spanish, and Mexicans, and by eliminating the indigenous tribes, but this is no longer feasible in the global frontier.

The 2008 crisis in America’s mastery of global finance signaled another sharp reversal. In the midst of the market turbulence that shook Wall Street and foreign markets, German Finance Minister Peer Steinbrück proclaimed that “the United States will lose its status as the superpower of the global financial system. The global financial system will become multipolar” and use a more diversified basket of currencies, undermining one of the last symbols of America’s economic strength—the dollar. It was a sentiment widely echoed throughout the capitals of the world.

The most important reason for the closing frontier, however, is the limits of the earth itself, the biological capacity that is now diminishing with frightening speed. This is a consequence of the “taming of the wilderness,” which has certainly been tamed and is now wreaking its revenge. The longstanding notion that resources were ours for the taking, and for using promiscuously, is no longer viable. The closing of this frontier not only impedes economic growth built on this attitude (the engines fueled by cheap oil in particular), but has other costs as well—the agricultural, health, and safety challenges of rapid climate change, among many others.

The depletion of earth’s resources and the climate change that results from profligate consumption of those resources are well established now among scientists. The Washington reaction to this is right out of the frontier-myth playbook, however, and indeed is reminiscent of the debate that surrounded the onset of outward expansion of a century ago. Then, as now, the anti-imperialists were condemned as elitists and weak willed, people attempting to impede America’s God-given right to take our mission to the rest of the world. Today, the very modest proposals for arresting carbon emissions, for example, are derided by many proponents of big business as part of the global warming “hoax” that seeks to deprive Americans of economic growth and unbridled consumption. The intemperate quality of the attacks signals that a deep chord has been touched, the belief in the ever-expanding frontier that is pioneered and settled by Americans. The deterioration of the earth’s ecosystem was rarely mentioned in the 2008 campaign.

The war in Iraq illustrates how these three phenomena converge. It was fought in part to fulfill the new imperatives of the war on terrorism, and it was a war, so thought the Bush advisers, that we knew how to fight—armored divisions, air power, command and control, and so on, reflecting Cold War preparations. The mission (apart from the alleged nuclear, chemical, and biological weapons) harkens back to the “civilizing” impulse of Roosevelt and Wilson and displays all the racial typing of the natives, and callousness toward them, that marred U.S. interventions in the Philippines, Vietnam, and Latin America. The “bonanza” is the promise of oil, and the control of oil pricing worldwide. With its predecessor, Operation Desert Storm, Operation Iraqi Freedom signals how American consumption has led directly to large-scale resource wars, this one now 18 years in duration. An air of desperation clings to the war, as the mismatch of expectations and outcomes becomes ever more apparent, and as the inability of the United States to treat the world as its virgin domain is exposed.

Given these odious consequences, what is the future of the frontier and its myth? The reflexive answer is to discard it altogether as a guiding set of values. The frontier metaphor imparts ideas of American exceptionalism and the moral right to resources, cultural superiority, and limitlessness in all things we choose to do. If there are no limits, there is no need for common struggle. If the world is our oyster, there is no need for restrictive rules and regulations, for lowering expectations. Four hundred years of this ideology—fostered and promoted by church and state, the news media, schools, and popular culture generally—has nurtured this exceptionalism that feeds arrogance and wastefulness and war.

But the myth is resilient. The alternative is to reinvent it, to co-opt, in effect, frontier symbolism from its destructive tendencies and transform it into something more vital. Many leaders have attempted to use the frontier metaphor as a way of launching ideas for reform or renewal, invoking, for example, “the war on” campaigns—the war on poverty, the war on drugs, the war on cancer—which draw on the conflict and moral struggle that played such a central part on the frontier. Some of the discourse about globalization today uses concepts similar to the frontier ideology: both the “clash of civilizations” (from Samuel Huntington) and the more piquant “clash of globalizations” (from Stanley Hoffmann) grapple with American-led cultural, political, and economic change and the conflicts and bonanzas they may be encountering or inducing. Yet very few political or opinion elites recognize the frontier myth—the restless urge to expand and to dominate—as the root and branch of our self-defined global role. Thus very few have tried to alter its course and meaning.

The most intriguing attempt to harness the myth in recent memory was John F. Kennedy’s New Frontier, which was the core concept in his acceptance speech as the Democratic Party’s nominee and throughout his 1960 campaign. He recalled the past in the conventional way—the pioneers who settled the American West “were not the captives of their own doubts, nor the prisoners of their own price tags,” he told the convention. “They were determined to make the new world strong and free—an example to the world, to overcome its hazards and its hardships, to conquer the enemies that threatened from within and without.” But then he went on with a more interesting twist:

Some would say that those struggles are all over, that all the horizons have been explored, that all the battles have been won, that there is no longer an American frontier. . . . Beyond that frontier are uncharted areas of science and space, unsolved problems of peace and war, unconquered problems of ignorance and prejudice, unanswered questions of poverty and surplus. It would be easier to shrink from that new frontier, to look to the safe mediocrity of the past, to be lulled by good intentions and high rhetoric. . . . I believe that the times require imagination and courage and perseverance. I’m asking each of you to be pioneers towards that New Frontier.

Kennedy still used the older mythic call as a “race for mastery of the sky . . . , the ocean . . . , the far side of space, and the inside of men’s minds,” but the notion that the frontier was not geographical or spatial, but one of applied knowledge and of human relations, was an innovation and one that has not been surpassed. That Kennedy and his cohort did not live up to this new inflection of the frontier myth scarcely needs noting, but the rhetorical framing of a new kind of frontier, a half century later, might have finally met its moment.

Using the metaphor as a way of galvanizing both the public and our political leaders to adopt new challenges—challenges to be explored and tamed, from which public good can be extracted—may be more plausible given what we now can see about global limits. The need to arrest climate change with sustainable development is just such a challenge, one that must broadly mobilize society. How to reshape our politics to confront this challenge is not a problem with an obvious solution. The frontiers of science or knowledge are hoary notions, but as a counterpoint to the decaying frontier myth, they possess renewed vibrancy—and are especially potent if linked to the new mission as a heroic feat. The hero is the human exponent of the frontier myth, and all heroes embody qualities that speak to the anxieties of the age. Self-sacrifice, an innate sense of purpose, physical or intellectual prowess, and a willingness to confront the dangers of the frontier—all are qualities of the hero.

Meeting the environmental challenge requires more than colossal investments in science and intensive diplomacy; it mandates a shift in the way we think about U.S. goals, our range of action, and our commitment to values beyond self-enrichment. It requires collective, heroic action, the kind that can move a society in times of peril. And it requires a new lens on the world, one that sees in developing countries not bounty but common needs and aspirations. The environmental crisis binds us globally in ways that no previous cataclysm ever has—not war, not epidemics, not other natural disasters. If the oil addiction of the industrial countries is not reversed soon, the resource wars we have suffered already will intensify along with the choking effects on air and oceans. If China and India do not reduce their rate of growth in carbon emissions, the earth’s ecosystem will be dangerously degraded. If Brazilian rainforests continue to be mowed down, we lose precious and possibly irreplaceable sources of oxygen to refresh the atmosphere. If sustainable development cannot be fostered in Mexico and Africa and the Middle East, the migrations to the industrial world will induce intolerable social and economic stress. These are collective problems by dint of their inexorably collective outcomes. And in this, the world now differs radically from the one that was merely a frontier for exploitation.

When we look to the three signals of how the frontier has closed—the warrior ethos, bonanza economics, and environmental limits—it is apparent that all three are equally culpable and equally important to a transformative politics. Fortunately, the dominant myth of the frontier is not the only distinctly American modus vivendi, as leaders as far apart in time as John Winthrop and John Kennedy demonstrate. Our political and cultural leaders today, however, have rarely hinted at the imperative to reconstruct our mental architecture of the world and our place in it. If the world is essentially regarded as a font of anti-American terrorism or rivalry, as a social, political, and physical wilderness to be tamed, then we will be battling in the diminishing space our old habits have forced us into. That frontier is closing. The daunting but necessary task of redefining our horizons is upon us.

Where to start? Perhaps at the beginning. Winthrop’s line from his 1630 sermon, “we shall be as a City upon a Hill,” is frequently intoned to suggest that America is uniquely gifted and providential. Countless politicians have sermonized with this gratifying image and used it, erroneously, to celebrate belligerence, individualism, and aggrandizement. Looking at Winthrop’s whole text suggests a different sense of what that phrase might mean. He implored the Puritans to

do justly, to love mercy, to walk humbly with our God, for this end, we must be knit together in this work as one man, we must entertain each other in brotherly affection, we must be willing to abridge ourselves of our superfluities, for the supply of others’ necessities, we must uphold a familiar commerce together in all meekness, gentleness, patience and liberality, we must delight in each other, make other’s conditions our own, rejoice together, mourn together, labor, and suffer together, always having before our eyes our Commission and Community in the work, our Community as members of the same body, so shall we keep the unity of the spirit in the bond of peace.

There was more, of course, and not all of it gentle and meek, but it is remarkable how humble and communitarian and ascetic his vision was, a vision reflecting the ethos of the early Massachusetts Bay Colony. More remarkable still is how suited such an ethos could be again. So the answer to the question “What frontier now?” may be to return to the humility of the first frontier. ♥

[John Tirman is the author of several books on global affairs, and more than one hundred articles in a wide range of periodicals. He is now Executive Director of MIT's Center for International Studies, where he is also Principal Research Scientist. Tirman was educated at Indiana University in political science, receiving his B.A. in 1972. His graduate work was at Boston University, where he earned a Ph.D. in political science, specializing in political theory, in 1981.]

Copyright © 2009 The American Scholar


Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

Working Smarter: Gee, Mail? No, Gmail!

Here is a way to work smarter, starting right now. Anyone in the world can now create a Gmail address at this link. Dump Yahoo, Outlook, AOL, Hotmail, and the rest of those inferior, mostly-free e-mail programs. Now Gmail can be accessed without an Internet connection! If this is (fair & balanced) innovation, so be it.

[x Slate]
The Best E-Mail Program Ever: How Gmail Destroyed Outlook.
By Farhad Manjoo


As of this week, Gmail has reached perfection: You no longer have to be online to read or write messages. Desktop programs like Microsoft Outlook have always been able to access your old mail. There is a certain bliss to this; if you've got a pile of letters that demand well-composed, delicate responses (say you're explaining to your boss why you ordered that $85,000 rug), unplugging the Internet can be the fastest way to get things done. That's why offline access is a killer feature—it destroys your last remaining reason for suffering through a desktop e-mail program.

Google's not alone in providing this option. Microsoft's Windows Live Mail, Yahoo's Zimbra, and the mail app made by the Web startup Zoho, among other services, also provide some measure of untethered e-mail access. For now, Google calls this addition "experimental"—you've got to turn it on explicitly, and the company is asking users to report any bugs—but I found it easy to set up and a delight to use.

To get offline access, you first need to download and install a small program called Google Gears (unless you're using Google's Chrome browser, which comes with Gears built in). Then, after you enable Gmail's offline capability, the system will download two months of your most recent messages, which should take 30 minutes to an hour. Now you're good to go: When you're offline, type www.gmail.com into your browser, log in—yes, Gears enables you to log in even when you don't have a Web connection—and there's your e-mail. Though I work from home and rarely find myself away from a hot Wi-Fi connection, I shut off my router and parked myself on my couch for about an hour yesterday. I loaded up Gmail on my laptop, and it responded seamlessly—I could read, search through, and respond to any message I'd received during the last two months, all through the familiar Web interface. Eureka! I'll never again be mailless on a plane, a subway, or anyplace else where you don't have the Web but do have a lot of time to kill.

Now that Gmail has bested the Outlooks of the world, it's a good time to assess the state of desktop software. There are some things that work better on your computer (your music app, your photo editor, your spreadsheets), and there are some that work better online (everything else). Over the last few years, we've seen many programs shifting from the first category to the second—now you can get spreadsheets and photo editors online, though they're still not as good as programs hosted on your computer. But e-mail has crossed the line completely. Hosted services like Gmail are now the most powerful and convenient way to grapple with a daily onslaught of mail. If you're still tied to a desktop app—whether Outlook, the Mac's Mail program, or anything else that sees your local hard drive, rather than a Web server, as its brain—then you're doing it wrong.

The shift has been a long time coming. On July 4, 1996, Sabeer Bhatia and Jack Smith, two techies who met while working at Apple, launched Hotmail, the first free e-mail service on the Web. The date wasn't accidental—from the beginning, Web-based e-mail sought to liberate people from the strictures imposed by traditional providers (ISPs, universities, and employers, all of whom required some official affiliation before they gave you an e-mail address). Hotmail would give an inbox to anyone—you could even sign up for multiple addresses—and pretty soon it was impossible to find a soul who didn't e-mail.

But it was a terrible hassle to actually use Hotmail—which Microsoft purchased in 1997—or the rival e-mail systems built by Yahoo, AOL, and the various other Web portals that dominated the last tech boom. Back then, Web-based e-mail was a great idea executed poorly. Internet connections, Web browsers, and Web-design technologies were slow and flaky; you waited an eternity to load up a message, you could easily lose a draft of a long e-mail if something went amiss with your modem, and you had a limited amount of storage space. Web e-mail was a redoubt of amateurs. If you were serious about your inbox, you kept it on your desktop.

Desktop e-mail presented its own challenges, though. People who were serious about e-mail tended to archive all their messages. But desktop e-mail apps performed poorly when overloaded with mail; Outlook, for instance, crawled to a halt if you stuffed it with just a few tens of thousands of messages, which for some people is only a few months' worth. What's more, keeping all your mail in one place was both annoying and not very safe. You couldn't easily check your messages on multiple computers. And what if you wanted to switch to a new computer? Or what if a power surge crashed your drive? As a journalist working during the Internet bust, my particular worry was getting a pink slip. If my boss suddenly asked me to turn in my company-provided laptop, all my e-mail—both professional and personal correspondence going back years—would be gone.

By the time Gmail launched in the spring of 2004, I was desperate for an alternative to Outlook. (I had tried pretty much every other desktop e-mail app.) From the moment I logged on, I found it liberating. Gmail's interface was quick and intuitive, unlike any other major online service at the time. (Gmail did borrow some design ideas from Oddpost, an ahead-of-its-time Web e-mail app developed in 2002; Yahoo bought Oddpost in 2004.) Gmail was the first to display multiple messages on the same subject as threaded conversations—a design idea that user-interface experts had long been saying would make e-mail easier to use. Switching to Gmail also freed me from worrying about how I preserved my mail—Google, whose servers are much more secure than my own computer, was taking care of backups for me.

What separates Gmail from its rivals is a basic design philosophy: It's built for power e-mailers. Late last year I visited the Gmail team at Google's Mountain View, Calif., headquarters. Keith Coleman, Gmail's program manager, told me that from the beginning, Google aimed to build something suitable for people who got a ton of mail—because in the future, everyone will get a ton of mail. Gmail's main features are all catnip for folks who find themselves buried under the weight of their inbox. There's a search engine worthy of the Google name, a slate of keyboard shortcuts that make organizing your messages brutally efficient, and a crowdsourced spam detector that keeps out unwanted messages. Best of all, Gmail is fast—you can switch between messages and folders quicker than you can in any other e-mail program, even desktop-based systems. Coleman told me that the team is constantly measuring and tweaking the responsiveness of its interface. (The software gives coders a readout of how long, on average, various tasks take to complete.) The Gmail managers are also gaga over user-interface tests: Before instituting any major feature, developers bring users into a whiz-bang lab outfitted with cameras and eye-tracking software to see how people react to the new stuff.

Lately Coleman and his staff have been improving Gmail at a breakneck pace. They added a way to let people chat by voice and video, and they put out "themes" that personalize the appearance of your e-mail screen. Last summer, they launched Gmail Labs, a repository of add-on programs that run alongside Gmail. Offline access is one of these many Labs features; you can also add a to-do list, buttons to send people quick canned responses, a mini-program for sending text messages to cell phones, and a "gadget" for monitoring your Google Calendar and Google Docs from your e-mail. All these add-ons were created by Google programmers, but Coleman says that Gmail is also experimenting with letting outside developers add stuff. Google seems to be trying to create more than just a great e-mail program; with all these add-ons, Gmail is becoming a sort of e-mail platform whose users benefit from the best ideas in mail management.

And that gets to what's so exciting about being a Gmail user right now. The app keeps getting better. You might say that's true of desktop systems, too; Outlook is not as clunky as it was five years ago, and, no doubt, it'll be better five years from now. But so will Gmail—and because it's online, you'll get those improvements faster, and without having to install any software. Now that you can use Gmail anywhere—even when you're beyond the reach of broadband—there's no longer any reason to suffer. ♥

[Farhad Manjoo is Slate's technology columnist and the author of True Enough: Learning To Live in a Post-Fact Society. Manjoo graduated from Cornell University in 2000. While there, he wrote for and then served as editor-in-chief of the Cornell Daily Sun campus newspaper.]

Copyright 2009 Washingtonpost.Newsweek Interactive Co.

Get the Google Reader at no cost from Google. Click on this link to go on a tour of the Google Reader. If you read a lot of blogs, load Reader with your regular sites, then check them all on one page. The Reader's share function lets you publicize your favorite posts.

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

Intelligent Design We All Can Believe In!

The Flatster (Thomas Friedman of the NY Fishwrap, who proclaimed that "the earth is flat") argues that the mantra of our time should not be "drill, baby, drill," but "innovate, baby, innovate." He goes on to call for an iCar (following the lead of the iPod and the iPhone) to move us out of the carbon-emissions trap. Following up on The Flatster's alarms, The EdgeMan (Joel Garreau, whose 1991 breakout book was Edge City) proclaims that we need a smarter stimulus package, not a larger one. Good advice: work smarter, not harder, and get more bang for the buck while you're at it. If this is (fair & balanced) insight, so be it.

[x WQ]
Get Smart: The Transformative Power Of Information Technology
By Joel Garreau


In 1876, Western Union decided that telephones would never replace telegram messengers. In 1971, AT&T turned down the opportunity to run the Internet as a monopoly. In 1980, Ma Bell concluded that cell phones would never replace landlines.

These moments come to mind as that painfully unglamorous word infrastructure is increasingly heard on Capitol Hill. Our roads and airports are jammed. Drought threatens from Tucson to Atlanta. Floods are a plague from the Chesapeake to California. Our air conditioners and computers are straining the capacity of our electrical grid.

We can’t go on like this, goes the hand-wringing refrain.

Turns out that’s true, in an ironic way. Our industrial-age solutions are approaching their limits. Not only are they crumbling into decrepitude, but they have reached levels of physical absurdity that spark kamikaze political resistance, from 17-story-tall electrical transmission towers despoiling rare and pristine landscapes to interstate highways approaching the width of the Bosporus.

The business-as-usual interests lining up for more tax dollars rarely mention the impending obsolescence of their favored projects. Yet increasingly, infrastructure depends as much on wires a few molecules wide, and biology that produces energy, as it does on steel and concrete. The means to fundamentally control matter, energy, and life itself are emerging so fast that it is hard to imagine any existing infrastructure technology not being shaken to its core in the next decade or two.

These game changers can be dated to 1965—six years after the first commercial computer chip appeared. An obscure physical chemist named Gordon E. Moore noticed that the number of transistors you could put on a piece of silicon at the cost of a dollar was doubling every year. He boldly predicted that these doublings would continue for 10 more years.

Little did he know. Moore, who would become one of the founders of Intel and a billionaire several times over, will probably be best remembered for what is now known as Moore’s Law. That axiom, which has become the core faith of the global computer industry, is usually stated this way: “The power of information technology will double every 18 months, for as far as the eye can see.”

A doubling is an amazing thing. If we think of progress as a staircase, it makes each step as tall as all of the previous steps put together. Such doublings every 18 months describe a geometric curve. The 20 years behind you are not a guide to the next 20 years; they are at best a guide to the next eight. And your previous 50 years are not a guide to your next 50; they are at best a guide to the next 14. For example, a single iPhone has more processing power than all the computers at the disposal of the North American Air Defense Command in 1965, when Moore prophesied.
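The staircase arithmetic above is easy to check. The sketch below is mine, not the article's; it simply computes relative capability under the 18-month doubling rule:

```python
# A minimal sketch of Moore's Law as stated above: capability
# doubles every 18 months, starting from a baseline of 1.0.
def capability(years, doubling_months=18):
    """Relative capability after `years` of steady doublings."""
    return 2 ** (years * 12 / doubling_months)

print(capability(1.5))   # 2.0: one doubling after 18 months
print(capability(15))    # 1024.0: ten doublings in 15 years
print(capability(20))    # roughly 10,321x over two decades
```

Two decades of 18-month doublings multiply capability roughly ten-thousand-fold, which is why the recent past is such a poor guide to the near future.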

Even more startling is how Moore’s Law opens entirely new vistas, especially in what I call the GRIN technologies, for the genetic, robotic, information, and nano processes. Each is following its own curve of exponential change.

When sequencing the human genome was first proposed in 1985, many thought it could never be accomplished, or would cost the earth. Yet scientists managed the feat by 2000, for a fraction of the anticipated cost. That’s because the computers required to make it happen conformed to the inexorable price-performance curve of Moore’s Law and accelerated the future into being. Soon you will be able to get your own genome sequenced—all 3 billion bases—for $1,000. Nathan Myhrvold, the former technology chief of Microsoft, expects the price eventually to drop to $10.

As the price of oil soars and the cost of computing approaches zero, there is an enormous spur to make infrastructure smarter. The industrial-age way to address congestion, for example, is to pour more concrete. But there is already vastly more capacity in the American road system than we remotely need. If we could find a way to fill the front passenger seat of just 20 percent of the cars on the road, traffic jams could be eliminated tomorrow.

How would you do that? One way would be to have your madly clever cell phone alert the world to your desire to go from here to there. The idea would be to create a market of trustworthy people heading in the right direction who might pick you up in the next five minutes in exchange for, say, the price of gas and tolls. Think eBay organizing rides on the fly.
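Garreau doesn't spell out how such a market would pair riders with drivers. Purely as an illustration, here is a naive first-fit matcher; every name, field, and the five-minute window below are my assumptions, not the article's:

```python
from dataclasses import dataclass

@dataclass
class Trip:
    person: str
    origin: str
    destination: str
    departs_in_min: int  # minutes until departure

def match_rides(drivers, riders, window_min=5):
    """Pair each rider with the first unmatched driver sharing the
    same origin and destination whose departure falls within
    `window_min` minutes. A real market would also price, rank,
    and vet offers; this shows only the matching step."""
    matches = []
    unmatched = list(drivers)
    for rider in riders:
        for driver in unmatched:
            if (driver.origin == rider.origin
                    and driver.destination == rider.destination
                    and abs(driver.departs_in_min - rider.departs_in_min) <= window_min):
                matches.append((rider.person, driver.person))
                unmatched.remove(driver)
                break
    return matches

drivers = [Trip("Ann", "Bethesda", "Downtown", 3),
           Trip("Raj", "Bethesda", "Airport", 4)]
riders = [Trip("Lee", "Bethesda", "Downtown", 1)]
print(match_rides(drivers, riders))  # [('Lee', 'Ann')]
```

A production system would rank candidates by price, reputation, and route proximity instead of requiring exact origin and destination matches; the point is only that the matching itself is computationally trivial.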

Navigation systems already give directions to drivers—today’s cars have far more computers than light bulbs. Nissan and other auto manufacturers are well on the way to fielding smart cruise controls that communicate with other cars and with sensors on the road ahead to maintain high speeds, plan alternate routes to avoid traffic snarls, and prevent accidents.

The more urgent our problems—such as global warming—the more likely we are to reach out to our amazing new technologies for solutions. Oil at $100 a barrel is a serious incentive. Already geneticists at companies such as LS9 Inc. are commercializing life forms that eat cellulose and poop gasoline for what is promised to be about a buck a gallon. Craig Venter, who sequenced the human genome in 2000, believes he will have a critter next year that will devour climate-ruining carbon dioxide and turn it into gasoline.

But solar power is the real solution to the energy crisis. As it happens, that low-hanging fruit is one of the first targets of nanotechnology. Several companies, such as Nanosolar Inc., are going commercial right now with processes that produce endless sheets of thin plastic with astoundingly tiny energy-converting semiconductors printed on them in nano-ink. If the technology rolls out as hoped, it will be able to turn sunshine into electricity priced as cheaply as power from coal-fired plants. A National Academy of Engineering panel recently predicted that solar power will scale up to produce enough energy to meet the needs of everyone in the world in 20 years.

Would that profoundly change the infrastructure challenge? You bet. What is now a top-down hierarchy dominated by big generators, big transmission lines, and big coal would become a bottom-up network in which every consumer could also be a creator. Just as the Internet has chewed up the television, radio, movie, newspaper, music, and telephone worlds, distributed GRIN technologies could cause an upheaval in the world of utilities.

Slightly farther out on the commercialization horizon are nanotechnology membranes like those developed at UCLA that promise to slash the cost of desalinating water. Along with biotech, they also promise to mitigate the effects of pollutants. None of these are lab curiosities. They are burgeoning businesses that are ramping up now. The question isn’t whether the technologies work, it’s whether the economics do. If so, they could affect quite a few dam, canal, and treatment plant calculations.

Will these game-changing technologies become commercially viable in time to solve all our problems? Who knows? But if they do, a transformation on the scale of those that roared past Western Union and AT&T is a serious possibility.

The prospects I describe pose two critical questions: First, will we quickly address all our infrastructure problems by pouring concrete and deploying all the tried-and-true industrial-age solutions as fast as we can? There’s a huge range of possibilities between yes and no. Second, will game-changing technologies come on line quickly, cheaply, and with no unanticipated consequences?

Graph those two uncertainties as axes (see p. 61), each with a negative and positive pole, and you get a vision of four possible worlds we might be entering in the next 10 or 20 years.

If we don’t pour all the concrete, and the new technologies don’t live up to their promise, we’re looking at a minus-minus world one might call “Roman Ruins.” Worst case, our cities contract, our fields dry up, our lowlands are covered by ocean, and our economies collapse. You’ve seen the disaster movies—The Day After Tomorrow, for example.

That’s a serious scenario. Could happen. Look at New Orleans.

In another world—call it “Concrete Nirvana”—it turns out that the new technologies do not rapidly live up to their promises, but we do start listening to all the alarms from our belt-and-suspenders engineers, bless their hearts, who warn about rolling blackouts and empty faucets. In that world of one minus, one plus, we recognize that our civilization is at stake and rapidly decide that there are worse things than building scores of coal and nuclear power plants, waste treatment facilities, dams, and dikes. Roads are widened, rail undergoes a new renaissance, and dramatically enlarged airports and seaports attract awed visitors from around the world.

Again, could happen. All it takes is political will. And a lot of lobbying dollars.

Diagonally across from “Concrete Nirvana” on the matrix is the one-plus, one-minus world we might call “Leapfrog.” In this world, new technologies come to market so fast that old infrastructure worries become quaintly obsolete. Now that cell phone service covers 98 percent of Bangladesh—thanks to Grameenphone, an offshoot of the Nobel Prize-winning microlending outfit Grameen Bank—can anyone remember why we ever worried about how much it would cost to cover the planet with landlines?

Diagonally across from “Roman Ruins” is “Intelligent Design.” This is the plus-plus world in which we recognize all the problems, recognize all the possibilities, try everything we can dream up, and see what sticks. In this world, for example, we recognize ways to transform air travel: deploy many more jet taxis like those already developed by Honda, Cessna, Adam Aircraft, Eclipse, and Embraer that are smart, efficient, and can safely and quickly make the hop from a short runway near your house to a short runway near your destination without needing massive hubs and enormous investments in air traffic controllers. Insurance companies mandate that the only way to travel on highly congested roads is to turn the driving over to smart navigation bots that never get drunk or distracted and are far better than people at avoiding accidents. As a side benefit, these bots safely pack many more cars—bumper to bumper, at speeds of 80 miles per hour—into the same amount of space as in the old world, ending traffic jams forever.
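Garreau's two axes define a simple two-by-two matrix. Restated as a lookup (the boolean encoding is mine; the scenario names are his):

```python
# (pour_concrete, new_tech_delivers) -> scenario name from the essay
SCENARIOS = {
    (False, False): "Roman Ruins",         # minus-minus
    (True,  False): "Concrete Nirvana",    # plus-minus
    (False, True):  "Leapfrog",            # minus-plus
    (True,  True):  "Intelligent Design",  # plus-plus
}

print(SCENARIOS[(True, True)])  # Intelligent Design
```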

The way we get to “Intelligent Design” may be by recalling that, historically, the infrastructure solutions that work best are public-private partnerships. Think private passenger planes landing on public runways, or private cars traveling on public roads. All-private solutions, such as investor-owned toll roads, and all-public ones, such as subways, have their place. But they are specialized tools.

The public-private partnership I most want to see is the one that quickly provides “big broadband” of between 100 million and one billion bits per second to every home in the land. Between 1999 and 2006, the United States fell from third place to 20th in the International Telecommunication Union’s measure of average broadband speeds, behind, oh, Portugal. This is disastrous for the American economy. It means the markets for next-generation information companies will be elsewhere. Just as with that earlier critical economic and social enabler, the telephone, there are few if any market reasons for private-sector providers to install fat information pipes the last mile to every home. That’s why the governments of states such as California and Kentucky have stepped up to the plate, launching innovative public-private partnerships.

Whatever does the job, let’s do it. Now. One idea—surely there are others—is for the federal government, the states, and the private sector together to spend on the task in each of the next four years about what it cost to build Boston’s Big Dig. However we do it, the important idea is for all of us to hook up quickly to imagine mind-blowing solutions to our novel challenges together.

Is that a credible “Intelligent Design” scenario? You decide. ♥

[Joel Garreau is a student of culture, values, and change, and a reporter and editor for The Washington Post. He is the author of a book on metropolitan futures, Edge City: Life on the New Frontier (1991). His most recent book, about the future of human nature, is Radical Evolution: The Promise and Peril of Enhancing Our Minds, Our Bodies—and What It Means to Be Human (2005). Garreau has been a fellow at the James Martin Institute at Oxford University, the University of California at Berkeley, and George Mason University, and is a member of Global Business Network.]

Copyright © 2008 Wilson Quarterly

