Saturday, March 13, 2010

Roll Over, Hofstadter! It's "Root Hog, Or Die" — Redux!

In 1651, Thomas Hobbes wrote in Leviathan that "The life of man... [is] solitary, poor, nasty, brutish, and short." Wesley Ulm, M.D., finds that the legacy of Dutch and his acolytes has given us a culture that is fractious, sclerotic, and dysfunctional. The mantra of the 1980s (thanks to Oliver Stone) was "Greed is good." It is no coincidence that Oliver Stone has created a sequel to "Wall Street" (1987). Twenty-three years later, Stone revisits the ethos of greed, this time set amid the crash of 2008, with "Wall Street: Money Never Sleeps" (in theaters on September 25, 2010).

Wesley Ulm likewise revisits the theme of Richard Hofstadter's first book (his doctoral dissertation), Social Darwinism in American Thought (1944); Ulm advances the Hofstadter critique in our time. Find out in the following article why Glenn Dreck bashes TR (Theodore Roosevelt). If this is (fair & balanced) wide-ranging interdisciplinary imagination, so be it.

[x Democracy]
Cachet Of The Cutthroat
By J. Wesley Ulm


Nice guys finish last. Survival of the fittest. Eat or be eaten. For Americans dwelling in the uneasily interesting times of the early twenty-first century, such catchphrases–and the sensibility they embody–strike familiar chords. Stemming from an unwieldy synthesis of fin-de-siècle Social Darwinism and (until recently) trendy Chicago School economics, this ethos claims that ferocious, mercilessly competitive conditions weed out the weak while preserving and enhancing the strongest members of an institution, a market, or a civilization as a whole. Such roughness and ruthlessness render us more competitive, thicker-skinned, and simply better than the rest of the pack. Such maxims are applied to communities and societies as much as to the people who comprise them: The more cutthroat an organization’s culture, the more hardened it is to adversity and the tougher the people who emerge from its hallowed halls.

When this belief system bleeds over into the realm of political discourse, it transmogrifies into a paradoxical badge of honor, a disposition toward sink-or-swim hard-heartedness and a spurning of the "distractions" of the broader community. "The public be damned!" William Henry Vanderbilt famously told a reporter who had asked the nineteenth-century railroad tycoon about his company’s social responsibility. In word and deed, Vanderbilt encapsulated the mindset of a previous American Gilded Age–an eerie precursor to our own–prior to its catastrophic collapse in the Depression. Vanderbilt’s sentiments can still be heard today, couched in PR-friendly euphemisms or offered as hearty retorts to the soft communitarianism of Scandinavia, Continental Europe, and Canada.

Of course, there have always been dissenting voices in our policy and cultural debates. Progressive leaders like Robert La Follette and Eugene Debs, along with their philosophical and administrative successors in charge of the New Deal, exemplified this spirit in the early twentieth century. But such objections have rarely questioned the underlying premise of Social Darwinism itself; they have sought to salve the symptoms of its workings in the real world, without curing (let alone preventing) the malaise in the American collective mind that has paved the way for such a harsh philosophy to insinuate itself into society. Even in the wake of the 2008 economic meltdown, opposition to this "Cachet of the Cutthroat" is generally confined to ethical qualms about the suffering and personal cost imposed on hard-pressed individuals and families—deploring the scale of the misery, rather than addressing its roots. Across the ideological spectrum, the prevailing wisdom holds that such institutionalized harshness generates a more productive, adaptive, and wealthy society overall, with "liberalism" left to debate merely whether the resulting human collateral damage is an acceptable cost of doing business.

Although such moral objections are clearly relevant, the most devastating counterargument to the Cachet of the Cutthroat is that it is simply wrong. Both the social and natural sciences have repeatedly demonstrated that ostensibly "softer and fuzzier" qualities in people and the communities they engender—compassion, goodwill, and above all empathy—are integral to sustainable success, particularly in complex organizations, but even in nature at its rawest and bloodiest. By fostering social cohesion and solidarity against adversity, such attributes paradoxically make us more, not less, competitive as individuals and as a society. Over time, the strongest and most productive individuals, communities, and nations all tend to be especially rich in these supposedly soft-hearted characteristics, while the most cutthroat societies collapse in a state of corruption and acrimony—their "winners" ultimately hoist by their own petards. The latter, if anything, defines the vicious cycles of corrupt banana republics, their leaders utilizing bribery, coups, and even assassinations in the cutthroat march to power.

This is not to say that all competition is bad, but rather that not all flavors of "competitiveness" are equal. A competitive atmosphere can be constructive and productive, driving individual performers to improve and collaborate, to learn and boost creativity, and ultimately to engender innovation and institutional betterment. The extraordinary discoveries of quantum theory in the early twentieth century were a product of such cooperative competition, or "co-opetition." A handful of brilliant minds—Planck, Schrödinger, Einstein, de Broglie, Pauli, Bohr, and Born—vied to outdo one another. Yet this was far from cutthroat competition: From their scattered bases in the universities of Austria, Germany, France, Britain, and Denmark, they periodically met and stimulated one another to devise a theory that is today at the heart of countless high-tech industries–a gift to the world worth trillions of dollars in created wealth.

Too often in the United States, co-opetition is conflated with destructive, lowest-common-denominator competition, which has led to predatory lending, underregulated capital markets, and our costly and ineffective health care system. Our counterparts abroad, however, have more prudently (and prosperously) distinguished them. The European Union, Taiwan, South Korea, and other major economies have more attentively adapted to the delicate policy balance needed for modern technological nations in an era of relative resource scarcity, as recently described by Steven Hill in his cogent exposition Europe’s Promise. Even China, with its comparatively authoritarian system, has recently been hard at work introducing Continental European-style regulatory networks, ecological protections, and safety nets.

For the United States to prosper in the twenty-first century, we must learn from these examples, beginning with a better understanding of Social Darwinist ideology and the historical forces that have facilitated its pernicious infection of our society. Only then can we glimpse precisely why this doctrine fails so disastrously in sustaining the true source of a society’s power—the collective will and inspiration of its people to engender a better world. In doing so, we can also begin to lay the groundwork for a more mutually reinforcing, and ultimately more successful, society.

A Brief History of a Dangerous Misconception

If there is any conclusive lesson to be gleaned from the churn of society’s various "isms" over the past millennium, it is that both nature and human communities are far too complex to reduce to a single, linear theory. Nevertheless, to paraphrase Bertrand Russell, this has never stopped fools and fanatics in power from being dreadfully sure of themselves, attempting to fit their societies violently into the Procrustean beds of narrow interpretations. The annals of history are littered with countless examples—Communism, Maoism, Nazism, and fascism, to name just a few from the past century alone—with often horrendous atrocities expediently justified by simpleminded takes on complex theories and ideas. So it is with the curious history of Social Darwinism.

The work of the nineteenth-century naturalists Charles Darwin and Alfred Russel Wallace led to a coherent theory of evolution by natural selection, culminating in Darwin’s 1859 publication of On the Origin of Species. Their conclusions had far-reaching implications for how biological (and, by extension, cognitive and information) diversity evolves and emerges, via accumulated and gradual evolutionary steps over eons, with pre-conscious organisms eventually giving rise to conscious, intelligent life forms through unknown means. Darwin did speculate about the possible implications of his work for human society, but both he and Wallace ultimately remained agnostic on the topic. Neither cast his work as an explicit recommendation for the complex, fast-changing domain of human interactions and societies, populated by conscious, intelligent members whose conduct was guided by concrete ethical codes.

Others were not so circumspect. The term "survival of the fittest" was coined in 1864 by a contemporary of Darwin, the British sociologist and author Herbert Spencer. Even before Darwin’s own masterwork had appeared, Spencer had ascribed a degree of moral rectitude to the presumably callous workings of nature, which included, in his mind, the harsh inequalities of aristocratic Victorian society. As he wrote in his 1851 treatise Social Statics, "The poverty of the incapable... and those shoulderings aside of the weak by the strong, which leave so many ‘in shallows and in miseries,’ are the decrees of a large, far-seeing benevolence." In Darwin’s subsequent work, Spencer found a naturalistic scaffolding upon which to buttress his own interpretations of social organization and, in so doing, converted Darwin’s descriptive observations into prescriptive arguments about an ideal society. This became the crux of Social Darwinism.

Others followed, in no small part because Social Darwinism would prove convenient for a number of vested interests. Many of the era’s industrial magnates, imperial officials, and landed aristocrats were more than willing to overlook its evident shortcomings and logical fallacies. Vanderbilt was only the first and most voluble; robber barons like Jay Gould and James Fisk followed his lead closely in their attacks on government intervention and unions, while an army of subservient politicians, lawyers, and even preachers espoused Social Darwinism across the country. Substandard wages, child labor, monopoly capitalism—all were justified by an appeal to the "survival of the fittest."

Gilded Age America wasn’t the only place to see Social Darwinism flower. Darwin’s work emerged at an apex of European colonialism, when the British, French, and Russian Empires, in particular, were marching steadily throughout much of Eurasia, Africa, and the Americas. In such a climate, tweaks upon Darwin’s ideas provided an ideal rationale to condone policies that, under almost any ethical system, would be considered abhorrent: the imposition of foreign hegemony to subjugate free peoples, the vicious realpolitik that would culminate in World War I, and the ruthless exploitation of human labor toward the enrichment of a tiny elite. Vanderbilt’s candid sentiment meshed well with this unrelentingly Hobbesian zeitgeist.

The push toward Social Darwinism was momentarily blunted by late nineteenth- and early twentieth-century reformers, muckrakers, and human rights conventions. In the 1880s, Chancellor Otto von Bismarck introduced social insurance and workplace protections in Germany to cushion the shocks of rapid industrialization. Across the Atlantic, American reformers followed suit, beginning with child labor and workplace safety laws in the 1890s and culminating in FDR’s New Deal reforms and Johnson’s Great Society. Simultaneously, the world’s major imperial powers—drained of finances and manpower by the world wars and defeated by both political and battlefield opponents in Egypt, Southeast Asia, India, the Near East, Ireland, and elsewhere—were driven out of their colonies. For a fleeting moment, historical events, and the collective moral outrage of the downtrodden, had thus consigned Social Darwinism to the fringes of acceptable policy.

In this milieu, a handful of pragmatic reformers emerged who defended the essence of Smith-Ricardo capitalism yet saw carefully managed markets (assisted by rational safety nets) as tools for growing wealth and bettering society, rather than socially preponderant forces in their own right: economist John Maynard Keynes, theologian Reinhold Niebuhr, and New Dealer Thomas Corcoran, who conceptualized the Fair Labor Standards Act. Social Darwinism, they realized, was not only cruel but detrimental to a healthy economy in promoting raw and destructive competition as an end in itself. As Adolf Augustus Berle, an economist and legal expert who served in FDR’s brain trust, said (in words that would resonate today), "The ‘free market’... is an instrument, but it has been displaced as the infallible god," not to mention as "universal economic master" and "the only acceptable way of economic life."

But the reformist moment was not to last. To paraphrase George Washington’s cautionary words, human observers must generally "feel before they can see" a slowly building menace. As historical memory and the imminent threat of financial collapse both ebbed, so did the intellectual bulwarks against the speculative and predatory excesses that had brought on the Depression. Eisenhower’s thrifty pragmatism was soon followed by a resumption of polarizing ideological warfare. Meanwhile, in regard to the anthropological assumptions that underpin human systems, traits like compassion and kindness drew little interest among behaviorists and naturalists, generally dismissed as little more than evolutionary static. With few exceptions, such as the Russian zoologist Peter Kropotkin or the American historian Richard Hofstadter (who systematically impugned the logical foundations of Social Darwinism), fewer and fewer in the postwar era questioned the traditional rendition of a cruel, heartless natural world underlying our own.

As a result, reformists eventually saw their gains recede during the vertiginous global economic growth of the postwar era; political fortunes shifted back to advocates of unfettered free markets, and Social Darwinism itself made a comeback. Following the collapse of the Soviet Union in 1991, the United States, by now dominated by neoclassical economic policy and Reaganite anti-statism, emerged as the world’s unquestioned economic and military hegemon. In this context, "Greed is good"—the narcissistic tagline uttered by Michael Douglas’s unscrupulous corporate raider, Gordon Gekko, in the 1987 film "Wall Street"—became a rallying cry. With the subsequent popularity of neoliberal economic doctrines and the post-1991 "unipolar moment," doctrines of laissez-faire economics guided development across much of the world, whether through the profusion of U.S.-trained economists in the world’s finance ministries or the power of the World Bank’s "structural adjustment" policies. The thinking of Austrian School economists, Friedrich Hayek chief among them, was revived and merged with the efficient-markets doctrines of the Chicago School of Milton Friedman and his followers to foster what became the "Washington Consensus," the prevailing economic framework since the 1980s. This doctrine, in turn, formed the basis for the mass privatization and "shock therapy" that the U.S.-dominated IMF demanded of struggling South American nations, Russia, and Indonesia in the ensuing years. (As with Darwin and Wallace, however, the work of Hayek and his Austrian colleagues was far more nuanced than the practical policies it was later invoked to support.)

Unregulated markets, whatever their Social Darwinist excesses, were said to be self-correcting and ultimately beneficial to the commonwealth of a nation. Short-term and individual pain would be outweighed by long-term gains. Thus, the highest virtues in society became linked to maximizing perceived shareholder value at all costs and as quickly as possible, a presumed catalyst for generating new wealth—whether or not genuine innovation in products or systems had actually taken place. By the mid-2000s, this philosophy naturally spilled over into other arenas: the office, mass media, the courtroom, and athletic pursuits. If only the strongest would survive and thrive, then all was indeed fair in this most "competitive" of marketplaces. Qualities such as compassion and empathy were regarded as dispensable hindrances, while ruthless and even cutthroat behavior—despite its presumed social stigma—became the trait most associated with success and social advancement. As the millennium turned, firms like Tyco, Computer Associates, and an ambitious energy-trading company named Enron were feted for their ruthless selection and pruning of personnel, as well as their envelope-pushing aggressiveness with partners and acquisition targets. The subsequent indictment of their executives, and the earth-shaking fall of Enron, offered cautionary omens which, nonetheless, dropped off the radar screen as the 2000s marched along.

However, in September 2008, the collapse of Lehman Brothers ignited a maelstrom of financial turmoil, sparking a deep recession. The ensuing cascade of calamities in the banking, insurance, health care, and automotive sectors stretched out over many more painful months of relentless contraction. Under stress, the Social Darwinist economic model had delivered a worst-of-all-worlds debacle: a system lacking in compassion during an especially arduous crisis, yet unable to demonstrate the resilient wealth creation it supposedly promised in return, unraveling so feebly that its very advocates pleaded desperately for public assistance. The upshot, unsurprisingly, has been a catastrophic loss of prestige and the discrediting, at least for the moment, of the core of the Anglo-American economic paradigm. Adam Smith himself warned of the noxious effects of such an economy on a nation’s viability: "No society can surely be flourishing and happy, of which the far greater part of the members are poor and miserable." Yet even in post-crisis 2010, the forces of reform are weak, and we seem to be moving inexorably toward Smith’s dreaded dystopia. Rather than try, try again, we’d do well to consider the alternative: that Social Darwinist systems fail, and fail on their own terms.

The Five Deadly Incentives of Dog-Eat-Dog Economics

What explains that failure? In a complex economy, the ruthless drive to profit and win at all costs becomes divorced from value-added gains in the real world. Instead of sustainably creating and disseminating genuine wealth—tradable goods and services, improved products, better information access—cutthroat conditions encourage shortcuts and destructive behavior, a far cry from the constructive fruits of co-opetition. In the recent real-estate bubble, complex financial instruments like collateralized debt obligations (CDOs) were prized for demonstrating how extreme free-market banking could expand social goods, like homeownership. In the wake of the bubble’s popping, however, we can see that such hypercompetitive, unregulated markets—and the expansion of homeownership they facilitated—worked only in the short term, and were destined for collapse in the long run.

Fundamentally, the cachet of the cutthroat fosters all the wrong incentives; without regulatory networks and transparency to manage raw capitalistic impulses—a need Adam Smith himself underscored—five deadly incentives are reinforced. First, it rewards unscrupulous behavior. Social Darwinism encourages an all’s-fair outlook in which backstabbing, exploitation, and outright chicanery are perversely promoted, as demonstrated in fiascoes like Enron, Tyco, and the Madoff scandal. If shady misrepresentation of assets and quasi-legal bilking of customers can yield rapid profits, then more constructive paths are bypassed.

Second, such laissez-faire market fundamentalism too easily equates profits with generated wealth, ignoring the fact that a company can rake in enormous profits for itself without actually contributing real goods and services. While some investment banks in the 1990s did help underwrite the risk-filled rise of productive high-tech startups, virtually the entire industry chose a path of lesser resistance after Glass-Steagall’s repeal in 1999, namely the toxic derivatives that even now threaten global finances. Other examples abound: Far too many health insurers make obscene profits (with outlandish executive salaries) by denying care rather than providing actual services. Sallie Mae, the now-private student loan firm, profits enormously as it saddles struggling graduates with usurious interest, contributing little to the real economy. Not all profits are equal, but in a system of survival of the fittest, any profits will do.

Third, Social Darwinist systems stifle dissent, constructive criticism, and creative thinking by subordinates. In a cutthroat workplace, even those who calmly report obvious design flaws (or supply constructive criticism) are penalized, because they disrupt the quick rollout of short-term-profit-maximizing products. As John Schwartz reported in a landmark New York Times article in 2002, in the wake of Enron’s collapse, the trading giant had used an infamous "rank and yank" system to periodically purge even high-performing employees, based on those employees’ opposition to company policy. Candid reporting of impending perils was disastrously impeded.

Fourth, it commoditizes human beings, with ruinous effects on morale. "Chainsaw" Al Dunlap, the notorious CEO of Sunbeam in the late 1990s, became a ruthless icon during perhaps the zenith of the cachet of the cutthroat in the United States (which the journalist John Byrne, in his post-mortem on Dunlap’s disastrous tenure, called "the Era of Profit-at-Any-Price"). He showed particular zeal for mass layoffs of long-contributing workers at firms that he dubbed, in his book Mean Business, "more welfare state than business enterprise," and for this he was initially rewarded. But the loss of institutional know-how and declining morale led to an accumulation of blunders and ultimately to financial disaster for the company, after thousands of lives had been ruined.

Fifth and finally, it promotes short-termism, the most pernicious and deadly incentive. Social Darwinism compels an obsession with easily quantifiable, immediate metrics of success that miss the big picture of an institution’s and an economy’s overall viability; in so doing, it also isolates competing individuals from supposedly "fuzzier" yet important considerations of the broader public good. Capital markets reflect only a limited stock of an industry’s relevant information. For example, clear-cutting an ancient forest (for development or timber harvesting) would be rewarded in an "efficient market" for yielding quick profits, while ignoring less-quantifiable damages (to the local ecology, or to new medicinal sources) that would far outweigh the initial gains. Without prudent regulatory bodies or forward-looking social policies, companies face demands to "win" immediate approval at the expense of a region or industry’s long-term sustainability.

The ravages of short-termism were illustrated with sobering clarity in a late-2009 forum at the Harvard Business Review titled "Is the U.S. Killing Its Innovation Machine?" Featuring nearly two dozen accomplished panelists, including ex-CEOs and brilliant innovators and wealth-generators like Pixar co-founder Ed Catmull, the forum noted that U.S. high-tech companies—driven by unflinching demands for rapid cost-cutting—have outsourced even sophisticated, high-end processes, so much so that entire sectors of engineering and computer science effectively lack home-grown expertise. Bright undergraduates increasingly shun these fields, given the student debt they would incur; the disappearance of a high-tech career ladder exacerbates the problem. As Catmull emphasized, maintaining leadership in technology, manufacturing, and innovation requires precisely the sort of long-term investments–in basic R&D and in trained professionals–that cannot be easily quantified on a balance sheet. Such investments also yield shared dividends for an entire industry that do not obviously redound to a particular competitor, and so they run directly counter to the five deadly incentives.

The resulting erosion of American scientific leadership is especially alarming. Once the top oil producer, the United States no longer enjoys the historical cushion of abundant natural resources and widespread manufacturing (despite a far larger population than in those earlier eras), founts of wealth that aided many a recovery in the nineteenth and twentieth centuries and substantially underpinned our sense of American exceptionalism. (The UK suffers a parallel quandary, with the decline of its North Sea oilfields and the loss of its manufacturing base since the Thatcher years.) With the additional loss of value-added high-tech jobs—driven by the ruthless cost-cutting and short-termism of eat-or-be-eaten market pressures—we lack wealth-producing industries that can help break the structural deflationary spiral of the Great Recession, and we are losing the network of technical expertise that Gary Pisano and Willy Shih dubbed "the industrial commons."

Basic research in the public sector and academia—the wellspring of rich high-tech industries like the Internet and biotech—will not save us either. Cutthroat conditions have also arisen in the realm of scientific grant funding, albeit unintentionally as a result of budgetary constraints. Only ten to twenty percent of grant applications to sources like the NIH and NSF receive funding (among an already highly selected group of professionals). Far from generating a "tougher" and more creative breed of researcher, capable of handling the competition, these circumstances discourage the risky and creative proposals that have made the United States a technological leader. Other scientific centers, in China, Germany, and Japan, are increasingly overtaking the United States in these critical fields.

Unfortunately, these downward trends are masked by Americans’ tendency to frame complex issues as simplistic ideological battles, pitting the needs of business and wealth creation against human rights and oversight. To regulate or not to regulate? Ensure health care and subsidize college tuition, or leave people to their own devices? Yet compassion and competitiveness can go hand in hand. Scandinavian countries—mostly resource-poor nations with high social investment and progressive taxation—illustrate this perhaps most cogently, ranking highly in per-capita GDP, business competitiveness, and high-tech production. Likewise, Germany—with similar policies and resource scarcity—has retained its manufacturing strength and rivals China as the top exporter, despite having barely 5 percent of the latter’s population. Critics may correctly note apples-and-oranges discrepancies between Europe and the United States, but as proof of principle, their carefully managed mixed systems are quite competitive in unquestionably value-added domains.

For these countries, carefully regulated capital markets and tax credits—which discourage short-term thinking and promote professional development, ecological sustainability, and high-tech job creation—incentivize scientific innovation, R&D, and their societies’ general fund of knowledge. Ensuring reliable health care removes the time-consuming and costly burden of coverage from both business owners and employees in an uncertain economy, freeing up their talents for creativity and job creation. Subsidizing college tuition for capable students, enabling them to graduate debt-free, protects them from the suffocating onus of snowballing loan obligations merely to acquire essential training, thereby helping them to start businesses and acquire valuable experience. Notably, all these countries (and even the oft-maligned French economy), in addition to maintaining their core industries, exited recession in 2009 before the United States.

How Cohesion and Compassion Co-Evolve

Ultimately, Social Darwinism fails in practice because it never succeeded as a theory. It’s not even Darwinist—Herbert Spencer, after all, had sketched out its contours even before Darwin published his own work. And when the great naturalist outlined a mechanism of natural selection so ostensibly cold and selfish, he never meant it to go further—even he felt it a faulty oversimplification to apply the same mechanism to the human struggle. More fundamentally, though, Social Darwinism’s presumptions about nature (captured, even today, in the callous connotations of the word "Darwinian") were misleading to begin with. Darwin himself, in his early work, did not challenge the traditionally cruel, heartless depiction of nature; he was deeply troubled by the predator-prey relationship, whose seeming amorality shook his religious convictions.

Nevertheless, Darwin professed a Socratic ignorance about the mysterious nature of human morality and sympathy, which he recognized as seemingly outside the system of natural selection and yet somehow born of it. University of Chicago Professor Robert J. Richards, in an intriguing biographical treatment in the Proceedings of the National Academy of Sciences, noted that Darwin was both fascinated and puzzled by a certain "fecundity and creativity" in nature as described by the German naturalist Alexander von Humboldt, one of Darwin’s inspirations. He was also baffled by its inexplicable juxtaposition with nature’s seemingly harsh edges, and he therefore doubted what became Social Darwinism: "If the misery of the poor be caused not by the laws of nature, but by our institutions, great is our sin."

The notion of an emergent moral system, born of a supposedly amoral world, also vexed and perplexed Darwin’s successors for the next century. With the almost singular exception of Peter Kropotkin, who argued that cooperation and altruism had a more basic function in nature, they solved the problem by either ignoring it or (for those of a Social Darwinist persuasion) discounting it as a meaningless epiphenomenon. Compassion, empathy, and altruism—when studied at all—were regarded merely as planks for direct reciprocation or, as Christine Kenneally expressed it in Slate, "recast as self-interest in disguise."

In recent years, however, work in both the biological and social sciences has indicated that traits like compassion and empathy are elemental to the wiring of animal nervous systems, and thus to their behavior and evolutionary change. Among the foremost figures in the field is the renowned Dutch primatologist Frans de Waal, whose professional papers and popular books, such as The Age of Empathy, have outlined how "soft" characteristics augment evolutionary fitness in a population–as demonstrated in robust ape communities whose members diligently tend to their sick and wounded. The economist Samuel Bowles and the biologist Marc Bekoff, bioethicists like Jessica Pierce (Bekoff’s co-author of Wild Justice), and anthropologists like Robert Sussman have reinforced such conclusions. Meanwhile, a trickle of industry-oriented literature—such as Kristin Tillquist’s Capitalizing on Kindness and William F. Baker’s Leading with Kindness—has extended the theme to human systems by countering the common business-as-war metaphor, while authors such as Robert Wright, in his book Nonzero, have followed in the philosophical tradition of Gottfried Leibniz and Pierre Teilhard de Chardin to suggest that the physical world itself may be subtly hard-wired to promote intelligence and constructive, moral behavior.

Most of this literature is recent and the case is far from settled, but there is a common thread running through this body of work: As an association of individuals attains higher orders of complexity, the need to ensure trust and solidarity among its constituents grows in step. As Eric Michael Johnson remarked in Seed magazine, in reviewing de Waal’s work, such collaborative units "thrive because of the cooperation, conciliation, and, above all, the empathy that they display towards fellow members." In such a state, the callous precepts of Social Darwinism—which may be adaptive in simplistic, isolated competitive scenarios—become fundamentally maladaptive.

Intelligence in nature might be regarded as a form of causal power, a capacity to purposefully and reproducibly affect the world around us, and thus an adaptive trait for individuals and communities. Moreover, recent work in the field of organizational behavior (summarized in James Surowiecki’s The Wisdom of Crowds) suggests that under conditions of productive collaboration, an emergent "collective mind" can congeal in a group, one of which no individual member is fully aware but which enables higher-order problem-solving. And at the beginning of the last century, Pierre Teilhard de Chardin and the Russian philosopher Vladimir Vernadsky both posited that human cognition is guiding further evolution toward a so-called noosphere, a united consciousness with growing self-awareness (even as human individuality and uniqueness are preserved).

Whether one regards such notions as literal or metaphorical, the clear implication of recent work is that the process of evolution itself is nonlinear and evolving, with inflection points as one advances from individuals in a species, to small groups and communities, all the way up to cities and nations. At each emergent leap, ruthless, dog-eat-dog behavior—even if it carries some advantage at simpler levels—increasingly poisons the more complex levels of organization, where cooperation and co-opetition are called for. Unlike individuals, dynamic and evolving networks have no natural life cycle; they are in principle immortal, but their resilience and vigor vary depending on their capacity to transition across such inflection points and embrace the generosity and solidarity that, in a seeming contradiction, make them more, not less, competitive. Social Darwinism thus appeals to the selfish simplicity of human beings in isolation, in opposition to the cohesive, empowering complexity of the communities that emerge.

Co-opetition and Policymaking

These principles are of far more than mere academic interest; they are pivotal to guiding the real world of fiscal and public policy amid the tumultuous uncertainties of the twenty-first-century economy and the Great Recession. Our system’s destructive zero-sum adversarialism has reached a disastrous logical endpoint: suffocated by ideological polarization, fruitless partisan bickering, juvenile finger-pointing, stifling parliamentary obstacles (not just in the Senate), and the iron grip of moneyed interests at home and abroad, further empowered by the Supreme Court’s recent undermining of longstanding campaign finance law. Moreover, the political calculus of our entrenched winner-take-all (plurality, or "first-past-the-post") elections makes third parties a practical impossibility, so the urgent pleas of the people are thwarted, and they are denied outlets to express their frustration and effect reform.

The ultimate result is a fractious, sclerotic, and dysfunctional institutional paralysis in the United States, whichever party grasps the reins: a government incapable of tackling the fine-grained nuances of twenty-first-century public policy without collapsing into simplistic and ham-fisted ideological quarrels. Subsidized university tuition and quality public schools, universal child and health care, job-creating public works projects and infrastructure spending, government-sponsored research and conservation efforts, carefully managed unemployment and social safety nets—all of these are misleadingly cast as crude liberal-conservative battlegrounds, with the outcome benefiting one coalition or another. In reality, such policies foster the very cohesion and resilience that enable a country to weather economic storms, free up its citizens’ talents for creative and entrepreneurial endeavors, and emerge as a more competitive and self-reliant entity—thus defying tidy ideological categories. In the twentieth century, it was a Republican president, Theodore Roosevelt, who first understood this paradoxical intricacy with his trust-busting, regulating, and conservation efforts, and his pragmatist successors—FDR, Truman, and especially Eisenhower—spanned both parties.

Today, it is our global peers across our two adjoining oceans that have best managed to abide and reconcile the seeming contradiction of a competitive society steeped in compassion. Commentators across the U.S. political spectrum mistakenly regard Europe as a bastion of "leftism" or "socialism." The United States and Europe differ in their historical contexts, of course, but there is a more fundamental dynamic to the Old World’s progressive social policies, poorly appreciated in the New World: They are motivated as much by cold-eyed pragmatism and a competitive drive as they are by Europe’s genuine longing for social justice. Japan, under its new Prime Minister Yukio Hatoyama, has also recognized this juxtaposition, as has a resurging (and politically evolving) China. These ancient lands of West and East, buffeted by centuries of peasant rebellions and bloody revolutions, understand that a predatory aristocracy will inevitably devour itself amid the fury of a ravaged populace, bereft of the most basic human dignities and the promise of social mobility. It is a lesson that we, as a dynamic yet inexperienced nation in historical terms, would be wise to heed. Ω

[J. Wesley Ulm holds an M.D. from Harvard Medical School and a Ph.D. in Genetics from Harvard University. He graduated summa cum laude from Duke University in 1996. His forthcoming novel is The Leibniz Demon.]

Copyright © 2010. Democracy: A Journal of Ideas, Inc.


Copyright © 2010 Sapper's (Fair & Balanced) Rants & Raves