Tuesday, April 24, 2018

This Just In — An Orange Julius Stand Is Open In The Oval Office (5¢ Please)

This blogger extends an H/T to a youngster who reads this blog in NW Phoenix (AZ). This essay offers a truly religious take on the behavior of the current occupant of the Oval Office, one diametrically opposed to the craven acceptance of that occupant by the evangelicals who proclaim their religiosity. This blog grants no "mulligans" to the current occupant and his outrageous behavior. If this is truly (fair & balanced) theology, so be it.

[x NY Fishwrap]
Thank Trump, Or You’ll Be Sorry
By Diana Butler Bass

TagCrowd Cloud of the following piece of writing, created at TagCrowd.com

President Trump recently tweeted, “The United States, under my administration, has done a great job of ridding the region of ISIS. Where is our ‘Thank you, America?’ ”

President Trump has often criticized Americans for not being grateful enough. Now he has chastised the whole world as a thankless lot of humanity — a globe of ingrates.

Mr. Trump’s obsession with gratitude is a regular feature of his unscripted remarks and speeches. When people thank him, he likes them. But when slighted, he is quick to criticize unappreciative offenders. He has attacked Puerto Rican leaders as “politically motivated ingrates”; demanded public thanks from his cabinet and members of Congress; wants people to thank him for stock market gains; and excoriated a corporation as failing to thank him when he approved a project to its benefit.

Last December, a pro-Trump “super PAC” expressed its gratitude with a commercial, “Thank you, President Trump,” which thanked him for, among other things, “letting us say ‘Merry Christmas’ again.”

Gratitude is central to Mr. Trump’s politics. He demands it of his followers, his cabinet and, indeed, of all citizens. He deploys gratitude against his enemies and critics to embarrass and shame. Being grateful is not an option. It is a requirement.

Donald Trump has made “thank you” divisive.

Yet gratitude has always been political. Sometimes it is used toward good political ends (such as public celebrations of thanksgiving). More often, however, authoritarian leaders have used gratitude to control critics and consolidate power.

The misuse of gratitude in politics goes back a long way — ancient Rome mastered it. In that empire, structured as an economic and political pyramid, a few people at the top held most of the wealth and power. At the bottom, where most people barely survived, there was very little. What held this inherently unjust system together? There was, of course, a feared army. But there was also something else: a social structure based on a particular form of gratitude.

The emperor Caesar — the head of the Roman empire — was believed to be “lord and savior.” He owned everything, the benefactor who distributed his gifts and favors (“gratia” in Latin) at will. Even if you were a slave with a single piece of bread to eat, that bread was considered a gift of the emperor’s.

Caesar’s gifts, however, were not free. They were transactional. When you received from Caesar, you were expected to return gratitude, your “gratia,” through tributes, tithes, taxes, loyalty and military service. Until you returned appropriate thanks, you were in Caesar’s debt. If you failed to fulfill your obligation, you were an “ingrate,” which was a political crime punishable by the seizure of your property, prison, exile or execution.

Rome’s power was built on benefactors and beneficiaries bound by reciprocal obligations of gratitude. It worked, but it was easily corrupted. Lower classes incurred huge debts of gratitude that could never be repaid, functionally enslaving them. Ancient philosophers urged benefactors to eschew corrupted gratia and instead give freely from a desire for the common good. Benevolent gratitude, they insisted, was a virtue. Sadly, it was also rare.

Western societies inherited Roman ideas of gratitude. Medieval rulers tried (and failed) to Christianize political gratitude, but Enlightenment thinkers like John Locke and Adam Smith rejected quid pro quo. They argued that reciprocal gratitude was bad for politics, but also believed that benevolent gratitude was necessary for moral democracy. It was a nuanced and difficult position to achieve. The temptations of corrupted gratitude kept creeping back into Western politics.

Understanding this helps explain Donald Trump. He has always depicted himself as a benefactor: “I alone can fix it.” During the primaries, he boasted that he received no outside gifts or contributions, thus debts of gratitude would never control him. He criticized conventional forms of payback, promising to distribute social largess to the “right” people, rid the system of undeserving beneficiaries and restore upward mobility in a social pyramid. No more corporations, no more politicians. He would be the ultimate benefactor. He would make America great again from the top.

This helps explain why the Russia inquiry makes Mr. Trump angry. The suggestion that he benefited from anyone, much less a foreign government, undermines his self-image as unassailable benefactor. He never receives. He gives as he wills, and to whom he chooses. “Receivers,” like the poor, immigrants, women and persons of color, are considered weaker beings, consigned to the lower ranks of his social pyramid; when they fail to reciprocate his paternalistic generosity, they are chided for a lack of thanks.

There is, however, an alternative to the pyramid of gratitude: a table. One of the enduring images of American self-understanding is that of a Thanksgiving table, where people celebrate abundance, serve one another and make sure all are fed. People give with no expectation of return, and joy replaces obligation.

This vision of gratitude is truly virtuous, sustains the common good, ensures a circle of equality, and strengthens community. Instead of Mr. Trump’s gratitude-as-duty politics, what our country needs is a new vision of an American table of thanks. # # #

[Diana Butler Bass is an independent scholar and the author of eight books — most recently, Grateful: The Transformative Power of Giving Thanks (2018). She received a BA (social science and religious studies) from Westmont College (CA), an MATS (church history and historical theology) from Gordon-Conwell Theological Seminary (MA), and a PhD (religious studies) from Duke University (NC).]

Copyright © 2018 The New York Times Company



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2018 Sapper's (Fair & Balanced) Rants & Raves

Monday, April 23, 2018

Had Enough Yet?

Along with today's 'toon, Tom/Dan wrote:

It’s another tight week for me — was back up at the old homestead last weekend, taking advantage of bulk trash pickup this month to clear out a lot of stuff that I no longer need or have room for in my New York life, and I’m going back up this weekend to finish up the job. I’m filing this cartoon early on Friday and hoping we get through the next ten or twelve hours, let alone the weekend, without some major news story breaking. I used to routinely post my cartoons on Fridays, without constantly wondering if they were going to be completely out of date by Monday morning. I almost can’t remember what that even felt like anymore.

Dan/Tom

This blogger sympathizes with Dan/Tom's media fatigue. The current occupant of the Oval Office is the news, with all of his compulsions and borderline idiosyncratic behaviors. The awful reality is that we are dragged into the current occupant's virtual padded room. At a minimum, the behavior and the rants are exhausting. If this is a (fair & balanced) response of "Enough already," so be it.

PS: Here's A Shameless Tease — Wait Until Tomorrow!

[x TMW]
A Matter Of Perspective
By Tom Tomorrow (Dan Perkins)


Tom Tomorrow/Dan Perkins

[Dan Perkins is an editorial cartoonist better known by the pen name "Tom Tomorrow." His weekly comic strip, "This Modern World," which comments on current events from a strong liberal perspective, appears regularly in approximately 150 papers across the U.S., as well as on Daily Kos. The strip debuted in 1990 in SF Weekly. Perkins received the Robert F. Kennedy Award for Excellence in Journalism in both 1998 and 2002. When he is not working on projects related to his comic strip, Perkins writes a daily political blog, also entitled "This Modern World," which he began in December 2001. More recently, Dan Perkins, pen name Tom Tomorrow, was named the winner of the 2013 Herblock Prize for editorial cartooning. Even more recently, Dan Perkins was a runner-up for the 2015 Pulitzer Prize for Editorial Cartooning.]

Copyright © 2018 This Modern World/Tom Tomorrow (Dan Perkins)



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2018 Sapper's (Fair & Balanced) Rants & Raves

Roll Over, Stanley Kubrick — This Blogger Defeated A Rogue Computer Component (With Help)

This post was meant for Earth Day 2018 (yesterday). However, in the closest possible parallel to the death-struggle in "2001: A Space Odyssey" between the astronaut Dave Bowman and HAL, the rogue computer aboard the spaceship, this blogger waged a one-and-a-half-day struggle with a failing cable modem. Ultimately, a patient cable technician supplied the blogger with a pair of replacement devices, a modem and a router, in place of the failed original. And, voilà — the Internet connection was restored. (Cue "Also sprach Zarathustra") If this is a (fair & balanced) belated meditation for Earth Day 2018, so be it.

[x New Yorker]
Anybody There? — “2001: A Space Odyssey”: What It Means, And How It Was Made
By Dan Chiasson


TagCrowd Cloud of the following piece of writing, created at TagCrowd.com

Fifty years ago this spring, Stanley Kubrick’s confounding sci-fi masterpiece, “2001: A Space Odyssey,” had its premières across the country. In the annals of audience restlessness, these evenings rival the opening night of Stravinsky’s “Rite of Spring,” in 1913, when Parisians in osprey and tails reportedly brandished their canes and pelted the dancers with objects. A sixth of the New York première’s audience walked right out, including several executives from MGM. Many who stayed jeered throughout. Kubrick nervously shuttled between his seat in the front row and the projection booth, where he tweaked the sound and the focus. Arthur C. Clarke, Kubrick’s collaborator, was in tears at intermission. The after-party at the Plaza was “a room full of drinks and men and tension,” according to Kubrick’s wife, Christiane.

Kubrick, a doctor’s son from the Bronx who got his start as a photographer for Look, was turning forty that year, and his rise in Hollywood had left him hungry to make extravagant films on his own terms. It had been four years full of setbacks and delays since the director’s triumph, “Dr. Strangelove, Or: How I Learned to Stop Worrying and Love the Bomb.” From the look of things, the Zeitgeist was not going to strike twice. A businessman overheard on his way out of a screening spoke for many: “Well, that’s one man’s opinion.”

“2001” is a hundred and forty-two minutes, pared down from a hundred and sixty-one in a cut that Kubrick made after those disastrous premières. There is something almost taunting about the movie’s pace. “2001” isn’t long because it is dense with storytelling; it is long because Kubrick distributed its few narrative jolts as sparsely as possible. Renata Adler, in the Times, described the movie as “somewhere between hypnotic and immensely boring.” Its “uncompromising slowness,” she wrote, “makes it hard to sit through without talking.” In Harper’s, Pauline Kael wrote, “The ponderous blurry appeal of the picture may be that it takes its stoned audience out of this world to a consoling vision of a graceful world of space.” Onscreen it was 2001, but in the theatres it was still 1968, after all. Kubrick’s gleeful machinery, waltzing in time to Strauss, had bounded past an abundance of human misery on the ground.

Hippies may have saved “2001.” “Stoned audiences” flocked to the movie. David Bowie took a few drops of cannabis tincture before watching, and countless others dropped acid. According to one report, a young man at a showing in Los Angeles plunged through the movie screen, shouting, “It’s God! It’s God!” John Lennon said he saw the film “every week.” “2001” initially opened in limited release, shown only in 70-mm. on curved Cinerama screens. MGM thought it had on its hands a second “Doctor Zhivago” (1965) or “Ben-Hur” (1959), or perhaps another “Spartacus” (1960), the splashy studio hit that Kubrick, low on funds, had directed about a decade before. But instead the theatres were filling up with fans of cult films like Roger Corman’s “The Trip,” or “Psych-Out,” the early Jack Nicholson flick with music by the Strawberry Alarm Clock. These movies, though cheesy, found a new use for editing and special effects: to mimic psychedelic visions. The iconic Star Gate sequence in “2001,” when Dave Bowman, the film’s protagonist, hurtles in his space pod through a corridor of swimming kaleidoscopic colors, could even be timed, with sufficient practice, to crest with the viewer’s own hallucinations. The studio soon caught on, and a new tagline was added to the movie’s redesigned posters: “The ultimate trip.”

In “Space Odyssey: Stanley Kubrick, Arthur C. Clarke, and the Making of a Masterpiece,” the writer and filmmaker Michael Benson takes us on a different kind of trip: the long journey from the film’s conception to its opening and beyond. The power of the movie has always been unusually bound up with the story of how it was made. In 1966, Jeremy Bernstein profiled Kubrick on the “2001” set for The New Yorker, and behind-the-scenes accounts with titles like “The Making of Kubrick’s 2001” began appearing soon after the movie’s release. The grandeur of “2001”—the product of two men, Clarke and Kubrick, who were sweetly awestruck by the thought of infinite space—required, in its execution, micromanagement of a previously unimaginable degree. Kubrick’s drive to show the entire arc of human life (“from ape to angel,” as Kael dismissively put it) meant that he was making a special-effects movie of radical scope and ambition. But in his initial letter to Clarke, a science-fiction writer, engineer, and shipwreck explorer living in Ceylon, Kubrick began with the modest-sounding goal of making “the proverbial ‘really good’ science-fiction movie.” Kubrick wanted his film to explore “the reasons for believing in the existence of intelligent extraterrestrial life,” and what it would mean if we discovered it.

The outlines of a simple plot were already in place: Kubrick wanted “a space-probe with a landing and exploration of the Moon and Mars.” (The finished product opts for Jupiter instead.) But the timing of Kubrick’s letter, in March of 1964, suggested a much more ambitious and urgent project. “2001” was a science-fiction film trying not to be outrun by science itself. Kubrick was tracking NASA’s race to the moon, which threatened to siphon some of the wonder from his production. He had one advantage over reality: the film could present the marvels of the universe in lavish color and sound, on an enormous canvas. If Kubrick could make the movie he imagined, the grainy images from the lunar surface shown on dinky TV screens would seem comparatively unreal.

In Clarke, Kubrick found a willing accomplice. Clarke had served as a radar instructor in the RAF, and did two terms as chairman of the British Interplanetary Society. His reputation as perhaps the most rigorous of living sci-fi writers, the author of several critically acclaimed novels, was widespread. Kubrick needed somebody who had knowledge and imagination in equal parts. “If you can describe it,” Clarke recalls Kubrick telling him, “I can film it.” It was taken as a dare. Meeting in New York, often in the Kubricks’ cluttered apartment on the Upper East Side, the couple’s three young daughters swarming around them, they decided to start by composing a novel. Kubrick liked to work from books, and since a suitable one did not yet exist they would write it. When they weren’t working, Clarke introduced Kubrick to his telescope and taught him to use a slide rule. They studied the scientific literature on extraterrestrial life. “Much excitement when Stanley phones to say that the Russians claim to have detected radio signals from space,” Clarke wrote in his journal for April 12, 1965: “Rang Walter Sullivan at the New York Times and got the real story—merely fluctuations in Quasar CTA 102.” Kubrick grew so concerned that an alien encounter might be imminent that he sought an insurance policy from Lloyd’s of London in case his story got scooped during production.

Clarke was the authority on both the science and the science fiction, but an account he gave later provides a sense of what working with Kubrick was like: “We decided on a compromise—Stanley’s.” The world of “2001” was designed ex nihilo, and among the first details to be worked out was the look of emptiness itself. Kubrick had seen a Canadian educational film titled “Universe,” which rendered outer space by suspending inks and paints in vats of paint thinner and filming them with bright lighting at high frame rates. Slowed down to normal speed, the oozing shades and textures looked like galaxies and nebulae. Spacecraft were designed with the expert help of Harry Lange and Frederick Ordway, who ran a prominent space consultancy. A senior NASA official called Kubrick’s studio outside London “NASA East.” Model makers, architects, boatbuilders, furniture designers, sculptors, and painters were brought to the studio, while companies manufactured the film’s spacesuits, helmets, and instrument panels. The lines between film and reality were blurred. The Apollo 8 crew took in the film’s fictional space flight at a screening not long before their actual journey. NASA’s Web site has a list of all the details that “2001” got right, from flat-screen displays and in-flight entertainment to jogging astronauts. In the coming decades, conspiracy theorists would allege that Kubrick had helped the government fake the Apollo 11 moon landing.

Kubrick brought to his vision of the future the studiousness you would expect from a history film. “2001” is, in part, a fastidious period piece about a period that had yet to happen. Kubrick had seen exhibits at the 1964 World’s Fair, and pored over a magazine article titled “Home of the Future.” The lead production designer on the film, Tony Masters, noticed that the world of “2001” eventually became a distinct time and place, with the kind of coherent aesthetic that would merit a sweeping historical label, like “Georgian” or “Victorian.” “We designed a way to live,” he recalled, “down to the last knife and fork.” (The Arne Jacobsen flatware, designed in 1957, was made famous by its use in the film, and is still in production.) By rendering a not-too-distant future, Kubrick set himself up for a test: thirty-three years later, his audiences would still be around to grade his predictions. Part of his genius was that he understood how to rig the results. Many elements from his set designs were contributions from major brands—Whirlpool, Macy’s, DuPont, Parker Pens, Nikon—which quickly cashed in on their big-screen exposure. If 2001 the year looked like “2001” the movie, it was partly because the film’s imaginary design trends were made real.

Much of the film’s luxe vision of space travel was overambitious. In 1998, ahead of the launch of the International Space Station, the Times reported that the habitation module was “far cruder than the most pessimistic prognosticator could have imagined in 1968.” But the film’s look was a big hit on Earth. Olivier Mourgue’s red upholstered Djinn chairs, used on the “2001” set, became a design icon, and the high-end lofts and hotel lobbies of the year 2001 bent distinctly toward the aesthetic of Kubrick’s imagined space station.

Audiences who came to “2001” expecting a sci-fi movie got, instead, an essay on time. The plot was simple and stark. A black monolith, shaped like a domino, appears at the moment in prehistory when human ancestors discover how to use tools, and is later found, in the year 2001, just below the lunar surface, where it reflects signals toward Jupiter’s moons. At the film’s conclusion, it looms again, when the ship’s sole survivor, Dave Bowman, witnesses the eclipse of human intelligence by a vague new order of being. “2001” is therefore only partly set in 2001: as exacting as Kubrick was about imagining that moment, he swept it away in a larger survey of time, wedging his astronauts between the apelike anthropoids that populate the first section of the film, “The Dawn of Man,” and the fetal Star Child betokening the new race at its close. A mixture of plausibility and poetry, “real” science and primal symbolism, was therefore required. For “The Dawn of Man,” shot last, a team travelled to Namibia to gather stills of the desert. Back in England, a massive camera system was built to project these shots onto screens, transforming the set into an African landscape. Actors, dancers, and mimes were hired to wear meticulously constructed ape suits, wild animals were housed at the Southampton Zoo, and a dead horse was painted to look like a zebra.

For the final section of the film, “Jupiter and Beyond the Infinite,” Ordway, the film’s scientific consultant, read up on a doctoral thesis on psychedelics advised by Timothy Leary. Theology students had taken psilocybin, then attended a service at Boston University’s Marsh Chapel to see if they’d be hit with religious revelations. They dutifully reported their findings: most of the participants had indeed touched God. Such wide-ranging research was characteristic of Clarke and Kubrick’s approach, although the two men, both self-professed squares, might have saved time had they been willing to try hallucinogens themselves.

The Jupiter scenes—filled with what Michael Benson describes as “abstract, nonrepresentational, space-time astonishments”—were the product of years of trial and error spent adapting existing equipment and technologies, such as the “slit-scan” photography that finally made the famous Star Gate sequence possible. Typically used for panoramic shots of cityscapes, the technique, in the hands of Kubrick’s special-effects team, was modified to produce a psychedelic rush of color and light. Riding in Dave’s pod is like travelling through a birth canal in which someone has thrown a rave. Like the films of the late nineteenth century, “2001” manifested its invented worlds by first inventing the methods needed to construct them.

Yet some of the most striking effects in the film are its simplest. In a movie about extraterrestrial life, Kubrick faced a crucial predicament: what would the aliens look like? Cold War-era sci-fi offered a dispiriting menu of extraterrestrial avatars: supersonic birds, scaly monsters, gelatinous blobs. In their earliest meetings in New York, Clarke and Kubrick, along with Christiane, sketched drafts and consulted the Surrealist paintings of Max Ernst. For a time, Christiane was modelling clay aliens in her studio. These gargoyle-like creatures were rejected, and “ended up dotted around the garden,” according to Kubrick’s daughter Katharina. Alberto Giacometti’s sculptures of thinned and elongated humans, resembling shadows at sundown, were briefly an inspiration. In the end, Kubrick decided that “you cannot imagine the unimaginable” and, after trying more ornate designs, settled on the monolith. Its eerily neutral and silent appearance at the crossroads of human evolution evokes the same wonder for members of the audience as it does for characters in the film. Kubrick realized that, if he was going to make a film about human fear and awe, the viewer had to feel those emotions as well.

And then there is HAL, the rogue computer whose affectless red eye reflects back what it sees while, behind it, his mind whirrs with dark and secret designs. IBM consulted on the plans for HAL, but the idea to use the company’s logo fell through after Kubrick described him in a letter as “a psychotic computer.” Any discussion of Kubrick’s scientific prescience has to include HAL, whose suave, slightly effeminate voice suggests a bruised heart beating under his circuitry. In the past fifty years, our talking machines have continued to evolve, but none of them have become as authentically malicious as HAL. My grandfather’s early-eighties Chrysler, borrowing the voice from Speak & Spell, would intone, “A door is ajar,” whenever you got in. It sounded like a logical fallacy, but it seemed pleasantly futuristic nonetheless. Soon voice-command technology reached the public, ushering in our current era of unreliable computer interlocutors given to unforced errors: half-comical, half-pitiful simpletons, whose fate in life is to be taunted by eleven-year-olds. Despite the reports of cackling Amazon Alexas, there has, so far, been fairly little to worry about where our talking devices are concerned. The unbearable pathos of HAL’s disconnection scene, one of the most mournful death scenes ever filmed, suggests that when we do end up with humanlike computers, we’re going to have some wild ethical dilemmas on our hands. HAL is a child, around nine years old, as he tells Dave at the moment he senses he’s finished. He’s precocious, indulged, needy, and vulnerable; more human than his human overseers, with their stilted, near robotic delivery. The dying HAL, singing “Daisy,” the tune his teacher taught him, is a sentimental trope out of Victorian fiction, more Little Nell than little green man.

As Benson’s book suggests, in a way the release of “2001” was its least important milestone. Clarke and Kubrick had been wrestling for years with questions of what the film was, and meant. These enigmas were merely handed off from creators to viewers. The critic Alexander Walker called “2001” “the first mainstream film that required an act of continuous inference” from its audiences. On set, the legions of specialists and consultants working on the minutiae took orders from Kubrick, whose conception of the whole remained in constant flux. The film’s narrative trajectory pointed inexorably toward a big ending, even a revelation, but Kubrick kept changing his mind about what that ending would be—and nobody who saw the film knew quite what to make of the one he finally chose. The film took for granted a broad cultural tolerance, if not an appetite, for enigma, as well as the time and inclination for parsing interpretive mysteries. If the first wave of audiences was baffled, it might have been because “2001” had not yet created the taste it required to be appreciated. Like “Ulysses,” or “The Waste Land,” or countless other difficult, ambiguous modernist landmarks, “2001” forged its own context. You didn’t solve it by watching it a second time, but you did settle into its mysteries.

Later audiences had another advantage. “2001” established the phenomenon of the Kubrick film: much rumored, long delayed, always a little disappointing. Casts and crews were held hostage as they withstood Kubrick’s infinite futzing, and audiences were held in eager suspense by P.R. campaigns that often oversold the films’ commercial appeal. Downstream would be midnight showings, monographs, dorm rooms, and weed, but first there was the letdown. The reason given for the films’ failures suggested the terms of their redemption: Kubrick was incapable of not making Kubrick films.

“2001” established the aesthetic and thematic palette that he used in all his subsequent films. The spaciousness of its too perfectly constructed sets, the subjugation of story and theme to abstract compositional balance, the precision choreography, even—especially—in scenes of violence and chaos, the entire repertoire of colors, angles, fonts, and textures: these were constants in films as wildly different as “Barry Lyndon” (1975) and “The Shining” (1980), “Full Metal Jacket” (1987) and “Eyes Wide Shut” (1999). So was the languorous editing of “2001,” which, when paired with abrupt temporal leaps, made eons seem short and moments seem endless, and its brilliant deployment of music to organize, and often ironize, action and character. These elements were present in some form in Kubrick’s earlier films, particularly “Dr. Strangelove,” but it was all perfected in “2001.” Because he occupied genres one at a time, each radically different from the last, you could control for what was consistently Kubrickian about everything he did. The films are designed to advance his distinct filmic vocabulary in new contexts and environments: a shuttered resort hotel, a spacious Manhattan apartment, Vietnam. Inside these disparate but meticulously constructed worlds, Kubrick’s slightly malicious intelligence determined the outcomes of every apparently free choice his protagonists made.

Though Kubrick binged on pulp sci-fi as a child, and later listened to radio broadcasts about the paranormal, “2001” has little in common with the rinky-dink conventions of movie science fiction. Its dazzling showmanship harkened back to older cinematic experiences. Film scholars sometimes discuss the earliest silent films as examples of “the cinema of attraction,” movies meant to showcase the medium itself. These films were, in essence, exhibits: simple scenes from ordinary life—a train arriving, a dog cavorting. Their only import was that they had been captured by a camera that could, magically, record movement in time. This “moving photography” was what prompted Maxim Gorky, who saw the Lumière brothers’ films at a Russian fair in 1896, to bemoan the “kingdom of shadows”—a mass of people, animals, and vehicles—rushing “straight at you,” approaching the edge of the screen, then vanishing “somewhere beyond it.”

“2001” is at its best when it evokes the “somewhere beyond.” For me, the most astounding moment of the film is a coded tribute to filmmaking itself. In “The Dawn of Man,” when a fierce leopard suddenly faces us, its eyes reflect the light from the projection system that Kubrick’s team had invented to create the illusion of a vast primordial desert. Kubrick loved the effect, and left it in. These details linger in the mind partly because they remind us that a brilliant artist, intent on mastering science and conjuring science fiction, nevertheless knew when to leave his poetry alone.

The interpretive communities convened by “2001” may persist in pockets of the culture, but I doubt whether many young people will again contend with its debts to Jung, John Cage, and Joseph Campbell. In the era of the meme, we’re more likely to find the afterlife of “2001” in fragments and glimpses than in theories and explications. The film hangs on as a staple of YouTube video essays and mashups; it remains high on lists of both the greatest films ever made and the most boring. On Giphy, you can find many iconic images from “2001” looping endlessly in seconds-long increments—a jarring compression that couldn’t be more at odds with the languid eternity Kubrick sought to capture. The very fact that you can view “2001,” along with almost every film ever shot, on a palm-size device is a future that Kubrick and Clarke may have predicted, but surely wouldn’t have wanted for their own larger-than-life movie. The film abounds in little screens, tablets, and picturephones; in 2011, Samsung fought an injunction from Apple over alleged patent violations by citing the technology in “2001” as a predecessor for its designs. Moon landings and astronaut celebrities now feel like a thing of the past. Space lost out. Those screens were the future. # # #

[Dan Chiasson is a poet, critic, and journalist; he is the author of five books. Chiasson is a professor of English at Wellesley College (MA). He received a BA summa cum laude (English) from Amherst College (MA) and a PhD (English) from Harvard University (MA).]

Copyright © 2018 The New Yorker/Condé Nast Digital



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2018 Sapper's (Fair & Balanced) Rants & Raves

Saturday, April 21, 2018

Today's Theme — "Don't Worry, Be Happy"

Today's essay caused the blogger to wonder about relative happiness among the 50 states in the USA. It would seem that the Nordic connection to relative happiness is reflected in the #1 happiest state — Minnesota. Then, this blogger noted that the Lone Star State, where he currently resides, ranks as the 28th happiest place in the US. That ranking is ironic because Texas was the 28th state admitted to the Union. If this is (fair & balanced) regret that Thomas Jefferson tasked us all to pursue happiness, so be it.

[x HNN]
Which Nations Are The Happiest?
By Lawrence Wittner


TagCrowd Cloud of the following piece of writing, created at TagCrowd.com

America’s oft-quoted Declaration of Independence, when discussing “unalienable rights,” focused on “Life, Liberty, and the pursuit of Happiness.” Although “happiness” is rarely referred to by today’s government officials, the general assumption in the United States and elsewhere is that governments are supposed to be fostering the happiness of their citizens.

Against this backdrop, it’s worth taking a look at the 2018 World Happiness Report, released in mid-March 2018 by the Sustainable Development Solutions Network, a UN venture. Drawing on Gallup Poll surveys of the citizens of 156 countries from 2015 to 2017, the report, written by a group of eminent scholars, focused on the influence of per capita gross domestic product, social support, healthy life expectancy, social freedom, generosity, and absence of corruption in securing public happiness. Its rankings were based on assessments by people in different nations of their well-being.

What the report found was that the nations whose people reported themselves happiest were: Finland, Norway, Denmark, Iceland, and Switzerland. And right after these came the Netherlands, Canada, New Zealand, Sweden, and Australia. By contrast, the allegedly “great” powers that have proclaimed themselves models for the world―including the United States, Britain, France, Germany, Russia, and China―fared relatively poorly. The United States ranked 18th (having dropped four spots from the previous year’s report), while Russia placed 59th and China 86th.

The ten happiest nations share a number of characteristics. Preeminent among them is the fact that they have been governed for varying periods by Social Democratic and other parties of the moderate Left that have provided the populations of their countries with advanced social welfare institutions. These include national health care, free or inexpensive higher education, child and family support, old-age pensions, public housing, mass transit facilities, and job retraining programs―usually funded by substantial taxes, especially on the wealthy. Certainly the five Nordic countries that appear among the ten happiest nations, with four of them occupying the top four spots, fit this model very well. As Meik Wiking, CEO of Copenhagen’s Happiness Research Institute, remarked, they “are good at converting wealth into well-being.” They have also been good at defending the rights of workers, women, immigrants, racial and religious minorities, and other disadvantaged groups.

Of course, it’s also true that the ten happiest nations are relatively prosperous ones. And the least happy nations tend to be those that are most impoverished [PDF].

Nonetheless, wealth alone cannot account for the highest rankings. Finland (#1) has less than half the wealth per adult that the United States (#18) has, while Norway, Denmark, Sweden, New Zealand, Canada, and the Netherlands also lag behind the United States in average wealth per adult. Discounting the dominant influence of wealth on national happiness, the writers of the report contend that belonging and respect in civil society also play vital roles, as do “high levels of mutual trust, shared purpose, generosity, and good governance.” A minimum level of economic well-being might be necessary to pull people out of misery, but once this level has been reached, further wealth doesn’t necessarily produce greater happiness.

The United States provides a good example of this point. As Professor Jeffrey Sachs of Columbia University observes in the 2017 report [PDF], America’s per capita gross domestic product continues rising, “but happiness is now actually falling.” Indeed, between 2012 and 2018, the US happiness ranking dropped from 11th to 18th. Over the years, he notes, there has been a “destruction of social capital” in the United States, caused by the power of “mega-dollars in US politics,” “soaring income and wealth inequality,” a “decline in social trust” related to immigration, a climate of fear triggered by the 9/11 attacks, and a “severe deterioration of America’s educational system.” Economic growth, Sachs argues, has not been (and will not be) successful in fostering greater American happiness. That will only be secured “by addressing America’s multi-faceted social problems―rising inequality, corruption, isolation, and distrust.”

One factor not considered by the report is the role of violence in reducing happiness. The ten happiest nations certainly have much lower murder rates than the United States and Russia. Also, when it comes to gun homicide rates, these vary from 1/35th (Norway) to 1/8th (Canada) of the rate in the United States. It is also worth noting that at least five of the world’s least happy countries are war zones: Ukraine (#138), Afghanistan (#145), Syria (#150), Yemen (#152), and South Sudan (#154).

Although the ten happiest nations maintain armed forces, none can be classified as a major military power or has chosen to become one. For example, given their economic strength and technological prowess, they could easily develop nuclear weapons. But none has opted to do so. This contrasts with the nine nuclear powers, which retain some 15,000 nuclear weapons and are engaged in a new, vastly expensive nuclear arms race. Whatever else these nuclear nations have achieved through prioritizing the building of nuclear weapons, it has not―as the happiness rankings show us―led to widespread happiness among their citizens.

Overall, then, it appears that the pursuit of ever-greater wealth and military power by national governments doesn’t necessarily create happiness for their people. By contrast, governments that seek to improve everyone’s lives―or, in the words of the preamble to the US Constitution, “promote the general welfare”―do a much better job of it. # # #

[Lawrence S. Wittner is a professor of history emeritus at SUNY at Albany and the author of Confronting the Bomb: A Short History of the World Nuclear Disarmament Movement (2009). See other books by Wittner here. He received a BA and a PhD (both in history) from Columbia University (NY).]

Copyright © 2018 History News Network



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2018 Sapper's (Fair & Balanced) Rants & Raves

Friday, April 20, 2018

Today's Helpful Hint To The Current Occupant Of The Oval Office & His Crack Legal Team: Don't Mess With Federal Judge Kimba Wood

Handy Andy (Borowitz) strikes twice in the same week on this blog, a record-setting event. Borowitz alludes to the scandal of the revelation of bias at Faux News without mentioning the source of the scandal. (Hint: The photo accompanying this post provides a clue, and if another hint is necessary, the scandalous Faux News personality is known as S.H. to his friends. It is rumored that those initials prompted cries of "$hit-Head" on the playground.) If this is (fair & balanced) scandal-mongering, so be it.

[x New Yorker]
Nation Shocked To Learn Of Possible Bias At Fox News
By Handy Andy (Borowitz)


TagCrowd Cloud of the following piece of writing, created at TagCrowd.com

Millions of Americans were stunned and incredulous on Monday after learning of a possible incident of bias at Fox News Channel.

At a time when so many American institutions have been under attack, the possibility that Fox, one of the nation’s most respected news organizations, might be susceptible to hidden agendas was too much for many to take.

In interviews across the country, Fox viewers expressed disappointment, confusion, and shock that a news network known for its exacting standards had imperilled its hard-earned reputation for fairness.

“I’m devastated by this,” Carol Foyler, a viewer from Scottsdale, Arizona, said. “If we can’t trust Fox News, who can we trust?”

Tracy Klugian, a viewer from Akron, Ohio, said that he had been “walking around in a state of disbelief” since he learned of possible bias at the network. “I’m trying to be strong, but it’s tough,” he said. “I know I speak for a lot of people when I say that today was the day that America lost its innocence.”

But some Fox viewers, like Harland Dorrinson, of Topeka, Kansas, warned of a “rush to judgment” against Fox, urging people to remember the network’s stellar record of journalistic accomplishments.

“Whenever there was a national emergency, whether it was Benghazi, Hillary’s e-mails, or Obama’s birth certificate, Fox News was there,” he said. “One little mistake doesn’t wash all that away.” # # #

[Andy Borowitz is the creator of the Borowitz Report, a Web site that is a lot funnier than the stuff posted by Matt Drudge and his ilk. Borowitz is a comedian and writer whose work appears regularly in The New Yorker. He is the first winner of the National Press Club's humor award and has won seven Dot-Comedy Awards for his web site. His most recent book (and Amazon's Best Kindle Single of the Year) is An Unexpected Twist (2012). Borowitz received a BA, magna cum laude (English) from Harvard University (MA).]

Copyright © 2018 The New Yorker/Condé Nast Digital



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2018 Sapper's (Fair & Balanced) Rants & Raves

Thursday, April 19, 2018

Will The Current Occupant Of The Oval Office Speak About The OKC Bombing 23 Years Ago? (Cue The Crickets)

Today marks the 23rd anniversary of the terrorist bombing of the Alfred P. Murrah Federal Building in Oklahoma City. The primary suspect, Timothy McVeigh, was apprehended and — after trial — executed for the murder of 168 bombing victims, including 19 children. The white power movement that produced Timothy McVeigh is still very much present in the United States of America, and history professor Kathleen Belew offers a compelling explanation of the persistence of the white power movement. She omitted mention of a very important current supporter of white power: the current occupant of the Oval Office. His attacks on the Justice Department and the Federal Bureau of Investigation have import beyond the current talking-head noise about abuse of power. These attacks provide sanctuary for all of the white supremacists and their fellow travelers in the United States. If this is a (fair & balanced) warning about a national security threat, so be it.

[x NY Fishwrap]
The History Of White Power
By Kathleen Belew


TagCrowd Cloud of the following piece of writing, created at TagCrowd.com

When neo-Nazi and alt-right demonstrators attacked counterprotesters in Charlottesville, VA, last August, killing one and injuring several others, many Americans responded with surprise that white supremacists were suddenly in their midst. But white-power activism is not new, nor has it been part of an underground history. We knew. And we forgot.

Twenty-three years ago, on April 19, 1995, a Ryder rental truck filled with fertilizer exploded in front of the Alfred P. Murrah Federal Building in Oklahoma City. The bombing killed 168 people, including 19 children — the largest deliberate mass casualty event on American soil between Pearl Harbor and the Sept. 11 attacks.

And yet, in these 23 years, the bombing remains misunderstood as an example of “lone wolf” terrorism. People repeat the words of the bomber Timothy McVeigh, an avowed white-power advocate who before his execution pointed out how scary it was that one man could wreak “this kind of hell.”

But in fact, the bombing was the outgrowth of decades of activism by the white-power movement, a coalition of Ku Klux Klan, neo-Nazis, skinheads and militias, which aimed to organize a guerrilla war on the federal government and its other enemies.

Its network of activists spanned regional, generational, gender and other divides. Membership numbers are hard to pin down, but scholars estimate that in the 1980s the movement included around 25,000 hard-core members, 160,000 more who bought white-power literature and attended movement events, and 450,000 who read the literature secondhand.

These hundreds of thousands of adherents were knit tightly together. As a historian of the movement, I have spent a decade connecting threads among thousands of documents, including original correspondence and ephemera of activists, government surveillance documents, court records and newspaper reports.

From the formal unification in 1979 of previously antagonistic groups under a white-power banner, through its revolutionary turn to declare war on the government in 1983, through its militia phase in the early 1990s, the white-power movement mobilized through a cohesive social network using commonly held beliefs. Its activists operated with discipline and clarity, training in paramilitary camps and undertaking assassinations, mercenary soldiering, armed robbery, counterfeiting and weapons trafficking.

White-power violence was discussed in major newspapers, on public access television, on talk shows and morning news shows, on the radio, and portrayed in television mini-series and movies. How, then, were white-power activists so misunderstood by so many Americans so that today we are once again stunned to find them marching in our streets?

One answer is Fort Smith, AR. In 1987, prosecutors indicted 13 white supremacists on federal charges, including seditious conspiracy. Jurors heard testimony about 30 gallons of cyanide seized just before it could be used to poison the water of a major city; assassinations of a talk-radio personality, fellow group members and state troopers; and endless paramilitary training, parading and harassment of various enemies. They saw two huge laundry hampers of the movement’s military-grade weapons pushed through the courtroom. Witnesses described how separatist compounds manufactured their own Claymore-style land mines and trained in urban warfare.

But at the end of the Fort Smith trial, all 13 defendants were acquitted. Court records show that the weapons in the two laundry hampers were returned to them.

The trial was flawed from the start. Two jurors developed romantic pen-pal relationships with defendants, and one of those couples married after the trial. Large swaths of evidence were excluded, as were jurors familiar with white-power activity in the area, which had been widely reported. One juror later spoke of a belief that the Bible prohibited race mixing.

It was such an embarrassment that — along with the calamities that were also public relations disasters at Ruby Ridge, Idaho, and Waco, TX, in the early 1990s — it clouded prosecutors’ approach to Oklahoma City and other instances of white-power violence. Framing such acts as part of a movement, they decided, was too risky; easier to go after defendants individually.

Indeed, the FBI established a policy to pursue only individuals in white-power violence, with, according to FBI internal documents, “no attempts to tie individual crimes to a broader movement.” This strategy not only obscured the Oklahoma City bombing as part of a social movement but, in the years after McVeigh’s execution, also effectively erased the movement itself from public awareness.

After a brief wave of copycat violence and subsequent small-scale crackdowns, white-power activism largely relocated to the internet. There, it gathered strength even as much of the country came to believe in a colorblind, multicultural or post-racial United States. But the white-power movement reveals a sustained current of overt racism and violence in the years we thought of as peaceful, one that is resurgent today.

White-power activity in the United States is not new, nor has it been as shadowy as we may have imagined. It was known and then forgotten. We must, collectively, recognize its strength and history, or our amnesia will make it impossible to respond to such activism and violence in the present. # # #

[Kathleen Belew, an assistant professor of history at the University of Chicago, is the author of Bring the War Home: The White Power Movement and Paramilitary America (2018), her first book. She received an AB (comparative history of ideas) from the University of Washington at Seattle and a PhD (American studies) from Yale University (CT).]

Copyright © 2018 The New York Times Company



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2018 Sapper's (Fair & Balanced) Rants & Raves

Wednesday, April 18, 2018

Quel est alors ce culte de Tocqueville? (What, Then, Is This Cult Of Tocqueville?)

Once again, this blogger must utter a mea culpa for drinking the Tocquevillian Kool-Aid of the Cold War era. The blogger completely misread Democracy in America as an insight into the ethos of this country. To say that Tocqueville had a lot of skeletons in his closet is a classic understatement. If this is (fair & balanced) disillusionment, so be it.

[x The American Interest]
Non To Tocqueville!
By Ben Judah


TagCrowd Cloud of the following piece of writing, created at TagCrowd.com

Alexis de Tocqueville, the melancholy-eyed, floppy-haired French noble setting out for the Mississippi, is not who he seems. A false impression reigns in America, where more have heard of him and his work than of almost anyone else writing in French today. And for a Frenchman who hasn’t been in the United States in over 185 years, it is quite something to see him so frequently rolled out as a liberal icon in the pages of America’s premier publications.

Take the New York Times. On January 13, Tocqueville made an appearance in a Ross Douthat column questioning “Is There Life After Liberalism?” Just two weeks later he was back in the opening line of a Bret Stephens column on “The GOP’s Bonfire of the Sanities.” Meanwhile, David Brooks wove the forever-young and venturing Frenchman into not one but two of his columns last year, having, of course, integrated him three times into his column the previous year, too. Over the last year Tocqueville has also featured in Times columns on “How living abroad taught me to love America,” philanthropy, lawyers, Trump’s foreign policy, Charlottesville, and the Supreme Court, and just as frequently in the paper’s books and letters section on topics as diverse as American involvement in Asia and, oddly, Australia.

At first glance, what should be surprising here? Why wouldn’t an erudite figure from the upper reaches of aristocratic France be so often quoted chez les Anglo-Saxons? And why wouldn’t he also be canonized in French intellectual life?

Nothing at all—until you realize that Tocqueville was out of print, absent from the curriculum, and more or less unknown in France eighty years ago, whilst his name had not come up much even in America since the Civil War.

How Tocqueville got dusted off is a secret history of how the canon gets made. Elegant, eloquent, an adventurer, Tocqueville is fascinating. But even more intriguing is how exactly this forgotten 19th-century politician was enshrined as a liberal icon, obscuring his role that mattered: as cheerleader and theorist of Algerian slaughter.

Unlike his cousin François-René de Chateaubriand, France forgot almost instantly about Tocqueville, who died in 1859. His French revival did not flow from America, but straight from that systematic, little-examined, mid-20th-century project: the quest for the anti-Marx.

In Anglo-American conservatism, this quest resulted in the elevation of the thin diatribes of Friedrich Hayek, the effects of which can still be seen around us, in the stacks of The Road to Serfdom (1944, 1994) for sale at CPAC every year, and in the tomes that adorn the bookshelves of many a serious Senator. Rather than defend conservative thought the hard way (as a set of principles: skepticism, tradition, evolution), Anglo-American thinkers built a cult around Hayek’s work to rival that surrounding Das Kapital (1867, 2011)—valorizing a theory of capitalism, insisting on economic rules for history—with all the brittle dogmatism they scorned on the Left.

The French quest for the anti-Marx, however, has been far more skillfully executed, and was in all senses more profound. Tocqueville became the liberal icon for a Fifth Republic badly shaken by the streets of May 1968—their philosophe.

Behind Tocqueville’s rehabilitation stands the most brilliant intellectual operator of Gaullism, Raymond Aron. Though similarly slight, Aron was the anti-Jean-Paul Sartre: anti-Soviet and pro-American, anti-Marxist and pro-free markets, a liberal both committed to Anglo-America and—rarely for a Parisian of his time—frequently visiting Washington.

In fact, Aron’s revival of Tocqueville began in America. In a series of conferences in Berkeley in 1963 (which led to Essais Sur Les Libertés), Aron established him as both a competitor and superior to Marx. The project intensified by 1967, with Aron arguing (in Les Étapes De La Pensée Sociologique) that Tocqueville was nothing less than an equal of Durkheim, Montesquieu and of course, Marx. The first new edition of Democracy in America for the general reader in France followed quickly—in 1968.

As the student uprisings gathered steam, Charles de Gaulle did not fully appreciate what was happening and rushed to Germany to ensure that the French tank division stationed with NATO across the Rhine was loyal. But Aron better understood the stakes. “I played de Tocqueville,” wrote Aron, “just as others played Saint-Just, Robespierre, or Lenin.” He knew instantly that the mass protests, armed with Marxist ideas, had broken de Gaulle’s charismatic authority and, with it, had thrown the entire social order into question. (Once he realized this too, de Gaulle resigned in 1969, choosing fittingly to visit General Franco in Madrid on his first major foreign vacation.)

Aron knew he needed to fight back, and that he needed an anti-Marx whose thinking he could use to dismantle the triptych that was on the students’ lips: liberté, égalité, révolution. Unlike the Anglo-Saxon gospel cult of Hayek, which, rather than dismantling Marxist historical materialism, simply inverts it (“free markets must lead to prosperity”), Aron’s cult of Tocqueville was inspired, the start of a brilliant assault on 1789 by French conservative thinkers focusing on discrediting the revolutionary idea itself, not just Marxist economics.

What delighted Aron was that Democracy in America established a dichotomy between liberty and equality. The more equal men are, Tocqueville argued, the less liberty they can enjoy because of the conformist “tyranny of the majority.” Better still, in The Ancien Regime and The Revolution, Tocqueville argued French revolutions always fell back to the old centralized state of Louis XIV. In other words, revolutions are impossible, because state structures are entrenched beyond uprisings. You can never escape the old order. Together, these formed Aron’s anti-Kapital: not only will the state conquer all French revolutions, but liberté with equal égalité is not desirable. Therefore equality—the socialists’ égalité, the equality of conditions—must never be permitted.

Raymond Aron lost the hearts and minds of ’68, but he won the war. Drawing on careful partnerships with financial and political authorities (perhaps inspired by his frequent visits to the Hudson Institute in DC’s nascent think-tank scene), he succeeded in creating a cult of Tocqueville and inserting it into the French curriculum. His liberal conservatism, fringe amongst Parisian students at the time, would be what the next generation found in their Baccalauréat.

There is something charming about the young intellectual setting sail across the Atlantic, to New York, to think democracy. There is nothing charming about the mature politician crossing the Mediterranean, to Algiers, to plan colonialism.

Who was the real Tocqueville? In some senses, because that later man, that incarnation, did not endure, this question hardly matters. But when intellectuals collectively create a national monument out of a body of work, what they choose to ignore shows us what they think is irrelevant, revealing their true values. And the truth is, Tocqueville’s hostility to égalité is hardly accidental in his earlier writing. It goes on to inform all his later work and life.

The Tocqueville who shaped French history is not the famous writer, but a member of parliament from 1839 to 1851, a man who, when he was briefly French Foreign Minister in 1849, appointed his friend Arthur de Gobineau (the author of Essay on The Inequality of The Human Races (1853–1855, 2016), the source of Aryan race theory) as his Chef de Cabinet. This is the Tocqueville who was the rapporteur on the notorious 1847 Report on Algeria.

“I have often heard men who I respect,” Tocqueville wrote, “but with whom I do not agree, find it wrong that we burn harvests, that we empty silos, and finally that we seize unarmed men, women and children. These, in my view, are unfortunate necessities, but ones to which any people who want to wage war on Arabs are obliged to submit.” All this was to Tocqueville, “a necessary barbarism.”

France’s anti-Marxist Nouveaux Philosophes, who followed on from Raymond Aron, have progressively debased the concept of totalitarianism. They, like Hayek, found it everywhere: in the many pushes for more equitable living conditions, in the conceptual framework of social democracy, and now in feminist call-out culture. But like Aron, they ignored Tocqueville’s call in Algeria “to ravage the country.” Worse than denying it, they shrug off Tocqueville’s championing of colonialism, along with French liberalism’s pact with it, as just a detail of history.

The American cult of Tocqueville had different roots, but it also engaged in cherry-picking, albeit of a different sort. Democracy in America was acclaimed as a work of staggering genius almost as soon as it was published in 1835, but interest in it vanished completely after the outbreak of the American Civil War. With faith in democracy shattered, Tocqueville read as ridiculous, and his books disappeared from print.

He was saved from obscurity in 1938 by the historian George Wilson Pierson. In the throes of the Great Depression, and perhaps not by chance at a moment of crisis in faith in American exceptionalism, Pierson reconstructed Tocqueville’s journey in Tocqueville and Beaumont in America (1938, 1959), to great acclaim. This placed him on the radar for the greatest moment of American triumphalism of all: the search to make sense of a destiny now manifest in 1945. New editions of Democracy in America appeared in 1945 (Knopf), 1947 (Oxford), 1951 (Henry Regnery), 1954 (Vintage) and 1956 (New American Library). This flurry of editions was quickly placed on the reading lists of the emerging fields of American Studies and Western Civilization, and the book soon became a cornerstone of the study of US politics in a liberal arts education.

And yet it’s easy to forget that Democracy in America was written not under President Lincoln but under President Jackson, in the America of the Trail of Tears, and it can only feel strange that a book from this moment in time is the one frequently hailed as capturing some of America’s finest characteristics. True, Tocqueville, an abolitionist, condemned both African-American slavery and Native American dispossession, and did so eloquently. Yet his democracy stops long before them, in his elegiac passages on the happy slaves at work in the fields, or in his conviction that “the Indians will never civilize themselves, or that it will be too late when they may be inclined to make the experiment.” One would happily toil, one would quietly vanish—that was Tocqueville’s shrug.

In Algiers, as on the Mississippi, Tocqueville seems to have found a sense of destiny. Though it might seem jarring to us, his jump from America to Algeria presented no contradiction to him. Tocqueville saw his work as a colonial panorama: from America, the freed colony, to Algeria, the future colony. America was his inspiration for Algeria. Of Algiers, he wrote in 1832: “it is Cincinnati transported to African soil.”

In Algeria, Tocqueville found inferiors again. Writing to Arthur de Gobineau (who was by now well into developing his Aryan race theories), Tocqueville announced that a study of the Koran had convinced him that “there are few religions as deadly to men as Islam.” It was worse than polytheism, “a form of decadence rather than a form of progress in relation to paganism itself.” Degenerate, vile, amenable only to force—that was his Arab.

This is the real Tocqueville: the politician who, as the colonial expert of the Chamber of Deputies, saw his vocation in advocating for France to build its own America in North Africa. This is why he wanted to “ravage the country” yet fiercely opposed “the dictatorship” General Bugeaud ran in Algiers. Tocqueville’s indignation here was that Paris was slowing down European colonization. He was only opposed to dictatorship over Europeans in Algeria; as far as Arabs were concerned, he was a firm supporter of military rule. What he wanted was for the white settler democracy he so admired in the United States to emerge in North Africa. The Swiss Colonization Society, he lamented, was sending families to the “wildest parts of North America” and not to Algeria, because there the settlers enjoyed democratic institutions and pacified terrain.

After the conquest, Tocqueville rediscovered the enlightened imperialist in himself, exhorting the French parliament not to repeat the worst excesses of New World genocidal colonization “that has dishonored the human race.” This was not because he had renounced colonialism, but because he believed he had found a better approach: “the astonishing greatness of the British in India.” So infatuated was he with British India that during the 1857 revolt against the East India Company, which nearly shattered the empire of France’s great rival, Tocqueville agonized that a British withdrawal “would be disastrous for the future of civilization and the future of humanity.” As soon as the so-called “Sepoy Mutiny” was crushed, he rejoiced in “a victory for Christianity and civilization.” A study of Britain in India was to be his great unfinished work: the capstone to his colonial panorama.

Should Americans still read Tocqueville? Absolutely. But they ought to consider the flattering and glibly quotable extracts so recurrent in New York Times columns less as timeless verities pulled from scripture than as nuggets from a life dedicated to analyzing and actualizing empire.

To put it bluntly, the Tocqueville America needs to grapple with is the writer who saw his hymn to white-settler society as a roadmap of sorts. Reading this Tocqueville would mean facing the fact that for 19th-century Europeans, the United States was not only, or even first and foremost, an inspiration for democracy, but also one for colonization—that Tocqueville, like Cecil Rhodes, saw in the great white empire stretching over the Mississippi, the sublime.

Democracy in America is a fabulous historical artifact. I am not arguing that it should somehow be forgotten once again. But it cannot be blindly quoted, as a visionary paean to simple virtues. It’s a lot more troublesome than that. And revealing. # # #

[Ben Judah is a British-French journalist who has written for The New York Times and The [London] Sunday Times. He is the author of Fragile Empire (2013) and This Is London (2016). He received a BA, first class (modern history and politics), from Oxford University.]

Copyright © 2018 The American Interest



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2018 Sapper's (Fair & Balanced) Rants & Raves