Saturday, October 31, 2009

WTF? An Internet Milestone Timeline Without June 24, 2003????????????

On Tuesday, June 24, 2003, the first-ever post to this blog read:

Hello! Welcome to my world! Rather than send e-mail to my friends (and foes), I decided to enter the 21st century and publish a Web Log (Blog). Visit daily. Youneverknow.

More than 11K visitors have clicked on this blog's URL to browse among nearly 2,500 posts. However, the Associated Press 40th Anniversary Timeline of Internet Milestones contains yet another gap: it makes no mention of blogging or Wikipedia. The magic of hyperlinks mends those flaws. Now, thanks to this blog, the 40th Anniversary Timeline is improved. If this is a (fair & balanced) delusion of grandeur, so be it.

[x AP]
The Internet’s 40th Anniversary Milestone Timeline

Tag Cloud of the following timeline (created at TagCrowd.com)

Summer of 2009: Bad URLs swamp the Internet. Through the first half of 2009, IBM’s X-Force team tracks a 508% leap in the number of new malicious Web links versus the first half of 2008. Most bad links function as relays to other Web pages set up to quickly embed a wormhole (referred to as a Trojan downloader) to the hard drive of the visitor’s PC. The attacker then uses this wormhole to install code that groups the PC with thousands of other infected machines in a botnet. The attacker is then able to lease out the botnet to other criminals who need computing power to deliver spam, steal data, spread promos for fake antivirus subscriptions and hijack online banking accounts. Bad links are moot, of course, if no one clicks on them. So the Internet has become swamped with ploys to steer people to bad links. They turn up in search query results and in e-mail spam. And bad links are surging through messages and postings on popular social networks.

2009: The Koobface worm steals logons and contact lists from users of Facebook, MySpace, Twitter, YouTube, Friendster, Bebo and Hi5. It delivers bad links in messages and microblogs that appear to come from trusted acquaintances.

2009: The Seattle Post-Intelligencer becomes the first major daily newspaper to move entirely online. Google announces development of a free computer operating system designed for a user experience that primarily takes place on the Web.

2009: Twitter emerges as the fastest growing site on the Internet with 6 million unique monthly visitors and 55 million monthly visits, growing roughly 1,400 percent year over year.

2009: Conficker worm; The Conficker worm has created a secure, worldwide infrastructure for cybercrime. The worm allows its creators to remotely install software on infected machines. What will that software do? We don’t know. Most likely the worm will be used to create a botnet that will be rented out to criminals who want to send spam, steal identities, and direct users to online scams and phishing sites.

2008: Cyber thieves crack the database of Heartland Payment Systems and steal 130 million payment card transaction records over 13 months before being detected.

2008: World Internet population surpasses 1.5 billion. China’s Internet population reaches 250 million, surpassing the United States as the world’s largest. Netscape’s developers pull the plug on the pioneer browser, though an offshoot, Firefox, remains strong. Major airlines intensify deployment of Internet service on flights.

2007: Storm email virus; Poor Microsoft, always the popular target. Like Blaster and others before, this worm’s payload performed a denial-of-service attack on www.microsoft.com. During Symantec’s tests an infected machine was observed sending a burst of almost 1,800 emails in a five-minute period.


2007: Apple releases iPhone, introducing millions more to wireless Internet access.

2006: Cyber thieves breach TJX retail chain database to steal 94 million credit and debit card transaction records over an eight-month period.

2006: World Internet population surpasses 1 billion.

2005: Launch of YouTube video-sharing site.

2004: Mark Zuckerberg starts Facebook as a sophomore at Harvard University.

2004: Sasser LSASS worm; This nasty worm exploited a vulnerable network port, meaning that it could spread without user intervention. Sasser wreaked havoc on everything from the British Coast Guard to Delta Airlines, which canceled some flights because of computer infection.

2003: MSBlast RPC-DCOM worm; Blaster is a worm whose payload launched a denial-of-service attack against windowsupdate.com and carried the message, “billy gates why do you make this possible? Stop making money and fix your software!!”

2003: Slammer SQL server worm; This fast-moving worm managed to temporarily bring much of the Internet to its knees in January of 2003. The threat was so aggressive that it was mistaken by some countries to be an organized attack against them.

2002: World Internet population surpasses 500 million.

2001: Code Red IIS worm; Websites affected by the Code Red worm were defaced by the phrase “Hacked By Chinese!” At its peak, the number of infected hosts reached 359,000.

2001: Nimda email virus; A mass-mailing worm that used multiple methods to spread itself, Nimda became the Internet’s most widespread worm within 22 minutes of its release. The name of the virus came from the reversed spelling of “admin.”

2000: The dot-com boom of the 1990s becomes a bust as technology companies slide. Amazon.com, eBay and other sites are crippled in one of the first widespread uses of the denial-of-service attack, which floods a site with so much bogus traffic that legitimate users cannot visit.

2000: I Love You email virus; Who wouldn’t open an e-mail with “I Love You” in the subject line? Well, that was the problem. By May 2000, 50 million infections of this worm had been reported. The Pentagon, the CIA, and the British Parliament all had to shut down their e-mail systems in order to purge the threat.

2000: Mafiaboy installs bots on computers at Yale and Harvard universities and uses them to crash CNN’s Web site for four hours and create chaos at the Web sites of Yahoo, eBay, Amazon, Dell, Excite, and E-Trade. He brags in chat rooms that the FBI will never catch him.

1999: First Harry Potter book is published; Ricky Martin has a hit single, "Livin' la Vida Loca"; Amazon loses millions selling books online, but investors shower it with funds, and its stock price soars from $6 per share to $106, giving founder Jeff Bezos plenty to laugh about.

1999: Melissa email virus; Melissa was an exotic dancer, and David L. Smith was obsessed with her and with writing viruses. The virus he named after her and released to the world on March 26, 1999, kicked off a period of high-profile threats that rocked the Internet between 1999 and 2005.

1999: Napster popularizes music file-sharing and spawns successors that have permanently changed the recording industry. World Internet population surpasses 250 million.

1998: Google forms out of a project that began in Stanford dorm rooms. U.S. government delegates oversight of domain name policies to Internet Corporation for Assigned Names and Numbers, or ICANN. Justice Department and 20 states sue Microsoft, accusing the maker of the ubiquitous Windows operating system of abusing its market power to thwart competition from Netscape and others.

1996: Passage of U.S. law curbing pornography online. Although key provisions are later struck down as unconstitutional, one that remains protects online services from liability for their users’ conduct, allowing information — and misinformation — to thrive.

1995: Amazon.com opens its virtual doors.

1994: Marc Andreessen and others on the Mosaic team form a company to develop the first commercial Web browser, Netscape, piquing the interest of Microsoft and other developers who would tap the Web’s commerce potential. Two immigration lawyers introduce the world to spam, advertising their green card lottery services.

1993: Andreessen and colleagues at University of Illinois create Mosaic, the first Web browser to combine graphics and text on a single page, opening the Web to the world with software that is easy to use.

1990: Tim Berners-Lee creates the World Wide Web while developing ways to control computers remotely at CERN, the European Organization for Nuclear Research.

1989: Quantum Computer Services, now AOL, introduces America Online service for Macintosh and Apple II computers, beginning an expansion that would connect nearly 27 million Americans online by 2002.

1988: One of the first Internet worms, Morris, cripples thousands of computers.

1988: Morris worm. An oldie but a goodie; without Morris the current threat “superstars” wouldn’t exist. The Morris worm (or Internet worm) was created with innocent intentions. Robert Morris claims that he wrote the worm in an effort to gauge the size of the Internet. Unfortunately, the worm contained an error that caused it to infect computers multiple times, creating a denial of service.

1983: Domain name system is proposed. Creation of suffixes such as “.com,” “.gov” and “.edu” comes a year later.

1974: Vint Cerf and Bob Kahn develop communications technique called TCP, allowing multiple networks to understand one another, creating a true Internet. Concept later splits into TCP/IP before formal adoption on January 1, 1983.

1973: Arpanet gets first international nodes, in England and Norway.

1972: Ray Tomlinson brings e-mail to the network, choosing “at” symbol — @ — as way to specify e-mail addresses belonging to other systems.

1970: Arpanet gets first East Coast node, at Bolt, Beranek and Newman in Cambridge, Mass.

1969: On September 2, two computers at University of California, Los Angeles, exchange meaningless data in first test of Arpanet, an experimental military network. The first connection between two sites — UCLA and the Stanford Research Institute in Menlo Park, CA — takes place on October 29, though the network crashes after the first two letters of the word “logon.” UC Santa Barbara and University of Utah later join. Ω

Copyright © 2009 Associated Press

Get the Google Reader at no cost from Google. Click on this link to go on a tour of the Google Reader. If you read a lot of blogs, load Reader with your regular sites, then check them all on one page. The Reader's share function lets you publicize your favorite posts.

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

Friday, October 30, 2009

Checkout — Not Checkmate — For The Neo-Nazi, Bobby Fischer

In 1964, Cassius Clay fell under the spell of Elijah Muhammad and his son, Herbert Muhammad, and became Muhammad Ali. In 1963, Bobby Fischer fell under the spell of Herbert W. Armstrong and his son, Garner Ted Armstrong, and became... Bobby Fischer. Muhammad Ali suffers from the effects of repeated blows to the head and the resulting brain trauma: he is barely able to speak and has a palsy akin to Parkinson's Syndrome. Bobby Fischer, at the end of his life, was able to speak and walk. But unlike Muhammad Ali, Fischer was consumed by hatred. Ali may be brain-damaged, but Fischer was simple-minded. If this is (fair & balanced) long-distance diagnosis, so be it.

[x CSI]
Bobby Fischer: Genius and Idiot
By Martin Gardner

Genius in 1972 — Idiot in 2007

Tag Cloud of the following article (created at TagCrowd.com)

Is it possible for someone to be extremely intelligent and creative in a certain field and at the same time, in other respects, to be simple-minded? The answer is yes.

Consider Isaac Newton. He was certainly a genius in the fields of mathematics and physics. On the other hand he devoted most of his life to studying the prophecies of the Bible, calculating the year in which God created the entire universe in six days, and determining the probable year that Jesus would return!

Consider Arthur Conan Doyle. He was a brilliant writer, creator of Sherlock Holmes and Dr. Watson, yet he firmly believed in the reality of fairies. He even wrote an entire book defending the authenticity of several crude photographs of the tiny winged fairies taken by two little girls.

My third example is Bobby Fischer, perhaps the greatest chess player of all time, certainly the best known. I have written elsewhere about Newton and Doyle. Here I will tell briefly the sad story of Fischer.

Robert James Fischer was born in Chicago in 1943, the illegitimate son of Jewish parents. His Polish mother, Regina, was an active Communist and a great admirer of the Soviet Union. She had a brief affair with Bobby’s German father.

Bobby grew up in Brooklyn. At age six he became captivated by chess. At fourteen he was the U.S. chess champion. The following year he was declared a grandmaster. In 1972 he became world champion by defeating Boris Spassky at a tournament in Iceland. There is not the slightest doubt that Bobby was a genius, with a mind that could have made him a great mathematician had events in his childhood taken a different turn.

Aside from chess, Fischer came close to being a moron. I once thought his refusal to play chess on Saturday was because he was Jewish. No, it was because he had become a convert to the Worldwide Church of God, a strange sect founded by former Seventh-day Adventist Herbert W. Armstrong. Like the Adventists, Armstrong believed that Saturday is still the God-appointed Sabbath. In 1972 Bobby gave $61,000 to Armstrong, part of the prize money he had won by defeating Spassky.

The Worldwide Church of God was soon scandalized by the womanizing of Herbert’s son Garner Ted. After being excommunicated by his father, Ted moved to Tyler, Texas, where he continued to preach his father’s doctrines. Disenchanted by this rift in the Worldwide Church—and on one occasion physically assaulting a lady official of the church—Fischer left the fold to become an ardent admirer of Hitler and the Nazis!

Fischer’s hatred of Jews turned paranoid. Pictures of Hitler decorated his lodgings. He denied the Holocaust. America, he was convinced, had fallen into the hands of “stinking Jews.” When the September 11, 2001, attacks occurred, he called it “wonderful news.” Wanted by the U.S. government for violating an order not to play a return match with Spassky in Yugoslavia, Fischer renounced his U.S. citizenship and settled in Iceland.

Fischer died of kidney failure in 2008. His Japanese wife, Miyoko Watai, flew to Iceland for the funeral. A devout Buddhist and the women’s chess champion of Japan, she and Fischer were legally married after living together for a short period. Presumably she will inherit Fischer’s sizeable fortune.

John Carlin, in an article titled “The End Game of Bobby Fischer” in the Observer/Guardian (February 10, 2008), described Fischer, during his final years, as looking like a homeless bum. “His teeth were rotten, and his white hair and beard were long and unkempt.” Bobby had a low opinion of doctors and dentists. He had all the metal fillings in his teeth removed because he thought radiation from them was injuring his health, or perhaps American or Russian enemies were causing the harmful radiation from his molars. Fischer seldom changed his clothes or removed his baseball cap. After his death in 2008 at age sixty-four, he was buried late one night near a tiny church in Iceland. A brief, shabby funeral was attended by a Catholic priest he had never known.

Fischer had an older sister, Joan, who died a few years earlier. She was the wife of Russell Targ, the physicist and parapsychologist whose chief claim to fame is having validated the psychic powers of Uri Geller. Ω

[Martin Gardner graduated from the University of Chicago (B.A., 1936). His first job was as a reporter for the Tulsa Tribune. In the 1950s he moved to New York and in 1957 became associated with Scientific American, for which he wrote a column on mathematical games for many years. Gardner's most recent book is When You Were a Tadpole and I Was a Fish: And Other Speculations About This and That (2009).]

Copyright © 2009 The Committee for Skeptical Inquiry

Get the Google Reader at no cost from Google. Click on this link to go on a tour of the Google Reader. If you read a lot of blogs, load Reader with your regular sites, then check them all on one page. The Reader's share function lets you publicize your favorite posts.

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

Thursday, October 29, 2009

Eags Goes Snarky

More praise rolls in for Matthew Hoh, former Marine Captain and up-and-coming foreign service officer, who refused to drink the Kool-Aid of the Bright Shining Lie in Afghanistan by resigning from the Foreign Service. Eags (Timothy Egan) is an innocent abroad in Paris at the moment and a visit to Napoleon's Tomb brought forth some sensible insight about our Twin Quagmires: Iraq and Afghanistan. Out Now! Not one more drop of blood shed in either of those hellish places. Build a fleet of drones and rain missiles on all Shi'ite, Sunni, Ba'athist, Taliban, or Al-Qaeda malefactors until they are quiet. If a drone is fired upon by a ground-to-air weapon, the response must be doubled. Send wave after wave of drones, but shed not another drop of blood from our armed forces. Call it Operation Dyno-MITE! If this is (fair & balanced) robotic bloodlust, so be it.

PS: The POTUS (44) traveled to Dover, DE this AM to become the first president to honor the dead (18) returning from Afghanistan.

[x NY Fishwrap]
Napoleon’s Dynamite
By Timothy Egan

Tag Cloud of the following article (created at TagCrowd.com)

PARIS — He’s in there somewhere, under the gilded dome of Les Invalides in the 7th arrondissement of Paris. The Emperor of the French, Napoleon Bonaparte, is entombed by six coffins in what has to be the most spectacular sarcophagus in all the City of Light.

I stared at this extravagance of marble and mortality not long ago, thought about Napoleon’s campaigns in Russia, Italy and Prussia, the wars that briefly remade Europe, and realized that I owed a considerable part of my heritage as a citizen of the American West to the Little Corporal in the coffin.

Distracted as he was in trying to build an empire, Napoleon looked across the Atlantic and decided he had little use for the mid-section of a distant continent. Needing cash for conquest, he then sold the French holdings for a pittance to the fledgling United States.

Putting aside the fact that these lands had Native Americans living on them, with deep attachments and rights of sovereignty of their own, the United States got one of the greatest real estate deals of all time from the French.

For barely 5 cents an acre, the U.S. picked up more than 800,000 square miles in the Louisiana Purchase of 1803. With the stroke of a pen at Thomas Jefferson’s behest, and without the loss of a single life, America doubled in size.

We were wary, following the advice of Jefferson and others, of ceaseless and senseless overseas wars. Wars for territory. Wars for defense. Wars for revenge. Wars because one religion was better than another. This was not our way. We didn’t meddle. We fought “good wars,” against imperial occupiers like Great Britain and, much later, the Nazis.

And we were slow to rouse, intervening only when called to the rescue. That was — perhaps still is — our narrative as a people.

From that peaceful triumph with France, you pivot to the present day, and wonder how we will fit what are likely to be our two longest wars into this story. The United States has been in Afghanistan coming up on a decade. Iraq is not far behind.

In Iraq, some Sunnis have always hated some Shiites, and vice-versa, for more years than the United States has been a country, and they will continue to dismember each other and their children whether we are there or not. I suspect most historians will judge the Iraq War an epic mistake. Already, most of our efforts in blood and capital have been spent trying to clean up the mess of the initial hubristic invasion, an ill-planned act from the shallows of a light-thinking president. (Now moonlighting as a motivational speaker — go figure! See Jon Stewart’s take.)

The never-ending quality of that war was reinforced over the weekend, with the worst bomb blast in a year, a bloodbath at the heart of Iraqi government buildings.

Afghanistan is more difficult, of course. The jihadists who killed American citizens on September 11, 2001, had their base in that seemingly ungovernable mountain country. It is the graveyard of empires — Soviet, British — for good reason, as most Americans have come to understand. Progress is not yet a word that can be used with any credibility after eight years of war. And this month was the deadliest for U.S. forces since troops arrived, a danger heightened by Wednesday’s bombing in Peshawar.

Now comes the first United States official known to resign in protest of American strategy in Afghanistan. Matthew Hoh, former Marine Captain and up-and-coming foreign service officer, says American presence has thus far only fueled the insurgency.

Yet, to leave now, we are told, would be to abandon a country to people who live 8th century lives with 21st century weapons. And they have a hatred warped by religion — making for the worst kind of enemy.

There is little advice floating around, and much that is bad. Chief among the latter was the suggestion of Dick Cheney, co-architect of the present disasters, that the president quit “dithering.” This is particularly galling coming from a man with five draft deferments during the Vietnam War. His dithering kept him out of combat.

For the president, if thoughtful dithering produces a more enlightened policy, he will be well served by stretching time.

The rest of us can look at Napoleon’s tomb, holding the body of the man who led so many men to war, trying repeatedly to do with military might what the French could usually only do with their cultural exports.

Americans have never been empire builders. Napoleon, who was dynamite in a small package, even though he reigned before it was invented, had dreams of flying le tricolor over distant lands.

In a fit of historic distraction, this emperor gave us Montana and Missouri among many fine places. We paid less than $15 million, thanks to one leader who dithered until the right moment presented itself, and another who let his army slow bleed to collapse. Ω

[Timothy Egan writes "Outposts," a column at the NY Fishwrap online. Egan — winner of both a Pulitzer Prize in 2001 as a member of a team of reporters who wrote the series "How Race Is Lived in America" and a National Book Award (The Worst Hard Time in 2006) — graduated from the University of Washington with a degree in journalism, and was awarded an honorary doctorate of humane letters by Whitman College in 2000 for his environmental writings. Egan is the author of four other books in addition to The Worst Hard Time: The Good Rain: Across Time and Terrain in the Pacific Northwest; Lasso the Wind: Away to the New West; Breaking Blue; and The Winemaker's Daughter. Egan's most recent book is The Big Burn: Teddy Roosevelt and the Fire that Saved America (2009).]

Copyright © 2009 The New York Times Company

Get the Google Reader at no cost from Google. Click on this link to go on a tour of the Google Reader. If you read a lot of blogs, load Reader with your regular sites, then check them all on one page. The Reader's share function lets you publicize your favorite posts.

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

Wednesday, October 28, 2009

Wobegon Boy Has An Aiken Moment

In the great national debate over stay-or-go during the Vietnam era, Senator George Aiken (R-VT) is widely believed to have suggested that the U.S. should declare victory and bring the troops home. However, Wikipedia tells us:

Actually, what he said was that "the United States could well declare unilaterally... that we have 'won' in the sense that our armed forces are in control of most of the field and no potential enemy is in a position to establish its authority over South Vietnam," and that such a declaration "would herald the resumption of political warfare as the dominant theme in Vietnam." He added: "It may be a far-fetched proposal, but nothing else has worked."

Aiken's formula (Declare victory and bring 'em home.) provides the solution to the messes in both Iraq and Afghanistan. U.S. presence in Iraq for a thousand years will not end the enmity between Shi'ite and Sunni; they will kill each other whether the U.S. is present or not. U.S. presence in Afghanistan is not going to bring Afghanistan into the modern world. Enough! Kudos to Matthew Hoh for refusing to drink the Kool-Aid of the Bright Shining Lie. If this is (fair & balanced) realpolitik, so be it.

[x Salon]
Time To Move On From Afghanistan
By Garrison Keillor

Tag Cloud of the following article (created at TagCrowd.com)

The former Marine officer Matthew Hoh, who resigned his Foreign Service post in Afghanistan because he feels the war is pointless and not worth dying for, deserves all the attention he's gotten and more. The Obama administration faces hard decisions there, and the man made a good case against deeper American involvement. He says that our presence among the Pashtun people, the rural, religious people, is only aggravating a civil war between them and the urban, secular (and, it seems, fraudulent) government of Kabul, and the role of the Taliban and al-Qaida is not central — the real issues are tribal and cultural.

American families, he said, "must be reassured their dead have sacrificed for a purpose worthy of futures lost, love vanished, and promised dreams unkept. I have lost confidence such assurances can be made any more."

It is rare that a high-level official — he was the senior State Department guy in Zabul province — resigns in protest, and in all the to-do about his four-page resignation letter, nobody had a single bad thing to say about Matthew Hoh.

The American people tend not to admire quitters, which is maybe why protest resignations are so rare. You can get up on your high horse and talk about your principles, but we suspect that you're just another slacker looking for an easy way out. Your old football coach told you that when the going gets tough, the tough get going, and by "get going" he didn't mean "write a four-page letter about your disillusionment with his coaching and the split-T offense in general" — he meant, Toughen Up, Assume the Three-Point Stance, Hit 'Em Hard, Eat Some Turf, Get Up and Hit 'Em Again.

On the other hand, you don't want to be the last man to believe in the mission after everyone else has seen the light and gone home. Sunday in San Francisco, they set out to celebrate the 40th anniversary of Woodstock by gathering 3,000 guitarists in Golden Gate Park to play Jimi Hendrix's "Purple Haze" and 50 showed up and some of them were playing ukuleles. The '60s are over. Time to move on.

This is the great divide, between the true believers and the skeptics, and we cross over it every day, back and forth. On the one hand, we admire persistence and the good workers who go at the job and get it done, but then we listen to management huff and puff and realize that the ship is becalmed and liable to be boarded by pirates. Time to look for other work.

The box-elder bugs that flock into my house seeking shelter from the cold seem untroubled by skepticism. They march in and are squished and more bugs walk across the smeared innards of box-elder brethren and nobody is the wiser, the message is never passed on toward the rear.

Our time is brief. No matter how smart you are or pretty, the demand for you is limited. This is the hard lesson of adult life. Vancouver wants you to come and perform your work and you say yes and hundreds of e-mails fly back and forth — What beverage would Mr. Keillor wish us to place in the back seat of the limo? Fermented persimmon juice? Not a problem. Should the flower petals that young maidens strew in his path be rose or narcissus? — and then, two days before the big day, you are struck by a sore throat and propulsive sneezing. So you call Vancouver and tell them you can't come. They take the news calmly. They don't shriek, "No! No! Not this! Our lives will be shattered if you cancel, esteemed one." Your non-appearance is No Problemo.

And this is how you find out the hard truth. The world can get along without you pretty well.

You don't want to be the last person to write a novel in Esperanto or compose a 12-tone symphony, the last Socialist Labor candidate trying to hand out literature to the working class as they go into Wal-Mart, or the last Christian Science person to believe in the efficacy of prayer after all your friends have slipped away to have surgery, or the consumer of the last contaminated tuna left on the grocery shelf — you don't want that.

Time to move on. Tell the others. It's a brand-new day. Let us start making our way on out of Afghanistan, Mr. President. Ω

[Garrison Keillor is an author, storyteller, humorist, and creator of the weekly radio show "A Prairie Home Companion." The show began in 1974 as a live variety show on Minnesota Public Radio. In the 1980s "A Prairie Home Companion" became a pop culture phenomenon, with millions of Americans listening to Keillor's folksy tales of life in the fictional Midwestern town of Lake Wobegon, where (in Keillor's words) "the women are strong, the men are good looking, and all of the children are above average." Keillor ended the show in 1987 and in 1989 began a similar new radio show titled "American Radio Company of the Air." In 1993 he returned the show to its original name. Keillor also created the syndicated daily radio feature "A Writer's Almanac" in 1993. He has written for The New Yorker and is the author of several books, including Happy to Be Here (1990), Leaving Home (1992), Lake Wobegon Days (1995), and Good Poems for Hard Times (2005). Keillor's most recent books include a new Lake Wobegon novel, Liberty (2009), and 77 Love Sonnets (2009). His radio show inspired a 2006 movie, "A Prairie Home Companion," written by and starring Keillor and directed by Robert Altman. Keillor graduated (B.A., English) from the University of Minnesota in 1966. His signature sign-off on "The Writer's Almanac" is "Be well, do good work, and keep in touch."]

Copyright © 2009 Salon Media Group, Inc.

Get the Google Reader at no cost from Google. Click on this link to go on a tour of the Google Reader. If you read a lot of blogs, load Reader with your regular sites, then check them all on one page. The Reader's share function lets you publicize your favorite posts.

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

Tuesday, October 27, 2009

This Blog Needs An Ensurance Policy (Against Aggrieved Authors AND Readers)!

If assure, ensure, and insure all mean the same thing ("to make certain of something"), what's the big deal? If this is (fair & balanced) theoretical lexicography, so be it.

[x CJR]
Assurance Policy
By Merrill Perlman

Tag Cloud of the following article (created at TagCrowd.com)

In Washington, legislators are trying to “assure” their constituents that they are working to “ensure” that any new health-care bill will “insure” them.

All three of these transitive verbs mean the same thing: To make certain of something. (Surely you knew that.)

But there are subtle differences as well, which have evolved over the years.

Let’s start with the (relatively) easy one: “Assure.” It’s a transitive verb, to be sure, but its object should be personal—“I assure you” about something. You shouldn’t “assure” an inanimate object of anything. Yet many times “assure” is used when “ensure” is meant, as in “a new health insurance bill is supposed to assure that all people are covered.” Though Garner’s Modern American Usage says the substitution of “assure” when “ensure” is meant is “ubiquitous but …” (the verbatim Language-Change Index rating, meaning no one will be able to stop it, wrong though it may appear), the substitution does not appear frequently at all nowadays.

The differences between “insure” and “ensure” are also open to debate. “Insure,” some people say, must be used only where someone makes a legal wager that something will happen or not happen. People pay premiums for “life insurance” policies to ease the financial damage to their heirs; some financial companies have created complicated financial instruments that “insure” them against loss (though that bet did not pay off for many).

“Ensure,” some people say, should be used when the “assurance” is neither personal nor financial: “I will ensure that the next health-care bill will cover flu shots.”

The British did and do use “assure” in financial contexts where we Yankees would use “insure,” though Fowler’s Dictionary of Modern English Usage (second edition) pooh-poohed the usage. (Fowler did, however, distinguish between “life assurance,” which was “assured” of paying off eventually, and “term insurance,” which was a bet that someone would not die within a specified period.)

For many years “ensure” was viewed as a Britishism, possibly because of the “en” prefix, which sounds so, um, British. Until 1999, in fact, The New York Times mandated “insure” in both financial and nonfinancial contexts, though one of its main style gurus, Theodore M. Bernstein, was sure that there was no difference between the two words.

Nowadays, most usage authorities “assure” us that “insure” for “ensure” is perfectly fine: Garner’s supports “insure” only in financial contexts such as “life insurance,” but the Language-Change Index acknowledges the ubiquity of “insure” for “ensure.” Using “ensure” in financial contexts to mean “insure,” though, is just plain wrong. For sure, while “assurance” and “insurance” are perfectly acceptable noun forms, “ensurance” simply doesn’t exist. But rest “assured”: Writers, for the most part, have “ensured” against its appearance. Ω

[Merrill Perlman is a consultant who works with news organizations, private companies and journalism organizations, specializing in editing and the English language. She spent 25 years at The New York Times in jobs ranging from copy editor to director of copy desks, in charge of all 150-plus copy editors at The Times. Before going to The Times, she was a copy editor and assistant business editor at the Des Moines Register. Previous to that, she was a reporter and copy editor at the Southern Illinoisan newspaper. She has a bachelor of journalism degree from the University of Missouri and a master of arts in mass communication from Drake University.]

Copyright © 2009 Columbia Journalism Review

Get the Google Reader at no cost from Google. Click on this link to go on a tour of the Google Reader. If you read a lot of blogs, load Reader with your regular sites, then check them all on one page. The Reader's share function lets you publicize your favorite posts.

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

This Year's Dream Nightmare Team!

Sparky the Wonder Penguin and his trusty sidekick, Blinky the Dog, are tricked out for a Halloween party. Sparky is going as The Mighty Quinnette, Rogue of the North, and Blinky is going as Dingbat Bachmann, Wacko from the Land of 10,000 Lakes. Some sage once said that we get the leaders we deserve. Woe to the Land O'The Free and the Home O'The Brave should we see this pair of True Republican Women elected to the highest offices in the land. Their campaign theme should be "Play Misty For Me." If this is (fair & balanced) horror, so be it.

[x Salon]
This Modern World — "Trick Or Treat"
By Tom Tomorrow (Dan Perkins)

Click on image to enlarge. Ω

Tom Tomorrow/Dan Perkins

[Dan Perkins is an editorial cartoonist better known by the pen name "Tom Tomorrow". His weekly comic strip, "This Modern World," which comments on current events from a strong liberal perspective, appears regularly in approximately 150 papers across the U.S., as well as on Salon and Working for Change. The strip debuted in 1990 in SF Weekly.

Perkins, a long time resident of Brooklyn, New York, currently lives in Connecticut. He received the Robert F. Kennedy Award for Excellence in Journalism in both 1998 and 2002.

When he is not working on projects related to his comic strip, Perkins writes a daily political weblog, also entitled "This Modern World," which he began in December 2001.]

Copyright © 2009 Salon Media Group

Get the Google Reader at no cost from Google. Click on this link to go on a tour of the Google Reader. If you read a lot of blogs, load Reader with your regular sites, then check them all on one page. The Reader's share function lets you publicize your favorite posts.

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

Monday, October 26, 2009

Just A Hunka Hunka Burnin' (Aggie) Love?

Texas A&M (not an abbreviation for Agricultural & Mechanical) University pulled the biggest Aggie joke of the decade by defeating this blogger's Texas Technique Red Faiders in their annual game. The final score was humiliating for the Faiders: 52-30. Ironically, both schools changed their names from "College" to "University" in the 1960s. In the case of Texas Technique, "Tech" is not an abbreviation of the original name (Texas Technological College). Instead, Texas Tech (not an abbreviation for Technological) University enabled the athletic department to retain the iconic Double-T emblem. Similarly, the Agricultural & Mechanical College of Texas chafed at the lack of "university" in its name and gained legislative approval to change the institutional name to Texas A&M University. "A&M" is not an abbreviation of the original name. Thus the A&M in the university name allowed retention of the Aggie nickname for the athletic teams as well as for the students. However, THE football game for the Aggies is not the game with Texas Technique. Instead, the final game of each regular season in November is THE game between A&M and UT-Austin. Each year, during the month of November prior to that final game, the ultimate campus activity at A&M was Bonfire (1909?-1999). Since the disastrous 1999 collapse of the huge pile of logs, utility poles, and an outhouse at the top of the pile, with the death of twelve Aggies and injury to scores of student workers at the site, there has not been an officially sanctioned Bonfire on campus. There will not be an official Bonfire in 2009. If this is (fair & balanced) Aggie pyromania, so be it.

[x TX Monthly]
Memory Of Fire
By Jake Silverstein

[On left: Actual Bonfire at Texas A&M — On right: Computer-Generated Image For TM Cover Story]

Tag Cloud of the following article (created at TagCrowd.com)

The cliché about any great tragedy is that it creates indelible markers in time and space: Had John F. Kennedy visited Dallas in 1963 without incident, few Americans would be able to recall much about where they were or what they were doing on November 22 of that year. As shocking news spreads, it generates hundreds of thousands of individual memories that fill the dark days on the calendar. For Aggies, and for many Texans, the date of November 18, 1999, is densely packed with these grim reminders. Everyone knows where he was when he heard that the Texas A&M Bonfire had collapsed early that morning and that a number of students had been killed. The loss of life was shocking and deeply upsetting; what made it even more painful was the knowledge that A&M’s most passionately observed tradition—perhaps the most passionately observed tradition at any university in the world—was to blame.

Every story about Texas A&M is a story about tradition versus change. In this way it is the most Texan of all our schools. Not because its particular customs are so emblematic of the state—they’re more a reflection of A&M’s military history than anything else—but because a similar struggle between mythic heritage and contemporary reality has defined Texas throughout the decades. Like Texas, the Aggies have age-old rituals (Elephant Walk, senior boots) that to the outsider seem like the customs of a foreign nation. A&M prides itself on being different—really different—from other schools, and it reveres all the little details of Aggie life that make it so. Much the same thing could be said about Texas. We are all Aggies.

Or at least we were ten years ago. It is commonly said of A&M that “from the outside looking in, you can’t understand it. From the inside looking out, you can’t explain it.” But on that day, and in the weeks and months that followed, this barrier was transcended, as it is again in Pamela Colloff’s oral history of the Bonfire collapse (“Ring of Fire”). If, like me, you did not attend A&M, this story will give you a vivid sense of why Bonfire meant so much to those who stacked and burned it, and if, like me, you like to build fires, it will make you wish that you could have been there at least once to witness the rowdy glory yourself. Pam contacted nearly one hundred Aggies, including Governor Rick Perry, former San Antonio mayor Henry Cisneros, dozens of students, and many of the survivors who were on the stack that night. In their own words, from the inside looking out, they powerfully evoke the significance (and the tremendous fun) of the tradition, the terror of the collapse, and the agony over the university’s subsequent decision to suspend Bonfire.

It has not burned since. Starting in 2002, a smaller, student-run bonfire has been held off-campus, but the official Bonfire, for now, belongs to history. Had it continued, this month would have been its centennial. To mark the twin anniversary—a poignant alignment of tradition and tragedy—our cover tries to imagine what it would have looked like. Though there were hundreds of great archival images to choose from (some of which you’ll see inside), you cannot observe a moment like this with a picture of the past, so we hired a CGI firm to painstakingly build, computer-generated log by computer-generated log, a model of the stack. This is Bonfire as it might have looked on the afternoon of November 25, 2009, just hours before the fire was to be set. The outhouse has been affixed to the top, but the sheets doused with diesel fuel have not yet been placed around the lowest tier. Bonfire may never again burn as it once did, but like all great traditions, it lives in the minds of those who cherished it—part memory, part dream, part myth. Ω

[Jake Silverstein received a BA in English from Wesleyan University, an MA in English from Hollins University in Virginia, and an MFA in Creative Writing from the Michener Center for Writers at the University of Texas at Austin. He was a reporter at the Big Bend Sentinel in Marfa from 1999 to 2000 and a 2002 Fulbright Scholar in Zacatecas, Mexico. He was a Contributing Editor at Harper’s Magazine from 2003 to 2006. Silverstein joined Texas Monthly in 2006 as a Senior Editor. In September 2008 he was named Editor of Texas Monthly. His first book, Nothing Happened and Then It Did, a Chronicle in Fact and Fiction, is forthcoming in 2010.]

Copyright © 2009 Emmis Publishing dba Texas Monthly

Get the Google Reader at no cost from Google. Click on this link to go on a tour of the Google Reader. If you read a lot of blogs, load Reader with your regular sites, then check them all on one page. The Reader's share function lets you publicize your favorite posts.

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves

Sunday, October 25, 2009

Roll Over Russell Lynes, Make Way For A Lowbrow Blog!

William Pannapacker felt the sting of intellectual snobbery when a fellow grad student sneered at Pannapacker's set of Great Books: "Your clay feet are showing." Whoa! What does this blog reveal about the blogger? Is it highbrow? Is it middlebrow? Is it lowbrow? If you chose the last option, you would have been a big winner on a quiz show, like Herbert Stempel or Charles Van Doren on "Twenty One." If this is (fair & balanced) cultural phrenology, so be it.

[x CHE]
Confessions Of A Middlebrow Professor
By W.A. Pannapacker

Tag Cloud of the following article (created at TagCrowd.com)

Back in the 70s, when I was a kid, I used to run to the television—there was only one in the house in those days—whenever I heard the opening notes of the "Fanfare-Rondeau" by the French composer Jean-Joseph Mouret. As the music played, the camera panned over objects that might be found in the drawing room of an English country manor: old books, sepia photos in silver frames, musical instruments, fountain pens, a long-necked decanter, some Roman coins, a model ship of the line, and a clutch of medals from the Great War. As the music concluded, the camera came to rest on a large, leatherbound volume with marbled endpapers. On the frontispiece was "Masterpiece Theatre," Introduced by Alistair Cooke.

I was still too young to appreciate Upstairs, Downstairs, but there was something about the introduction to that program that expressed the feelings of cultural aspiration that permeated my childhood (and perhaps a touch of postcolonial complex). Neither of my parents went to college. My father repaired sewing machines, and my mother sometimes worked as a typesetter. We lived in a working-class, row-house neighborhood in Philadelphia where nearly everyone was some kind of sports fan. But our family outings were almost always educational in some way: museums (Academy of Natural Sciences, Franklin Institute, Philadelphia Museum of Art); historic sites (Independence Hall, Franklin Court); libraries; and free concerts (at the Robin Hood Dell, as I recall). We read The Philadelphia Inquirer and Time magazine (not the tabloid Philadelphia Daily News), and—in addition to Masterpiece Theatre—we watched every PBS documentary series on science and culture, including "The Ascent of Man," "Cosmos," "Life on Earth," and the granddaddy of them all, "Civilisation," with Lord (Kenneth) Clark.

Those experiences with my family marked me as different from most of the other kids, but in some ways I was proud to be different. I thought of myself as destined for great things, like college, even if I had only a vague idea what that involved. I wanted to be seen reading instead of playing. Teachers and other adults praised me, as if I was some kind of prodigy. It wasn't until I arrived in graduate school that I learned there were people who took the intellectual life for granted—who didn't think reading was praiseworthy in itself—and who looked down on the striver's culture from which I emerged as "middlebrow."

"If any human being, man, woman, dog, cat or half-crushed worm dares call me 'middlebrow,'" wrote Virginia Woolf in an unsent letter to the editor of The New Statesman, "I will take my pen and stab him, dead." Woolf claimed to love "lowbrows"; "I study them; I always sit next the conductor in an omnibus and try to get him to tell me what it is like—being a conductor." But middlebrows, she wrote, "are the people, I confess, that I seldom regard with entire cordiality." Middlebrow culture was a "mixture of geniality and sentiment stuck together with a sticky slime of calves-foot jelly."

Unlike the independent highbrows and unself-conscious lowbrows, middlebrows, it seems, are so invested in "getting on in life" that they do not really like anything unless it has been approved by their betters. For Woolf and her heirs, middlebrows are inauthentic, meretricious bounders, slaves to fashion and propriety, aping a culture they cannot understand; they are the prototypes of Hyacinth Bucket in the BBC program "Keeping Up Appearances," who answers her "pearl-white, slim-line, push-button telephone" with "The Bouquet residence, the lady of the house speaking."

Of course, the only acceptable lowbrows are the ones who know their place, who have no aspirations to anything better, such as Hyacinth's unpretentious sister, Daisy, and her unemployed husband, Onslow, the sort of bloke who attends football matches wearing a cap that holds two cans of beer.

As the Harper's Magazine editor Russell Lynes argued in his 1949 essay "Highbrow, Lowbrow, Middlebrow," the ideal world for Woolf is a caste system in which billions of bovine proles produce the raw materials for a coterie of sensitive, highbrow ectomorphs who spring fully formed from the head of Sir Leslie Stephen. At the very least, lowbrows with upward aspirations should have the courtesy to keep themselves out of sight until they complete their passage through the awkward age of the middlebrow.

In my early 20s, when I was starting out as a graduate student in the humanities, I hosted a small gathering at my apartment. It didn't take long for my guests to begin scrutinizing my bookshelves. (I do the same thing now, of course, whenever I am at a party.) I remember that there were numerous battered anthologies, at least a hundred paperback classics, the Compact Edition of the Oxford English Dictionary (acquired as a Book-of-the-Month Club premium), probably six copies of PMLA, and several shelves of books that I had retained from childhood, including the Time-Life Library of Art and the Old West Time-Life Series in "hand-tooled Naugahyde leather."

Perhaps the most revered set of volumes from my childhood—proudly displayed—was Great Books of the Western World, in 54 leatherette volumes. I remember I bought them all at once for $10 at a church sale when I was about 13; it took me two trips to carry them home in plastic grocery bags.

"Your clay feet are showing," said one of my guests, another graduate student, as she removed Volume 1 of the Great Books from my shelves. I caught the biblical allusion, but it took me a couple of years to realize the implication of the remark: My background was lacking. If graduate school was a quiz show, then I was Herbert Stempel trying to make it in the world of Charles Van Doren.

Eventually all of those beloved volumes were boxed, hidden in a closet, and replaced by hundreds of university-press monographs on literary and cultural criticism—mostly secondhand—along with ever larger piles of mostly unreadable scholarly journals. Of course, such acquisitions only affirmed my middlebrow-status anxiety, since so many of them were motivated by what I thought other people thought, rather than by my own interests.

My recollections of that experience were prompted by a recent book by Alex Beam: A Great Idea at the Time: The Rise, Fall, and Curious Afterlife of the Great Books (Public Affairs, 2008). With a healthy dose of mockery for his subject, Beam recounts the inception, production, and reception of those maligned volumes up to the present time. (His project expands a chapter from the more scholarly work of Joan Shelley Rubin in The Making of Middlebrow Culture, published by the University of North Carolina Press in 1992, which, in turn, extends the chronology of Lawrence Levine's Highbrow/Lowbrow: The Emergence of Cultural Hierarchy in America, published by Harvard University Press in 1988.)

The brainchild of the philosopher Mortimer Adler and Robert Maynard Hutchins, president of the University of Chicago, the Great Books—originally published in 1952—gained prominence in the context of the GI Bill and the post-Sputnik emphasis on intellectual competition. Perhaps more notably, it was an era of rapid social mobility, when many of those in the newly middle class were insecure about their lack of education. "The ability to Discuss and Clarify Basic Ideas is vital to success. Doors open to the man who possesses this talent," declared one advertisement for the series, and door-to-door salesmen gained entry by posing as assistant professors offering the Great Books as a public service. Something like 50,000 sets were sold—typically on installment plans—before 1961. The Great Books were expressions of hope for many people who had historically not had access to higher education.

There was something awe-inspiring about that series for me, even if I acquired it a generation late. The Great Books seemed so serious. They had small type printed in two columns; there were no annotations, no concessions to the beginner. They emphasized classical writers: Homer, Aeschylus, Sophocles, Plato, and Aristotle, among others, like Galen and Marcus Aurelius, who are still remembered but rarely read. Their readings also included Bacon, Locke, Rousseau, Kant, Gibbon, Mill, and Melville; the series functioned like a reference collection of influential texts. I'd hear someone say, "I think, therefore I am," find out that it came from Descartes, and then I'd read the first few chapters of his Meditations on First Philosophy.

The Great Books gave me a realization in my teens that was something like what Jack London described in his fictionalized autobiography, Martin Eden: "He had never dreamed that the fund of human knowledge bulked so big. He was frightened. How could his brain ever master it all? Later, he remembered that there were other men, many men, who had mastered it; and he breathed a great oath, passionately, under his breath, swearing that his brain could do what theirs had done."

But actually reading all of the Great Books was impossible; it could be undertaken only as a stunt, like the one described by Ammon Shea in Reading the OED: One Man, One Year, 21,730 Pages (Penguin Group, 2008). A few times I made schedules, like those of Benjamin Franklin and Jay Gatsby, that included daily readings (alongside regimes of diet and exercise). Like many owners of that series, my intentions were good, but I can't say I had much success at joining "the Great Conversation." I could only listen, like a seminar participant intimidated into silence.

On the other hand, I did enjoy touring the circles of hell with Dante; I chased the White Whale with Ahab, and I enjoyed reading aloud Shakespeare's soliloquies, imitating the accents of the BBC performers (I can still do Derek Jacobi). I also found Freud just in time to psychoanalyze my adolescence; and I eventually began to upset my teachers at the Father Judge Catholic High School for Boys by quoting from Nietzsche in my classes on religion.

I am sure most academics would approve of my subversive impulses as a teenager, but there was a reason that you could buy the Great Books for $10 by that time. The whole notion of a stable canon of books had gone out of fashion, and not even recently: Writers such as Dwight MacDonald had been mocking the Great Books since they first appeared. As Beam observes, "The Great Books were synonymous with boosterism, Babbittry, and H.L. Mencken's benighted boobocracy." Display them in your living room, and you might as well put plastic covers on the colonial couch beneath your reproduction Grandma Moses with the copy of The Power of Positive Thinking on your coffee table. Great Books, Beam writes, "were everything that was wrong, unchic and middlebrow about middle America."

As Paul Fussell wrote in Class: A Guide Through the American Status System, "It is in the middle-class dwelling that you're likely to spot the 54-volume set of the Great Books, together with the half-witted two-volume Syntopicon, because the middles, the great audience for how-to books, believe in authorities."

By the end of the 1980s—when I was an undergraduate—it had become clear to seemingly everyone in authority that the notion of "Greatness" was a tool of illegitimate power; Adler and Hutchins were racist and sexist in their choices of texts; their valorization of the "Western World" made them complicit with imperialism and worse. "This is more than a set of books, and more than a liberal education," said Hutchins. "Great Books of the Western World is an act of piety. Here are the sources of our being. Here is our heritage. This is the West. This is its meaning for mankind."

"Dead white men" like Adler (though he was, in reality, an urban ethnic striver, like me, who had the misfortune to still be alive) remained committed to Matthew Arnold's vision of culture as "the best that has been thought and known in the world." The Syntopicon—an anthology of writings on themes such as "Fate" and "Pain"—had exactly 102 topics, and his list of "Greats" was nonnegotiable. "This is the canon, and it's not revisable," Adler said, making himself into a straw man for the culture warriors of the 80s and 90s.

Beam makes light of Adler's inflexibility, but he does not entirely embrace the by-now clichéd disdain for the Great Books, because they represent something admirable that, perhaps, should be revived in our culture: "The animating idea behind publishing the Great Books, aside from making money for Britannica and the University of Chicago," Beam observes, "was populism, not elitism." The books were household gods. They shared the living room with the television, and they made you feel guilty for being intellectually passive, for not taking control of your own mental development, for putting democracy at risk. "And thousands of copies, perhaps tens of thousands, were actually read, and had an enormous impact on the lives of the men, women, and children who read them."

As David Brooks has observed in Bobos in Paradise: The New Upper Class and How They Got There (Simon & Schuster, 2000), middlebrow culture "seems a little dull and pretentious but well intentioned, and certainly better than some of the proudly illiterate culture that has taken its place." "Masscult" has triumphed over "midcult," coinages of Dwight MacDonald in a 1962 essay, and hardly anyone feels guilty about being entertained all the time.

The most comprehensive recent analysis of the cultural turn is Susan Jacoby's The Age of American Unreason (Pantheon, 2008). In one chapter, Jacoby remembers the 1950s as a brief moment of intellectual aspiration among many Americans: "I look back on the middlebrow with affection, gratitude, and regret rather than condescension," she writes, "not because the Book-of-the-Month Club brought works of genius into my life, but because the monthly pronouncements of its reviewers encouraged me to seek a wider world."

The Great Books—along with all those Time-Life series—were often "purchased on the installment plan by parents who had never owned a book but were willing to sacrifice to provide their children with information about the world that had been absent from their own upbringing," Jacoby writes. They represented an old American belief—now endangered—that "anyone willing to invest time and energy in self-education might better himself."

What has been lost, according to Jacoby, is a culture of intellectual effort. We are increasingly ignorant, but we do not know enough to be properly ashamed. If we are determined to get on in life, we believe it will not have anything to do with our ability to reference Machiavelli or Adam Smith at the office Christmas party. The rejection of the Great Books signifies a declining belief in the value of anything without a direct practical application, combined with the triumph of a passive entertainment—as anyone who teaches college students can probably affirm.

For all their shortcomings, the Great Books—along with many other varieties of middlebrow culture—reflected a time when the liberal arts commanded more respect. They were thought to have practical value as a remedy for parochialism, bigotry, social isolation, fanaticism, and political and economic exploitation. The Great Books had a narrower conception of "greatness" than we might like today, but their foundational ideals were radically egalitarian and proudly intellectual.

As Beam concludes, "The Great Books are dead. Long live the Great Books." And, I might add: Long live middlebrow culture. Ω

[William A. Pannapacker is an Associate Professor of English and Towsley Research Scholar at Hope College in Holland, MI. Pannapacker is the author of Revised Lives: Walt Whitman and Nineteenth-Century Authorship. Pannapacker earned a B.A. in English at St. Joseph's University, an M.A. in English at Miami University, and a Ph.D. in the History of American Civilization at Harvard University.]

Copyright © 2009 The Chronicle of Higher Education

Get the Google Reader at no cost from Google. Click on this link to go on a tour of the Google Reader. If you read a lot of blogs, load Reader with your regular sites, then check them all on one page. The Reader's share function lets you publicize your favorite posts.

Copyright © 2009 Sapper's (Fair & Balanced) Rants & Raves