Friday, October 01, 2004

Do Your Duty

Jim Holt has an interesting take on most things. I haven't sat out a presidential election since 1964; my pathetic voting history has been told more than once in this blog. However, I may cast my vote for a third-party candidate this year. In 1992, I voted for Ross Perot because I loved to hear him say "chicken" as he taunted the Slickster about the role of Tyson Foods in the Arkansas economy. I think that Perot drained support from Bush 41 and helped elect the Slickster. Jim Holt argues that the sole way to swing an election is to run as a third-party candidate, à la Ralph Nader. As Richard (Kinky) Friedman said as he threw his hat into the gubernatorial ring: "Why the hell not?" If that works for the Kinkster, why not me? If this is (fair & balanced) delusion, so be it.

[x NYTimes Magazine]
Is Voting Worth the Trouble?
By Jim Holt

Why does voting in a presidential election feel at the same time both terribly important and utterly pointless? There is a paradox here, and it is not easy to make it go away. On the one hand, casting a ballot on Election Day strikes us as a kind of civic obligation; neglecting to do so is perhaps not so serious as neglecting to file a tax return, but it is still something you feel guilty about. On the other hand, nearly half of those Americans who are eligible to vote evidently don't think that it's worth the bother. And, in a sense, they're right.

Some nonvoters, no doubt, couldn't care less about which candidate wins. (The ancient Greeks had a word for a person who is indifferent to public affairs in this way: idiotes, or idiot.) Others may be passionately interested in which candidate wins, but they suspect that their own ballot is immaterial to the outcome.

What is the chance, after all, that a single vote will swing an election? That is a tricky question, depending as it does on how close the race is. Still, ballpark estimates have been proposed. The simplest of them is just one divided by the total number of voters. (This is the chance that a given voter casts the last necessary vote for the winner.) Since there are a hundred million voters or so in U.S. presidential elections these days, the probability that any one of them will decide the outcome is on the order of .00000001.

If that is the infinitesimal impact you can expect to have, is it rational to take the trouble to cast a ballot? Perhaps not. Suppose you're one of the proverbial voters who "vote their pocketbooks." Let's say that the benefit to you personally if candidate A beats candidate B would be a million dollars (that might be because of tax cuts, not having your job outsourced, etc.). If you multiply this benefit by the probability that you could affect the election (.00000001) you end up with . . . one lousy cent. That is what economists call the expected utility of your vote. But wait -- voting also has costs, both in time (getting to the polling place, waiting in line) and money (I had to use two 37-cent stamps last week to send the last four digits of my Social Security number to the local Board of Elections). To be conservative, let's put the cost of voting at $10. The expected payoff is a penny. This is one lottery ticket you don't want to buy.
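For readers who want the arithmetic laid out, here is a minimal back-of-the-envelope sketch (in Python) of the expected-utility calculation Holt describes above; the variable names are mine, and the dollar figures are his illustrative assumptions, not data.

    # Back-of-the-envelope sketch of Holt's expected-utility arithmetic (illustrative figures).
    electorate = 100_000_000              # roughly the number of U.S. presidential voters
    p_decisive = 1 / electorate           # chance a single ballot decides the outcome: 0.00000001
    personal_benefit = 1_000_000          # dollars gained if your preferred candidate wins
    cost_of_voting = 10                   # stamps, travel, time in line (Holt's conservative estimate)

    expected_payoff = p_decisive * personal_benefit        # $0.01, "one lousy cent"
    net_expected_value = expected_payoff - cost_of_voting  # about -$9.99

    print(f"expected payoff: ${expected_payoff:.2f}")
    print(f"net expected value: ${net_expected_value:.2f}")

On these assumptions the gross expected payoff is a penny set against a $10 cost, which is the force of Holt's lottery-ticket comparison.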

The fact that more than half of the U.S. electorate nevertheless does go through the effort of voting is something of a puzzlement to political scientists who theorize about rational choice. Some of them have speculated that people must greatly overestimate the likelihood that their vote will be decisive, perhaps because of hearing about past elections that supposedly turned on a couple of hundred votes. Others have wondered whether voters aren't motivated primarily by a desire for self-expression, or by the "entertainment value" of going to the polls, or even by a fascination with voting machines. (Perhaps this is why so many people bestir themselves to vote without bothering to learn anything about the issues.)

Even though an individual voter is virtually powerless to affect an election, some voters are more powerless than others, thanks to the Electoral College system. Here too, though, the reality is not what it seems. Most people appear to believe that the Electoral College favors voters in less populous states, since each state gets an extra two electors corresponding to its two senators, regardless of how paltry its population is. Thus the state of Wyoming (half a million people, 3 electors) has almost four times the representation per capita in the Electoral College as California (36 million people, 55 electors).

But there is another feature of the Electoral College system that rewards large states: the winner-take-all rule. In 48 of the 50 states, the candidate who wins the popular vote gets all the state's electors. This means that a voter in California commands potential influence over a far larger bloc of electors than one in Wyoming; and large blocs are vastly more likely to swing elections than small blocs. In fact, as the political scientist Steven J. Brams of New York University has shown, an individual voter in a large state can have as much as three times the power of one in a small state. And presidential campaigns seem to know this: they spend considerably more money per elector in large states.

Of course, living in a big state doesn't mitigate your powerlessness if that state is one-sided, like New York or Texas. And even in the notoriously close 2000 contest in Florida, the chance of a single voter tipping the election was only on the order of one in 10,000.

The moral, if there is one, is to vote out of duty, not self-interest. Why duty? For the simple reason that (as the Marquis de Condorcet once suggested) the more people who vote, the greater the chance of a happy result -- provided that each person is more likely to vote for the superior candidate. (If you fail to meet that proviso, stay home, for heaven's sake.) But be aware, as you pull the lever, that your action will not swing the election. If that's your goal, run for president on a third-party ticket to draw votes away from the candidate you want to see lose. Or get appointed to the Supreme Court.



Jim Holt writes the "Egghead" column for Slate. He also writes for The New Yorker and the New York Times Magazine.

Copyright © 2004 The New York Times Company

Student Evaluation Blues

One of the things I hated most about the Collegium Excellens (my place of employment for 32 years) was the implementation of student evaluations in the early 1980s. That first year, the procedure was a convoluted dance wherein each department assigned its faculty to distribute and take up the scannable forms in classes other than their own. The presumption was that professors could not be trusted with the forms and would attempt to skew the results by means of a coercive demeanor during the evaluation period. Thus, on the assigned day, a different professor would meet the class, distribute the forms, collect the completed forms, and notify the evaluatee (lurking about the hall outside the classroom) that the class was waiting. My favorite evaluation story did not involve me. The English Department harbored the worst teacher among all of us. This poor man walked into one classroom after midterm one year and proceeded to lecture the class. He was interrupted by the actual teacher of the class, who was tardy to that class meeting. Professor Incompetent taught in the classroom next door! On evaluation day, Professor Incompetent was blabbing away when a younger colleague entered the room. Incompetent looked puzzled. The new arrival said, "I'm here to do the student evaluation of this class, Incompetent." The confused man gathered his papers and walked toward the door. As he got to the door, Incompetent whirled around and, in a loud voice, proclaimed: "I just want to say that this is the finest class it has ever been my pleasure to teach!" With that, Incompetent strolled out of the room. Incompetent—like me—retired from the Collegium Excellens after decades of uttering nonsense. If this is (fair & balanced) appraisal, so be it.

[x Poets & Writers]
All Entertainment All the Time
by Mark Edmundson

The following essay is excerpted from Why Read? by Mark Edmundson, which was published in September by Bloomsbury.

I can date my sense that something was going badly wrong in my own teaching to a particular event. It took place on evaluation day in a class I was giving on the works of Sigmund Freud. The class met twice a week, late in the afternoon, and the students, about fifty undergraduates, tended to drag in and slump, looking slightly disconsolate, waiting for a jump start. To get the discussion moving, I often provided a joke, an anecdote, an amusing query. When you were a child, I had asked a few weeks before, were your Halloween costumes id costumes, superego costumes, or ego costumes? Were you monsters—creatures from the black lagoon, vampires and werewolves? Were you Wonder Women and Supermen? Or were you something in between? It often took this sort of thing to raise them from their habitual torpor.

But today, evaluation day, they were full of life. As I passed out the assessment forms, a buzz rose up in the room. Today they were writing their course evaluations; their evaluations of Freud, their evaluations of me. They were pitched into high gear. As I hurried from the room, I looked over my shoulder to see them scribbling away like the devil’s auditors. They were writing furiously, even the ones who struggled to squeeze out their papers and journal entries word by word.

But why was I distressed, bolting out the door of my classroom, where I usually held easy sway? Chances were that the evaluations would be much like what they had been in the past: they’d be just fine. And in fact, they were. I was commended for being “interesting,” and complimented for my relaxed and tolerant ways; my sense of humor and capacity to connect the material we were studying with contemporary culture came in for praise.

In many ways, I was grateful for the evaluations, as I always had been, just as I’m grateful for the chance to teach in an excellent university surrounded everywhere with very bright people. But as I ran from that classroom, full of anxious intimations, and then later as I sat to read the reports, I began to feel that there was something wrong. There was an undercurrent to the whole process I didn’t like. I was disturbed by the evaluation forms themselves with their number ratings (“What is your ranking of the instructor?—1, 2, 3, 4 or 5”), which called to mind the sheets they circulate after a TV pilot plays to the test audience in Burbank. Nor did I like the image of myself that emerged—a figure of learned but humorous detachment, laid-back, easygoing, cool. But most of all, I was disturbed by the attitude of calm consumer expertise that pervaded the responses. I was put off by the serenely implicit belief that the function of Freud—or, as I’d seen it expressed on other forms, in other classes, the function of Shakespeare, of Wordsworth or of Blake—was diversion and entertainment. “Edmundson has done a fantastic job,” said one reviewer, “of presenting this difficult, important and controversial material in an enjoyable and approachable way.”

Enjoyable: I enjoyed the teacher. I enjoyed the reading. Enjoyed the course. It was pleasurable, diverting, part of the culture of readily accessible, manufactured bliss: the culture of Total Entertainment All the Time.

As I read the reviews, I thought of a story I’d heard about a Columbia University instructor who issued a two-part question at the end of his literature course. Part one: What book in the course did you most dislike? Part two: What flaws of intellect or character does that dislike point up in you? The hand that framed those questions may have been slightly heavy. But at least it compelled the students to see intellectual work as a confrontation between two people, reader and author, where the stakes mattered. The Columbia students were asked to relate the quality of an encounter, not rate the action as though it had unfolded across the big screen. A form of media connoisseurship was what my students took as their natural right.

But why exactly were they describing the Oedipus complex and the death drive as interesting and enjoyable to contemplate? Why were they staring into the abyss, as Lionel Trilling once described his own students as having done, and commending it for being a singularly dark and fascinatingly contoured abyss, one sure to survive as an object of edifying contemplation for years to come? Why is the great confrontation—the rugged battle of fate where strength is born, to recall Emerson—so conspicuously missing? Why hadn’t anyone been changed by my course?

To that question, I began to compound an answer. We Americans live in a consumer culture, and it does not stop short at the university’s walls. University culture, like American culture at large, is ever more devoted to consumption and entertainment, to the using and using up of goods and images. We Americans are six percent of the world’s population: We use a quarter of its oil; we gorge while others go hungry; we consume everything with a vengeance and then we produce movies and TV shows and ads to celebrate the whole consumer loop. We make it—or we appropriate it—we “enjoy” it and we burn it up, pretty much whatever “it” is. For someone coming of age in America now, I thought, there are few available alternatives to the consumer worldview. Students didn’t ask for it, much less create it, but they brought a consumer Weltanschauung to school, where it exerted a potent influence.

The students who enter my classes on day one are generally devotees of spectatorship and of consumer-cool. Whether they’re sorority-fraternity denizens, piercer-tattooers, gay or straight, black or white, they are, nearly across the board, very, very self-contained. On good days, there’s a light, appealing glow; on bad days, shuffling disgruntlement. But there is little fire, little force of spirit or mind in evidence.

More and more, we Americans like to watch (and not to do). In fact watching is our ultimate addiction. My students were the progeny of two hundred available cable channels and omnipresent Blockbuster outlets. They grew up with their noses pressed against the window of that second spectral world that spins parallel to our own, the World Wide Web. There they met life at second or third hand, peering eagerly, taking in the passing show, but staying remote, apparently untouched by it. So conditioned, they found it almost natural to come at the rest of life with a sense of aristocratic expectation: “What have you to show me that I haven’t yet seen?”

But with this remove comes timidity, a fear of being directly confronted. There’s an anxiety at having to face life firsthand. (The way the word “like” punctuates students’ speech—“I was like really late for like class”—indicates a discomfort with immediate experience and a wish to maintain distance, to live in a simulation.) These students were, I thought, inclined to be both lordly and afraid.

The classroom atmosphere they most treasured was relaxed, laid-back, cool. The teacher should never get exercised about anything, on pain of being written off as a buffoon. Nor should she create an atmosphere of vital contention, where students lost their composure, spoke out, became passionate, expressed their deeper thoughts and fears, or did anything that might cause embarrassment. Embarrassment was the worst thing that could befall one; it must be avoided at whatever cost.

Early on, I had been a reader of Marshall McLuhan, and I was reminded of his hypothesis that the media on which we as a culture have become dependent are themselves cool. TV, which seemed on the point of demise, so absurd had it become to the culture of the late sixties, rules again. To disdain TV now is bad form; it signifies that you take yourself far too seriously. TV is a tranquilizing medium, a soporific, inducing in its devotees a light narcosis. It reduces anxiety, steadies and quiets the nerves. But also deadens. Like every narcotic, it will, consumed in certain doses, produce something like a hangover, the habitual watchers’ irritable languor that persists after the TV is off. It’s been said that the illusion of knowing and control that heroin engenders isn’t entirely unlike the TV consumer’s habitual smug-torpor, and that seems about right.

Those who appeal most on TV over the long haul are low-key and nonassertive. Enthusiasm quickly looks absurd. The form of character that’s most ingratiating on the tube, that’s most in tune with the medium itself, is laid-back, tranquil, self-contained, and self-assured. The news anchor, the talk-show host, the announcer, the late-night favorite—all are prone to display a sure sense of human nature, avoidance of illusion, reliance on timing and strategy rather than on aggressiveness or inspiration. With such figures, the viewer is invited to identify. On what’s called reality TV, on game shows, quiz shows, inane contests, we see people behaving absurdly, outraging the cool medium with their firework personalities. Against such excess the audience defines itself as worldly, laid-back, and wise.

Is there also a financial side to the culture of cool? I believed that I saw as much. A cool youth culture is a marketing bonanza for producers of the right products, who do all they can to enlarge that culture and keep it humming. The Internet, TV, and magazines teem with what I came to think of as persona ads, ads for Nikes and Reeboks, and Jeeps and Blazers that don’t so much endorse the powers of the product per se as show you what sort of person you’ll inevitably become once you’ve acquired it. The Jeep ad that featured hip outdoorsy kids flinging a Frisbee from mountaintop to mountaintop wasn’t so much about what Jeeps can do as it was about the kind of people who own them: vast, beautiful creatures, with godlike prowess and childlike tastes. Buy a Jeep and be one with them. The ad by itself is of little consequence, but expand its message exponentially and you have the central thrust of postmillennial consumer culture: buy in order to be. Watch (coolly) so as to learn how to be worthy of being watched (while being cool).

To the young, I thought, immersion in consumer culture, immersion in cool, is simply felt as natural. They have never known a world other than the one that accosts them from every side with images of mass-marketed perfection. Ads are everywhere: on TV, on the Internet, on billboards, in magazines, sometimes plastered on the side of a school bus. The forces that could challenge the consumer style are banished to the peripheries of culture. Rare is the student who arrives at college knowing something about the legacy of Marx or Marcuse, Gandhi or Thoreau. And by the time she does encounter them, they’re presented as diverting, interesting, entertaining—or perhaps as objects for rigorously dismissive analysis—surely not as goads to another kind of life.

As I saw it, the specter of the uncool was creating a subtle tyranny for my students. It’s apparently an easy standard to subscribe to, the standard of cool, but once committed to it, you discover that matters are different. You’re inhibited, except on ordained occasions, from showing feeling, stifled from trying to achieve anything original. Apparent expressions of exuberance now seem to occur with dimming quotation marks around them. Kids celebrating at a football game ironically play the roles of kids celebrating at a football game, as it’s been scripted on multiple TV shows and ads. There’s always self-observation, no real letting-go. Students apparently feel that even the slightest departure from the reigning code can get you genially ostracized. This is a culture tensely committed to a laid-back norm.

In the current university environment, I saw, there was only one form of knowledge that was generally acceptable. And that was knowledge that allowed you to keep your cool. It was fine to major in economics or political science or sociology, for there you could acquire ways of knowing that didn’t compel you to reveal and risk yourself. There you could stay detached. And—what was at least as important—you could acquire skills that would stand you in good financial stead later in life. You could use your education to make yourself rich. All of the disciplines that did not traduce the canons of cool were thriving. It sometimes seemed that every one of my first-year advisees wanted to major in economics, even when they had no independent interest in the subject. They’d never read an economics book, had no attraction to the business pages of the Times. They wanted economics because word had it that econ was the major that made you look best to Wall Street and the investment banks. “We like economics majors,” an investment banking recruiter reportedly said, “because they’re people who’re willing to sacrifice their educations to the interest of their careers.”

The subjects that might threaten consumer cool, literary study in particular, had to adapt. They could offer diversion—it seems that’s what I (and Freud) had been doing—or they could make themselves over to look more like the so-called hard, empirically based disciplines.

Here computers come in. Now that computers are everywhere, each area of enquiry in the humanities is more and more defined by the computer’s resources. Computers are splendid research tools. Good. The curriculum turns in the direction of research. Professors don’t ask students to try to write as Dickens would, or to experiment with thinking as he might, were he alive today. Rather, they research Dickens. They delve into his historical context; they learn what the newspapers were gossiping about on the day that the first installment of Bleak House hit the stands. We shape our tools, McLuhan said, and thereafter our tools shape us.

Many educated people in America seem persuaded that the computer is the most significant invention in human history. Those who do not master its intricacies are destined for a life of shame, poverty, and neglect. Thus more humanities courses are becoming computer-oriented, which keeps them safely in the realm of cool, financially negotiable endeavors. A professor teaching Blake’s “The Chimney Sweeper,” which depicts the exploitation of young boys whose lot is not altogether unlike the lot of many children living now in American inner cities, is likely to charge his students with using the computer to compile as much information about the poem as possible. They can find articles about chimney sweepers from 1790s newspapers; contemporary pictures and engravings that depict these unfortunate little creatures; critical articles that interpret the poem in a seemingly endless variety of ways; biographical information on Blake, with hints about events in his own boyhood that would have made chimney sweepers a special interest; portraits of the author at various stages of his life; maps of Blake’s London. Together the class might create a Blake-Chimney Sweeper Web site: www.blakesweeper.edu.

Instead of spending class time wondering what the poem means, and what application it has to present-day experience, students compile information about it. They set the poem in its historical and critical context, showing first how the poem is the product and the property of the past—and, implicitly, how it really has nothing to do with the present except as an artful curiosity; and second how, given the number of ideas about it already available, adding more thought would be superfluous.

By putting a world of facts at the end of a keystroke, computers have made facts, their command, their manipulation, their ordering, central to what now can qualify as humanistic education. The result is to suspend reflection about the differences among wisdom, knowledge, and information. Everything that can be accessed online can seem equal to everything else, no datum more important or more profound than any other. Thus the possibility presents itself that there really is no more wisdom; there is no more knowledge; there is only information. No thought is a challenge or an affront to what one currently believes.

Am I wrong to think that the kind of education on offer in the humanities now is in some measure an education for empire? The people who administer an empire need certain very precise capacities. They need to be adept technocrats. They need the kind of training that will allow them to take up an abstract and unfelt relation to the world and its peoples—a cool relation, as it were. Otherwise, they won’t be able to squeeze forth the world’s wealth without suffering debilitating pains of conscience. And the denizen of the empire needs to be able to consume the kinds of pleasures that will augment his feeling of rightful rulership. Those pleasures must be self-inflating and not challenging; they need to confirm the current empowered state of the self and not challenge it. The easy pleasures of this nascent American empire, akin to the pleasures to be had in first-century Rome, reaffirm the right to mastery—and, correspondingly, the existence of a world teeming with potential vassals and exploitable wealth.

Immersed in preprofessionalism, swimming in entertainment, my students have been sealed off from the chance to call everything they’ve valued into question, to look at new forms of life, and to risk everything. For them, education is knowing and lordly spectatorship, never the Socratic dialogue about how one ought to live one’s life.

These thoughts of mine didn’t come with any anger at my students. For who was I to blame them? They didn’t create the consumer biosphere whose air was now their purest oxygen. They weren’t the ones who should have pulled the plug on the TV or disabled the game port when they were kids. They hadn’t invited the ad flaks and money changers into their public schools. What I felt was an ongoing sense of sorrow about their foreclosed possibilities. They seemed to lack chances that I, born far poorer than most of them, but into a different world, had abundantly enjoyed.

As I read those evaluation forms and thought them over, I recalled a story. In Vienna, there was once a superb teacher of music, very old. He accepted a few students. There came to him once a young man whom all of Berlin was celebrating. Only fourteen, yet he played exquisitely. The young man arrived in Austria hoping to study with the master. At the audition, he played to perfection; everyone surrounding the old teacher attested to the fact. When it came time to make his decision, the old man didn’t hesitate. “I don’t want him,” he said. “But, master, why not?” asked a protégé. “He’s the most gifted young violinist we’ve ever heard.” “Maybe,” said the old man. “But he lacks something, and without this thing real development is not possible. What that young man lacks is inexperience.” It’s a precious possession, inexperience; my students have had it stolen from them.

But what about the universities themselves? Do they do all they can to fight the reign of consumer cool?

From the start, the university’s approach to students now has a solicitous, maybe even a servile tone. As soon as they enter their junior year in high school, and especially if they live in a prosperous zip code, the information materials, which is to say the advertising, come rolling in. Pictures, testimonials, videocassettes, and CD-ROMs (some hidden, some not) arrive at the door from colleges across the country, all trying to capture the students and their tuition dollars.

The freshman-to-be sees photographs of well-appointed dorm rooms; of elaborate phys-ed facilities; of expertly maintained sports fields; of orchestras and drama troupes; of students working joyously, off by themselves. It’s a retirement spread for the young. “Colleges don’t have admissions offices anymore, they have marketing departments,” a school financial officer said to me once. Is it surprising that someone who has been approached with photos and tapes, bells and whistles, might come to college thinking that the Shakespeare and Freud courses were also going to be agreeable treats?

How did we reach this point? In part, the answer is a matter of demographics and also of money. Aided by the GI Bill, the college-going population increased dramatically after the Second World War. Then came the baby boomers, and to accommodate them colleges continued to grow. Universities expand readily enough, but with tenure locking in faculty for lifetime jobs, and with the general reluctance of administrators to eliminate their own slots, it’s not easy for a university to contract. So after the baby boomers had passed through—like a tasty lump sliding the length of a boa constrictor—the colleges turned to promotional strategies—to advertising—to fill the empty chairs. Suddenly college, except for the few highly selective establishments, became a buyers’ market. What students and their parents wanted had to be taken potently into account. That often meant creating more comfortable, less challenging environments, places where almost no one failed, everything was enjoyable, and everyone was nice.

Just as universities must compete with one another for students, so must individual departments. At a time of rank economic anxiety (and what time is not in America?), the English department and the history department have to contend for students against the more success-ensuring branches, such as the science departments, and the commerce school. In 1968, more than 21 percent of all the bachelor’s degrees conferred in America were humanities degrees; by 1993 that total had fallen to about 13 percent, and it continues to sink. The humanities now must struggle to attract students, many of whose parents devoutly wish that they would go elsewhere.

One of the ways we’ve tried to be attractive is by loosening up. We grade much more genially than our colleagues in the sciences. In English and history, we don’t give many D’s, or C’s either. (The rigors of Chem 101 may create almost as many humanities majors per year as the splendors of Shakespeare.) A professor at Stanford explained that grades were getting better because the students were getting smarter every year. Anything, I suppose, is possible.

Along with easing up on grades, many humanities departments have relaxed major requirements. There are some good reasons for introducing more choice into the curricula and requiring fewer standard courses. But the move jibes with a tendency to serve the students instead of challenging them. Students can float in and out of classes during the first two weeks of the term without making any commitment. The common name for this span—shopping period—attests to the mentality that’s in play.

One result of the university’s widening elective leeway is to give students more power over teachers. Those who don’t like you can simply avoid you. If the students dislike you en masse, you can be left with an empty classroom. I’ve seen other professors, especially older ones, often those with the most to teach, suffer real grief at not having enough students sign up for their courses: their grading was too tough; they demanded too much; their beliefs were too far out of line with the existing dispensation. It takes only a few such incidents to draw other professors into line.

Before students arrive, universities ply them with luscious ads, guaranteeing them a cross between summer camp and lotusland. When they get to campus, flattery, entertainment, and preprofessional training are theirs, if that’s what they want. The world we present them is not a world elsewhere, an ivory tower world, but one that’s fully continuous with the American entertainment and consumer culture they’ve been living in. They hardly know they’ve left home. Is it a surprise, then, that this generation of students—steeped in consumer culture before they go off to school; treated as potent customers by the university well before they arrive, then pandered to from day one—are inclined to see the books they read as a string of entertainments to be enjoyed without effort or languidly cast aside?

So I had my answer. The university had merged almost seamlessly with the consumer culture that exists beyond its gates. Universities were running like businesses, and very effective businesses at that. Now I knew why my students were greeting great works of mind and heart as consumer goods. They came looking for what they’d had in the past, Total Entertainment All the Time, and the university at large did all it could to maintain the flow. (Though where this allegiance to the Entertainment-Consumer Complex itself came from—that is a much larger question. It would take us into politics and economics, becoming, in time, a treatise in itself.)

But what about me? Now I had to look at my own place in the culture of training and entertainment. Those course evaluations made it clear enough. I was providing diversion. To some students I was offering an intellectualized midday variant of Letterman and Leno. They got good times from my classes, and maybe a few negotiable skills, because that’s what I was offering. But what was I going to do about it? I had diagnosed the problem, all right, but as yet I had nothing approaching a plan for action.

I’d like to say that I arrived at something like a breakthrough simply by delving into my own past. In my life I’ve had a string of marvelous teachers, and thinking back on them was surely a help. But some minds—mine, at times, I confess—tend to function best in opposition. So it was looking not just to the great and good whom I’ve known, but to something like an arch-antagonist, that got me thinking in fresh ways about how to teach and why.

From the book Why Read? by Mark Edmundson. Published by Bloomsbury USA. Copyright © 2004 by Mark Edmundson. Reprinted courtesy of Bloomsbury Publishing. Available wherever books are sold.

Mark Edmundson is a professor of English at the University of Virginia. A prizewinning scholar, he has published a number of works of literary and cultural criticism, including Literature Against Philosophy, Plato to Derrida, and Teacher: The One Who Made the Difference. He has also written for such publications as the New Republic, the New York Times Magazine, the Nation, and Harper's, where he is a contributing editor.


Copyright © 2004 Poets & Writers, Inc.