Friday, July 01, 2016

A Note To The Dumbo Enablers Of Die Liegen Blödmann: Take The Jobs-Creator Canard & Shove It

In 1983, Todd Rundgren sang an anthem to joblessness: "I don't want to work, I just want to bang on the drum all day." It provides a backdrop for jobs, and the lack of jobs, in 2016.
[x YouTube/Todd RundgrenVevo]
"Bang the Drum All Day" (1983)
By Todd Rundgren with Utopia

The song makes about as much sense as the current Dumbo whining about the jobs lost in 2008... thanks to Dumbos on Wall Street and in the White House. If this is a (fair & balanced) revelation of Dumbo hypocrisy, so be it.

[x WQ]
The Great American Job Machine Is Sputtering, But Still Powerful
By Scott Winship

Hardly a week goes by without at least one commentator somewhere in America heralding the demise of the middle-class worker. Because of the historic severity of the Great Recession and its aftermath, it is not hard to stoke anxiety. The unease coheres in a conventional wisdom that connects a number of short- and long-term economic trends with today’s tepid conditions, creating a fearful narrative about the future of jobs in America. It is a narrative that mischaracterizes the past and only feeds the anxieties it claims to explain.

In the conventional story, the economy has been unable for decades to produce jobs for all the people who want them. It is also said that the middle class is becoming “hollowed out” as job growth is increasingly confined to occupations that require either very low-level skills or highly sophisticated ones—and that pay accordingly. “Job polarization,” as this pattern is called, has been driven by technological changes that have automated many “middle-skill” jobs and encouraged their offshoring to lower-paid workers in other countries. Increasingly, we are warned that robots and other products of the information technology revolution will usher in mass unemployment in the not-too-distant future. Wages will continue to stagnate or decline despite rising productivity, as they have done for decades.

How worried should we be that such a dark future awaits? In answering that question, it makes sense to focus on the experience of men during their prime working years. Women have seen such strong gains in education and employment over the past few decades as a result of increasing gender equality that it is very difficult to draw many broad conclusions about the underlying condition of the economy from their experience.

At first glance, men have had a hard time of it. Ninety-five percent of men between the ages of 25 and 54 were working in an average week in 1969. By 1983, the employment rate among this segment of the work force had fallen to 86 percent, and it fell again, dramatically, with the onset of the Great Recession. Employment had begun to recover by 2012, but it still stood at just 83 percent.

One reason employment fell is that it became harder for those looking for work to find a job. But while the unemployment rate is abnormally high at the moment, over the longer term it rises and falls with the business cycle. The main reason employment has declined is that more and more men are not looking for work. Between 1969 and 2012, the share of men ages 25 to 54 who were out of the labor force rose from four percent to 11 percent.

Remarkably, if one had forecast the 2012 labor force participation rate (the fraction of men working or looking for work) by simply extending the curve of the 1948-to-2000 decline on a graph, the prediction would have exactly matched the actual 2012 rate. That suggests that whatever is causing the rise in the share of working-age men who are out of the labor force is much more deeply rooted than the past few years of economic ups and downs.

What could be behind this change? People can be out of the labor force for reasons other than despair of finding a job. Many have illnesses or disabilities. Others, even in the 25-to-54 age group, are full-time students. Some are able to retire early. A small number of men have primary responsibility for maintaining their home while a partner works. Others are sustained by unreported sources of income, including off-the-books jobs.

In fact, the vast majority of working-age men who are out of the labor force today tell survey researchers that they do not want to work. When you factor in that preference, the story of employment decline begins to look quite different. By the conventional measure, employment among working-age men declined by seven percentage points between 1969 and 2007, just before the Great Recession, reaching 88 percent. But in my analysis based on “work-interested” men in those years, the drop amounts to just three percentage points—hardly a dire trend. (Counting only work-interested men, 94 percent were employed in 2007.)

To be sure, many of the men who are uninterested in working are only out of the labor force because federal disability benefits have been steadily extended to people who in the past would have had to look for work. Beyond its historic role as a safety net for those with severe conditions, disability has become a welfare program for able-bodied men with low skill levels, since the kinds of jobs they can get don’t command high wages. The number of working-age men drawing benefits has climbed over time, and there has been no overall increase in the incidence of health problems to explain it. Still, the rise in the number of disabled men who say they do not want to work is too small—adding another point to the three-percentage-point decline—to alter the conclusion that the drop in male employment has been modest.

When one looks beyond the core working-age group of men, the long-term employment picture is no more worrisome. Men under 25 have seen a large decline in labor force participation since 1979, but US Department of Education statistics show that this decline is mostly explained by rising high school and postsecondary enrollments. Among 18-to-19-year-old men, school enrollment rose 20 percentage points from 1980 to 2010, while labor force participation declined by 22 points. Among their slightly older peers in the 20-to-24-year-old group, school enrollment gains fully offset the participation decline.

Among men older than 54, the labor force participation rate has actually been rising for the past couple of decades, reversing an old trend. That alarms some observers, who argue that it is another sign of distress—that men are being forced out of retirement or have been unable to retire in the first place. But research by Brookings Institution economist Barry Bosworth suggests that this increase has been concentrated among the best-educated and highest-earning workers, often men who are staying on the job less for a paycheck than for mental stimulation, camaraderie, and other intangible benefits.

If the total supply of jobs has not shrunk that much over the long run, what about the supply of good jobs? In recent years, the work of MIT economist David Autor showing an increase in what he calls “job polarization” has stirred fears that the middle class is being “hollowed out.” Job growth, Autor suggests, has been occurring primarily in occupations that pay either quite well or quite poorly. Solid middle-class occupations—such as clerical, administrative, and production jobs—have seen slower growth or outright declines. A future in which the occupational structure was shaped like an hourglass—fat at the top and bottom and thin in the middle—would condemn us to rising inequality and perhaps diminished economic mobility.

A growing number of routine tasks, Autor argues, can be done by new information technologies or by lower-wage workers in other countries. Increasingly, jobs in the United States will require either abstract skills associated with high levels of education and intelligence or—because nobody has figured out a way to offshore the jobs of short-order cooks and house painters—more basic skills requiring no formal schooling. Jobs in the first group pay well because the demand for abstract thinking outstrips the number of workers who can supply it, while those in the second group pay poorly because so many people can do the work.

Autor’s research, however, has been blown out of proportion by its popularizers even as it has been effectively challenged by other economists. Harry Holzer, of Georgetown University, and Robert Lerman, of the Washington-based Urban Institute, for example, conclude that there has been only a modest decline in “middle-skill” jobs, from 55 percent of the total in 1986 to 48 percent in 2006. “Stories of dramatic polarization . . . seem inconsistent with these facts,” Holzer has written. He and Lerman predict that middle-skill jobs will account for 40 to 45 percent of new hiring in this decade, with particularly strong demand for certain types of workers, such as “technicians, licensed practical nurses, and therapists in health care.” Holzer writes that there will be “substantial opportunities for earnings improvements to many youth and adults for whom a bachelor’s degree might be out of reach.”

The Washington think-tank trio of Lawrence Mishel and Heidi Shierholz of the Economic Policy Institute and John Schmitt of the Center for Economic and Policy Research confirm that the share of “middle-wage” jobs declined only modestly from the late 1980s through 2007. But they see a much steeper decline if the beginning date is stretched back to 1959, from 66 percent to 48 percent. That might be alarming, except that it mostly reflects a net upgrading of jobs. “High-wage” jobs grew from 21 percent of all employment to 34 percent from 1959 to 2007. Employment has shifted much more from middle-paying jobs to higher-paying managerial, professional, and technical positions than from middle- to low-paying jobs.

Indeed, this dynamic held in each decade from the 1960s to the ’90s. From 2000 to 2007, growth was strongest in low-paying jobs, but that was a period dominated by two trends—slow growth in the supply of native-born workers (due to an aging population) and a large increase in the number of immigrants with lower levels of schooling. It is always true that the supply of jobs depends significantly on the supply of labor—an important fact to remember when we evaluate the future of job growth.

While Autor emphasizes the threat of automation, Princeton economists Alan Blinder and Alan Krueger stress the negative effects of offshoring. They estimate that, in principle, a quarter of American jobs are “offshorable,” in that they do not require working in physical proximity to colleagues or customers. However, just because a job is offshorable does not mean it will be eliminated. If the benefits of face-to-face interaction among workers were small, employers would not go to the trouble and expense of bringing workers together in central offices and dense metropolitan areas. Perhaps more important, our history of job upgrading shows that the ill effects of offshoring can be offset. Blinder and Krueger surmise that a quarter of jobs were also offshorable in 1960, and the United States did indeed send many manufacturing positions and other work overseas in the ensuing decades. But the larger story is that the economy adapted to change, and, thanks to continuing domestic job growth, the period brought steady increases in higher-paying occupations.

We have a tendency, when thinking about technology and trade, to zero in on their harmful effects. But they also have a strong upside: lower prices for American consumers for everything from toothbrushes to refrigerators. Evidence from economists Christian Broda and John Romalis, for example, suggests that thanks in part to imports from developing countries, the cost of living over the last 20 years has risen less among lower-income Americans than among richer households.

Consider the impact of Walmart, the often-maligned retail colossus. Jason Furman, the incoming chair of President Barack Obama’s Council of Economic Advisers, noted back in 2005 that Walmart’s “everyday low prices”—the result of relentless efforts to minimize costs, including reliance on imports and foreign labor—had effectively boosted American family incomes by 1.5 to 4.5 percent. In total dollars, that benefit was more than $100 billion, at least 20 times the reduction in wages that Walmart’s critics claimed the company caused.

The poor, Furman found, benefited more from lower prices than others because a bigger share of the things they bought—clothing, groceries, paper products—were goods sold by Walmart. Even if one assumes that the bottom fifth of households bears the entire cost of wage reductions caused by Walmart (a figure that is likely exaggerated), the price-lowering benefits of Walmart for this group are still two and a half times the costs.

In short, if trade and technology reduce demand for labor, the lowered labor costs paid by businesses will translate into lower prices. That can be expected to benefit Americans—including lower-income families—in the aggregate, despite the highly visible costs to those who bear the brunt of the resulting economic dislocations. The dystopic fantasy of an economy based on robots and overseas suppliers with mass involuntary joblessness at home will simply not come to pass.

The widespread feeling that the American economy is failing the middle class owes a great deal to the belief that wages have stagnated or declined. That belief is only half correct. For women, wages have risen smartly. For example, The State of Working America, an annual report by the Economic Policy Institute, indicates that median hourly wages among female workers increased by 24 percent from 1979 to 2007. That number grows to 35 percent (an increase of $4.60 per hour) with adjustments for the value of benefits such as health insurance and to better account for changes in the cost of living. Among men, however, the adjusted increase was only four percent.

During those same years, pay for both women and men badly trailed productivity, or the value of what workers produce per hour, which rose by 60 percent. We would be right to worry if that disparity between productivity and wages were to continue. Here again, however, a longer-term perspective provides important context.

The hourly compensation of workers has failed to keep pace with productivity since the mid-20th century, but in the 1930s and ’40s pay raced ahead of productivity gains. By 1950, productivity was 65 percent higher than it had been in 1929, but hourly compensation was 115 percent higher. In contrast, pay and productivity rose by the same amount between 1900 and 1929. Workers in 1950 were making about 30 percent more than their productivity should have dictated. Correcting that overpayment required that compensation growth fall behind productivity growth. As of 2010, workers still made 14 percent more than productivity levels suggested they should have, despite the fact that productivity had grown faster than compensation since 1950.

The current Great Correction in the relationship between pay and productivity has surely been frustrating for men, who have borne the brunt of the pay slowdown. Women, who started from a lower base, have fared much better as a group, moving into better-paying jobs thanks to the erosion of discrimination and occupational segregation. But for women and men alike, there is a silver lining to this story. In time, the Great Correction will run its course, bringing productivity growth and compensation back into long-term alignment. At that point, pay and productivity should begin to move in tandem once more, putting Americans’ wages back on an upward trajectory. When will that happen? It would be foolish to attempt a prediction, but the closing of the compensation-productivity gap has proceeded slowly, suggesting that we may have to wait a while for the Great Correction to end.

The US economy has shown an amazing ability over the course of two centuries to create good jobs for Americans and to supply their wants and needs. A clear-eyed reading of long-term trends does not point to a fundamental breakdown in that ability. Even with the decline of manufacturing and the peaks and valleys of recent decades, the economy has been strong and dynamic enough to create jobs for millions of additional female and immigrant workers. There is no reason to think it cannot adapt to today’s challenges and whatever disruptions may lie in store.

Indeed, Americans, by and large, appear to have a healthy attitude toward the vicissitudes of the job market. According to a Kaiser Family Foundation survey conducted earlier this year, only 20 percent of employed workers are “very worried” about the possibility of losing their job. Thirty-six percent say they are “not at all” worried. And while job anxiety has to be taken seriously, it is not always well founded. After all, 20 percent of adults also say they are very worried about being a victim of gun violence, though their real risks are minuscule, and 15 percent fret about being caught up in a terrorist attack.

For most Americans, anxiety about work is a low-grade background concern, not a dark cloud over their everyday existence. The relentless focus in so much public debate on the most negative evidence, and on economic challenges much more than economic strengths, may needlessly raise anxiety levels. It also distracts us from the real problems we face. These include too many workers with limited skills, the plateauing of college graduation rates, distressingly stable economic inequality between white and black Americans, and persistent inequality of opportunity between children born into advantageous and disadvantageous circumstances. Not everyone faces pressing job insecurity, but we can do better by those at risk if we maintain the proper perspective. Ω

[Scott Winship is the Walter B. Wriston Fellow at the Manhattan Institute. Previously, he was a fellow at the Brookings Institution. His research interests include living standards and economic mobility, inequality, and insecurity. His research has been published in City Journal, National Affairs, National Review, The Wilson Quarterly, and Breakthrough Journal. Winship received a BA (sociology and urban studies) from Northwestern University and a PhD (social policy) from Harvard University.]

Copyright © 2016 Woodrow Wilson International Center for Scholars/Wilson Quarterly



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 Sapper's (Fair & Balanced) Rants & Raves

Thursday, June 30, 2016

He Keeps Coming Back — Five Pages At A Time (After Breakfast)

It must have been thirty years ago when this blogger heard Larry McMurtry speak at a speakers' series hosted by the Collegium Excellens. Unlike other writers the blogger heard in this series of annual lectures (James Dickey — Deliverance [1970] and Maya Angelou — I Know Why the Caged Bird Sings [1969, 1999]) who gave performances, McMurtry was laconic and seemingly unaware of himself. His words in today's essay by Texas Monthly's W.N. "Skip" Hollandsworth echo his undramatic appearance at the Collegium. Unlike celebrity writers such as Dickey and Angelou, you felt like you could ride to the river with Larry McMurtry. If this is (fair & balanced) Lone Stardom, so be it.

[TM]
The Minor Regional Novelist
By W.N. "Skip" Hollandsworth


When Larry McMurtry is not in Archer City, the one-stoplight town in North Texas where he was raised, he can often be found in Tucson, Arizona, in a single-story, flat-roofed home in the immaculate neighborhood of Oracle Foothills Estates. He was there one morning this spring, standing at the glass-plated front door, dressed in a white collared shirt, black athletic pants, and blue New Balance athletic shoes. A red sweater he’d pulled over his shirt sported a couple of holes. His hair, once tousled and black, was now tousled and completely white. His handshake was soft. “Well, here you are,” he said matter-of-factly.

Behind him, six dogs—five small, one large—raced to the entryway, a cacophonous committee of barking and sniffing and frantic circles. “You will find this to be a house of many creatures,” McMurtry said as he led me down a short hallway toward a bedroom, where a large black bunny sat in a hutch next to the bed, chewing contentedly. “Behold, Diana’s rabbit,” he said. “Her name is Beauty, because apparently Diana believes she is beautiful.”

Beauty’s owner—also, incidentally, the owner of the six dogs—is Diana Ossana, McMurtry’s longtime friend and writing partner. The house, a cozy and cluttered home full of books and photos, is hers too. McMurtry, who turned eighty last month, has lived there off and on since 1991; in 2011, after he married Faye Kesey, the widow of novelist Ken Kesey, she moved in as well.

McMurtry turned around and shuffled us to the other end of the hallway, to his and Faye’s bedroom. The room held a queen-size bed, draped in a white comforter and flanked on either side by bedside tables, on whose surfaces sat vitamin bottles and medications. There was also a sofa, a small desk with a computer for Faye, and a large wooden table for McMurtry. On the table was a stack of typing paper, some manila folders containing manuscripts, a clock, a bottle of Advil, a box of tissues, and McMurtry’s Hermes 3000 manual typewriter.

“Your bedroom is also your office?” I asked.

“It’s all I need,” he said, lowering himself slowly onto the sofa.

“And obviously, there’s no truth to the rumors that you’ve retired?” I asked, pointing to the manuscripts on the table.

“Retire?” said McMurtry, and he shrugged. “Who knows? I might have one more novel left in me.”

For more than fifty years, he has been writing novels—thirty in all, the plots ranging from Old West adventures to small-town comedies to contemporary domestic dramas. He has co-written two other novels and published fourteen books of nonfiction—short memoirs, collections of essays, a travel book, and even biographies of such frontier figures as Crazy Horse—as well as reams of book reviews and essays. He has written or co-written more than forty teleplays and screenplays. As if that were not enough, he has also found time to carry on a fulfilling side profession as a “bookman,” as he likes to call himself, traveling around the country to hunt for rare books and overseeing a huge antiquarian bookstore that he opened in Archer City in the eighties.

In American letters, he is something of an icon—winner of both a Pulitzer Prize (for the novel Lonesome Dove (1985, 2010), about a cattle drive in the 1870s) and an Oscar (for the screenplay to "Brokeback Mountain," (2005) which he co-wrote with [Diana] Ossana, about two sexually conflicted modern-day cowboys [sic, sheepherders]). His storytelling has been compared to that of Charles Dickens and William Faulkner, and even the famously self-absorbed novelist Norman Mailer—himself a winner of two Pulitzers—once confessed his admiration. “He’s too good,” he said, explaining his resistance to McMurtry’s novels. “If I start reading him, I start writing like him.”

Nowhere is that writing as fiercely cherished or as deeply felt as in Texas, the setting for the majority of his work, and which McMurtry has by turns elevated and eviscerated with the kind of marrow-piercing observations only ever allowed native sons. His fans in Texas—and they are legion—treat him with the adulation typically reserved for movie stars. In 2014, when he appeared at the Dallas Museum of Art to promote his latest best-seller, a western titled The Last Kind Words Saloon (2014), the 425-seat auditorium was filled to capacity, and dozens more ticket holders were ushered into overflow rooms to watch on simulcast. During a question-and-answer segment, audience members took turns commandeering the microphone to tell McMurtry what his books had meant to them. One woman spoke of her love of The Last Picture Show (1966, 1994), his novel about teenagers coming of age in the fictional North Texas town of Thalia; another brought up Terms of Endearment (1975, 1989), his novel about an indomitable grande dame in Houston’s wealthy River Oaks neighborhood. A man confessed that he had read the 843-page Lonesome Dove three times. And then another man rose to recall, in almost reverential tones, how as a student at Texas Christian University in the early sixties, he had played a game of Ping-Pong against McMurtry, who was then teaching at the school. The man had lost.

“Yes, I was quite good at Ping-Pong,” McMurtry replied, and the audience roared with laughter, as if it were the funniest thing anyone had ever heard.

I was raised just 25 miles from McMurtry’s hometown, in the metropolis—at least relatively speaking—of Wichita Falls. In the mid-seventies, when I was in high school, I bought a paperback copy of The Last Picture Show, turned to the first page, and started reading about Sonny Crawford, a high school senior who plays football for Thalia High, hangs out at a pool hall, drives a butane gas truck for his boss, Fred Fartley, and obsesses about having sex with girls in town, including his classmate Charlene Duggs, who kisses him “convulsively, as if she had just swallowed a golf ball and was trying to force it back up” but who allows him to touch her breasts for only a few minutes at a time.

I was mesmerized. I couldn’t believe that someone had written a novel about teenagers just like ones I knew in real life. Nor could I believe that such a book—set in the same plains where I’d grown up, with a character named Fred Fartley, of all things—was being praised as a literary masterpiece. “A performance rarely equalled in contemporary fiction,” read one critic’s quote on the cover of the paperback. Although I had no earthly idea then what good literature was, I knew I had stumbled onto something. The prose was both dramatic and slapstick funny, and the dialogue—pages and pages of it—was curiously riveting in its plainspokenness. I read almost all of it in one sitting.

I still own that paperback. It sits on a bookshelf in my house—a reminder, in some ways, of all the stories there are to tell, and that remain to be told, about Texas. No one knows this better than McMurtry, of course, who despite his age and prolific career continues to sit in front of his Hermes 3000 at least a few days a week, trying to knock out another chapter or scene.

In fact, when I looked closer at the manuscripts on his desk, I realized that he was working on not one but two novels. The first, which he had tentatively titled Boss Charlie, was based on the life and times of the nineteenth-century Texas cattleman Charles Goodnight; the second, Rich Girl, was about the life and times of a wealthy twenty-first-century woman who lives on a ranch outside Fort Worth.

What’s more, McMurtry told me, he and Ossana were on a deadline to complete a screenplay they had been commissioned to write by producer-director Cary Fukunaga; it was based on the true story of a man in Oregon who decides to walk across the United States after his fifteen-year-old son, bullied for being gay, hanged himself.

Surely, I said, taking a seat next to the sofa, such a workload must take its toll on a man who has just turned eighty.

“Well, my fingers aren’t as nimble as they once were, so I have trouble changing my typewriter ribbon,” McMurtry replied. “And there are days my vision gets so blurry that I can’t always see what I’ve typed. There are other days when my energy lags.”

He shrugged again. “But no, I’m not ready to quit. Not yet.”

Just then, Ossana, an attractive woman in her sixties with thick blond hair, walked in. She had been doing some work at the other end of the house. She gave me a cheerful grin. “Larry is like an old cowboy who has to get up in the morning and do some chores,” she said. “He has to get up and write. I don’t think he would know what to do with himself if he didn’t have something to write.”

Spend any time with McMurtry, and it doesn’t take long to be struck—no, run over—by his restless, roving intellect. During my time with him, he talked about the personal lives of European leaders during World War I, a Siberian leper colony, the 2016 presidential campaign, concussions among professional football players, an afternoon he spent playing tennis with Barbra Streisand in Hollywood, the problems with air travel, his love of Dr Pepper and Fritos, the geoglyphs that can be found in the Atacama Desert of Chile, and a rodeo performer he once knew whose boot was ripped off during a bull ride, then sent flying through the air until it clobbered a spectator sitting in the bleachers.

“I’ll never forget one night, when my whole family was here for dinner, Larry started talking about early-twentieth-century authors and he ended up talking about women and inverted nipples,” Ossana told me. “We just sat there, our mouths open, wondering how his brain works.”

But ask McMurtry about his writing—why he became a writer in the first place, or what inspires him, or if there’s an underlying meaning to his fiction, or any other such forced attempt at introspection—and he is steadfastly unreflective. “I like making stuff up,” he told me, simply.

When I tried again—What about process? Did he ever get stuck developing a plot? Seize up sometimes before a blank page?—he sighed. “I just write,” he replied. “You either do it, or you don’t.”

Nor does he have any particular desire to discuss the characters he has created or the books he has written. “As soon as I finish a novel and ship it to the publisher,” he told me, “I almost immediately lose interest in it and never read it again.”

“Even Lonesome Dove?” I asked.

“I’ve never reread it. I don’t hang on to any of my books. If I did that, I wouldn’t have time to think about what I’m going to do next.”

I looked at him for a few seconds to see if he was joking. He looked right back at me, his face impassive.

That his voracious curiosity, and his ability to spin yarns, were forged on the empty flatlands of rural Texas makes either no sense at all or all the sense in the world. He spent his early childhood on a small ranch fifteen miles outside Archer City, where his father had him riding a horse by the age of three and herding cattle at four. McMurtry told me there were no books in the house—not a single one—until a cousin heading off to World War II dropped off nineteen boys’ adventure books with such titles as Sergeant Silk: The Prairie Scout. His parents did not read to him. “They preferred sitting on the porch, swapping tales with other relatives, or we listened to the radio,” he said.

When McMurtry was six, his father moved the family to a white frame home in Archer City (population: 1,675) so that McMurtry could be close to school. Scrawny and bespectacled, McMurtry was a good student. (“Keep in mind,” he cautioned, “that a good student in Archer City was any student who actually attended class.”) When he got to high school, he joined the 4-H club, played the clarinet (and later trombone) in the marching band, acted in school plays, wrote what he called “one-paragraph editorials” for the school newspaper, ran the mile for the track team, and was a starter on the school’s dreadful basketball team, which had the distinction, he recalled, of losing one game, to Crowell High School, by a score of 106–4.

His greatest extracurricular interest, however, was books—the very thing he hadn’t had access to on the ranch. After devouring his cousin’s adventure series, he bought pulp novels from the paperback rack at Archer City’s drugstore. When he was in Fort Worth one weekend for a track meet, he took a city bus downtown just so he could wander through Barber’s Book Store. He read Don Quixote and Madame Bovary. He even leafed through the Bhagavad Gita. “Anything I could get my hands on, I’d read,” he said. “Reading took me away, at least for a little while, from the drabness of Archer City.”

McMurtry’s father, realizing his son had no aptitude for ranch work, hoped that he would enroll at Texas A&M and become a veterinarian. But after watching a television program about the Rice Institute (now Rice University), which then provided free tuition, McMurtry applied, was accepted, and soon was off to Houston. Upon receiving a score of 2 on his first calculus exam, however, he realized that he would never pass any of Rice’s math courses, and he eventually transferred to North Texas State College (now the University of North Texas), in Denton, just under two hours’ drive from Archer City. There, he took a creative-writing class and composed poems and short stories, including two about Texas ranchers that he decided to combine and turn into a novel.

“So even back then you dreamed about becoming a novelist?” I asked, remembering how, when I got to college, I’d decided to be a writer just like him and produce my own novel about a boy growing up in Texas. (I never made it past the second chapter.)

“No,” McMurtry said. “I wrote only to fulfill the requirements of my class. If there had been something more exciting to do, I would have done that.”

After graduating, in 1958, he married Jo Ballard, who had studied English at Texas Woman’s University, in Denton. He returned to Rice to study for a master’s degree in English—no math required—and planned to pursue a Ph.D. and spend the rest of his life teaching. But he kept working on the novel, banging out five double-spaced pages on his manual typewriter every morning, just after breakfast.

“So you were consumed with some mystical urge to write,” I said hopefully. McMurtry shook his head. “It was only an urge to finish what I had started,” he said. “I wanted the book done so I could move on with my life.”

Still, based on the first draft, he was awarded a fellowship to attend Stanford University’s prestigious creative-writing program [founded by Wallace Stegner]. His class was filled with other aspiring authors, such as Robert Stone, Tillie Olsen, Ernest Gaines, and Ken Kesey (who was married to his high school sweetheart, Faye). If there was an unspoken pressure to write the next great American novel—Kesey, for one, would soon go on to publish the wildly popular One Flew Over the Cuckoo’s Nest (1962)—McMurtry didn’t feel it. “I saw no conflict about writing about ranch life,” he told me evenly. “I thought it was a perfectly suitable topic for exploration.”

The novel, Horseman, Pass By, whose title comes from a poem by W. B. Yeats, was published by Harper in 1961, after the Stanford fellowship had ended and McMurtry had moved to Fort Worth, where he got a job teaching English at TCU (and yes, played Ping-Pong). Set in the fifties, the book tells the story of a noble but financially struggling North Texas rancher named Homer Bannon; his coarse, unscrupulous stepson, Hud; and his earnest teenage grandson, Lonnie. The book opens with a lovely description of the Texas plains in April “after the mesquite leafed out.” In a subsequent chapter, Lonnie describes a horseback ride across the high country with his grandfather. “There below us was Texas, green and brown and graying in the sun, spread wide under the clear spread of sky like the opening scene in a big western movie.”

At the same time, the novel is starkly unsentimental about rural life as the golden age of ranching is coming to an end. A state veterinarian orders Homer to destroy his herd of cattle over fear of hoof-and-mouth disease; among the cattle he must kill are two old Longhorn steers he loves. (“I been keeping ’em to remind me how times was,” says Homer.) Hud is a restless, violent man who sexually assaults the family’s cook. Driving home one night from a rodeo, Hud accidentally hits Homer, who is crawling, senile, on the side of the road, and Hud decides to put the old man out of his misery with a .22 rifle. At the end of the novel, a disillusioned Lonnie climbs into a cattle truck and heads toward the lights of Wichita Falls.

McMurtry told me he expected Horseman, Pass By to sell “maybe a handful of copies and disappear,” and, in fact, the novel was hardly a best-seller. But shortly after it was published, New York Times critic Charles Poore declared McMurtry, then just 25 years old, to be “among the most promising first novelists who have appeared this year.” Impressed in particular with McMurtry’s descriptions of the “gnarled pastoral side to Texas life,” Poore hailed him for offering a new understanding of Texas. “The material he has at his command as a descendant of Texan generations is usable in all kinds of new ways. We say that, obviously, in view of the narrow range in which [Texas] has been exploited so far in our literature. Mostly boots and saddles, or oil rigs and billionaires.”

The rights to Horseman, Pass By were snapped up by a Hollywood producer, who turned it into the movie Hud, a kind of revisionist western starring Paul Newman. Released in 1963, "Hud" was a critical and commercial success, nominated for seven Academy Awards and winning three. In the meantime, Harper published McMurtry’s second novel, Leaving Cheyenne (1963), which follows the lives of three more rural North Texans through the first half of the twentieth century: the serious Gid Fry, who is being groomed by his father to take over the family ranch; Gid’s best friend, Johnny McCloud, a free-spirited cowboy; and their neighbor Molly Taylor, who loves both Gid and Johnny and bears them each a son. (In one scene, Gid’s father tells him that “a woman’s love is like the morning dew, it’s just as apt to settle on a horse turd as it is on a rose.”)

Again, critics were impressed. “If Chaucer were a Texan writing today, and only 27 years old, this is how he would have written and this is how he would have felt,” wrote Marshall Sprague in the New York Times. “The book’s comedy is rare, the tragedy heart-rending—and, over all, there is an atmosphere of serenity and wisdom.” Though McMurtry was not yet convinced he could make a living as a writer—he continued to supplement his income by teaching, moving from TCU to Rice—he decided to write one more novel, and in 1966 he published The Last Picture Show, which he based largely on Archer City and the people he knew growing up. The novel follows Sonny Crawford, a high school senior drifting through the dying town of Thalia with his girlfriend, Charlene Duggs. Although Sonny doesn’t bed Charlene, he does begin an affair with the football coach’s good-hearted wife, while his best friend, Duane Moore, goes after the prettiest girl in town, Jacy Farrow, only to be dumped and then drafted to fight in Korea. Adding to the sense of quiet desperation, the lone movie theater in town shuts down.

McMurtry’s own mother, after reading one hundred pages, hid the book in a closet because of the profane language and sex scenes, including one in which teenage boys perform rather unnatural acts with a cow. A few Archer City residents were furious with McMurtry for portraying their town as dreary and desolate. But critics remained fascinated with the young writer’s ability: one compared the characters in The Last Picture Show to the frustrated small-town cast in Sherwood Anderson’s Winesburg, Ohio. Hollywood again came calling, and McMurtry sold the book’s film rights to Columbia Pictures, which also paid him to co-write the screenplay with Peter Bogdanovich, a promising filmmaker who would direct the movie. Bogdanovich hired relatively unknown actors—Randy Quaid, Timothy Bottoms, Jeff Bridges—to play the teenagers, as well as the model Cybill Shepherd, who had never appeared in a movie; for the adult roles, he hired veterans Cloris Leachman, Ben Johnson, Ellen Burstyn, and Eileen Brennan.

McMurtry showed up in Archer City for a couple of afternoons to watch the filming. By this time, he and Jo had divorced—they had one son, James, who would grow up to become a respected singer-songwriter—and he arrived in his hometown alone. Everyone on set was riveted by him. “He wore these Buddy Holly–like glasses, and he was so smart,” recalled Shepherd when I called her recently in California. “And he possessed this quiet charm. One day when it was cold, he got in the car I was sitting in and held my hands to keep them warm. He told me about the poetry of Yeats. I had never met anyone like him.”

When "The Last Picture Show" was released, in 1971, it was a sensation, receiving eight Academy Award nominations, including best picture, best director, and best adapted screenplay. (It received two Oscars: Leachman won for best supporting actress and Johnson won for best supporting actor.) Jack Kroll, Newsweek’s veteran film critic, went so far as to proclaim "The Last Picture Show" the best American movie since "Citizen Kane."

That his books translated so well to the screen would, over the next several decades, propel McMurtry to the kind of stratospheric fame he could have never envisioned for himself as a writer. By this time he had moved to Virginia, just outside Washington, DC, where he and a friend, Marcia Carter, the daughter of an oilman and diplomat, opened a rare-book store in Georgetown. McMurtry was hired by producers to write screenplays, including one based on John Barth’s philosophical novel The Floating Opera and another based on Hunter S. Thompson’s Fear and Loathing in Las Vegas. (They were never produced.)

Meanwhile, he churned out more novels, this time with a distinctly urban backdrop: Moving On, published in 1970, features a sharp-tongued 25-year-old married woman in Houston named Patsy Carpenter who has a taste for extramarital affairs; All My Friends Are Going to Be Strangers, published in 1972, details the adventures of Danny Deck, a young writer at Rice who has just learned his novel has been accepted for publication. And then came 1975’s Terms of Endearment, a novel of domestic manners whose protagonist, Aurora Greenway, makes dramatic pronouncements (“The success of a marriage invariably depends on the woman”), juggles a bevy of suitors (among them, a wealthy oilman who lives in a Lincoln Continental that’s parked on the twenty-fourth floor of a parking garage he owns in downtown Houston), and takes care of her daughter, Emma, who over the last sixty pages of the book dies of cancer.

McMurtry’s seemingly effortless shift—from rural to citified, from ranchers to socialites—was met with immediate praise. If before his work had caught attention for its unsparing portrayal of the state’s agrarian identity, now he was lauded for so easily embracing Texas’s emerging modernity. The New York Times critic Janet Maslin was so impressed by his ability to capture the inner lives of women in Terms of Endearment that she would later credit him as the father of chick lit.

Needless to say, Hollywood pounced again, and the movie "Terms of Endearment"—this one featuring Shirley MacLaine, Debra Winger, and Jack Nicholson—became a blockbuster hit in 1983, receiving eleven Academy Award nominations and winning five. At least one actress contacted McMurtry to meet for lunch or drinks, hoping to persuade him to write a novel that could be turned into a movie starring her. Rumors abounded that he carried on flings with a few starlets; one story went that he had even been caught kissing one in a convertible on Sunset Boulevard.

“Not true,” he told me.

“Is the rumor true you ended up having a brief romance with Cybill Shepherd?” I asked.

“That’s a little true,” he replied, smiling ever so slightly. I smiled too. It was nice to know that, on occasion, the scrawny small-town boy with thick glasses can get the prettiest girl.

If McMurtry was impressed by all the attention, however, he didn’t show it. As a joke—or maybe it wasn’t a joke—he sometimes wore a sweatshirt imprinted with the words “Minor Regional Novelist.” “I was a minor regional novelist from Texas,” he told me. “That’s all I was.”

In fact, over the next few years, he did try to expand his fictional territory. After Terms of Endearment, he wrote Somebody’s Darling (1978), about a female director in Hollywood on the verge of great fame; Cadillac Jack (1982), which follows a rodeo bulldogger turned antiques collector as he womanizes his way across the country, eventually ending up in Washington, DC; and The Desert Rose (1983), which chronicles the life of an aging dancer in Las Vegas.

He also attempted to distance himself from other Texas writers, including the great J. Frank Dobie, skewering them publicly for ignoring the realities of an evolving state, with its rapidly sprawling suburbs, in favor of nostalgic historical novels. In an essay published in the Texas Observer in 1981, McMurtry lambasted these Old West novels for being nothing more than “Country-and-Western literature,” overly romanticized stories about honorable cowboys and the joys of the open range.

Still, he could not escape the state’s own hold on him: even in the narratives set outside Texas, his characters often had to contend with the state in one way or another. In one comic scene in Somebody’s Darling, two screenwriters, Elmo Buckle and Winfield Gohagen, steal the director’s master print in hopes of secreting it away to Texas. “Texas is the ultimate last resort,” says Gohagen. “It’s always a good idea to go to Texas, if you can’t think of anything else to do.”

And then McMurtry himself chose to go to Texas in a way no one could have expected. He began writing an old-fashioned western about a cattle drive.

Based on his extensive reading of western history, as well as on the stories he had heard his relatives tell on the front porch, McMurtry saw the Old West not as a romantic frontier but as a shatteringly lonely and often barbaric place, where few people found any happiness at all. Now McMurtry set out to prove this, opening his novel with two retired, hard-bitten Texas Rangers in the forlorn border town of Lonesome Dove.

The ex-Rangers, Augustus “Gus” McCrae and Woodrow Call, lead a cattle drive to Montana with a ragtag team of cowpokes, which includes a black cowboy, a bandit turned cook, a piano player with a hole in his stomach, a young widow, a teenager who is Call’s unacknowledged son, and a prostitute. On their journey, the group encounters psychopathic outlaws, vengeful Indians, buffalo hunters, gamblers, scouts, cavalry officers, and backwoodsmen. They endure perilous river crossings, thunderstorms, sandstorms, hailstorms, windstorms, lightning storms, grasshopper storms, stampedes, drought, and a mean bear. There are plenty of shootings and a few impromptu hangings. The prostitute, Lorena, is gang-raped. In the end, after McCrae is mortally wounded by Indians, he asks Call to bury him in a little peach orchard by the Guadalupe River near San Antonio, where he was once in love with a woman. Call dutifully carries his partner’s half-mummified body back to Texas.

McMurtry told me he was offered “maybe a ten-thousand-dollar advance” for Lonesome Dove, because his editor was not sure readers would want to buy a western the size of War and Peace. (McMurtry accepted the advance because he wasn’t sure people would want to buy it either.) But when Lonesome Dove was released, in 1985, it grabbed hold of the public’s imagination like no western of its time, selling nearly 300,000 copies in hardcover and more than a million copies in paperback.

Readers raved over McMurtry’s precisely drawn characters, his depictions of place, his ear for frontier idioms, and his action-packed set pieces. They memorized lines of dialogue (“The older the violin, the sweeter the music”; “Ride with an outlaw, die with him”; Call’s unforgettable declaration after beating a surly Army scout to a pulp in front of shocked onlookers: “I hate rude behavior in a man. I won’t tolerate it”). And they reveled in the details, whether about food eaten on the cattle drive (beans laced with chopped rattlesnake) or, say, medical treatment (a cowboy bitten by an angry horse is given axle grease and turpentine for his wound). For Texans, went one joke, Lonesome Dove had become the third-most-important book in publishing history, right behind the Bible and the Warren Commission report.

The book was awarded the Pulitzer the following year, and when it was inevitably adapted for the screen—CBS aired a four-part miniseries based on the novel in 1989, starring Tommy Lee Jones and Robert Duvall—a staggering 26 million viewers tuned in. Together, the novel and miniseries were arguably more influential in shaping Americans’ vision of the Old West than the movies of John Ford.

When I asked McMurtry about Lonesome Dove’s success, he did one of his shrugs. “It isn’t a masterpiece by any stretch of the imagination,” he said. “All I had wanted to do was write a novel that demythologized the West. Instead, it became the chief source of western mythology. Some things you cannot explain.”

McMurtry’s fame grew all the more: Annie Leibovitz took his photograph; universities invited him to lecture. “There were at least five men around the country who pretended to be me so that they could seduce women,” he said. “One woman called and said, ‘Don’t you remember who I am? I slept with you on Thanksgiving Day.’ I said, ‘No, ma’am, I was with my family on Thanksgiving Day.’ ”

Protective of his privacy, he embraced a peripatetic life, driving rented Lincoln Continentals around the country, visiting friends and secondhand bookstores. In addition to an apartment he kept above his bookshop in Georgetown, he had apartments in Los Angeles and Houston. He also purchased a two-story, prairie-style mansion for himself in Archer City, and, hoping to create an American version of Hay-on-Wye, the Welsh town that draws book lovers from all over the world, he opened an enormous used-book store in his hometown. Spread over four buildings downtown, it consisted mostly of the inventories he bought from other secondhand booksellers who wanted to get out of the business.

For Archer City residents, the resentment they had felt toward McMurtry over The Last Picture Show was long gone. One woman opened the Lonesome Dove Inn. The owner of the Dairy Queen taped the covers of McMurtry’s novels to the wall. When the New York author Susan Sontag came to Archer City, she looked around and told McMurtry that he lived in his own theme park.

I asked McMurtry again if it was really true that he hadn’t reread the novel. “I haven’t reread it, and incidentally, I’ve only watched parts of the miniseries,” he replied. “I’ve got other things to do.”

What he did was continue to write—relentlessly, pounding out his five pages daily on the Hermes 3000, which he took with him everywhere. (McMurtry, who despises computers, eventually purchased more than two dozen Hermes typewriters, which he kept in places around the country.) He seemed to issue forth a book every year or so, sometimes twice a year. He wrote another Old West novel (Anything for Billy [1988], about Billy the Kid). He wrote Texasville (1987), a sequel to The Last Picture Show, in which Sonny and Duane are middle-aged and still living in Thalia, and he also wrote The Evening Star (1992), a sequel to Terms of Endearment, in which Aurora reads Proust and realizes her life is slipping away. He wrote more screenplays and composed book reviews and literary essays for such publications as the New York Review of Books.

When I asked McMurtry’s close friend Susan Freudenheim, a former Fort Worth museum curator who is now the executive editor of the Jewish Journal, in Los Angeles, how McMurtry was able to do so many things at the same time, she sighed. “There’s no way to explain it,” she said. “He operates on a different plane than everyone else. One day, Larry and I were out somewhere, and he said he had to take me back to my apartment because he had to write a long book review for the Times. He said he would be done in an hour. I thought, ‘Impossible.’ But one hour later, there he was, back at my door, his review typed up and ready to send.”

One would think that if there was anyone who did not need a writing partner, it would be McMurtry. But in the late eighties, on a visit to Tucson, he met Diana Ossana, who was then a legal assistant and the mother of an eleven-year-old daughter. (The two happened to be dining at the same all-you-can-eat catfish restaurant.) They became very close friends—neither would describe their relationship to me as romantic—and in 1991, when McMurtry underwent quadruple-bypass surgery after suffering a heart attack, she told him he was welcome to stay in her back bedroom and recuperate at her home.

He arrived with a typewriter and some books—among them, his twelve-volume edition of Proust and the diaries of Virginia Woolf. Sitting at Ossana’s kitchen counter, he quickly crafted Streets of Laredo (1993), a sequel to Lonesome Dove, in which Call, as an old man, is hired by a railroad to chase a Mexican bandit. But afterward, McMurtry began to experience some post-surgical depression, sitting on the couch and staring out the window for hours. Ossana intervened. “I realized that if he didn’t write his five pages a day, he would die,” she told me. “So I basically forced him to go back to work.”

An unpublished author who had written a few short stories, Ossana occasionally made comments about McMurtry’s manuscripts. She sometimes rearranged passages and suggested plot twists. “She was very good, and I was very grateful,” said McMurtry. In fact, when a producer called and asked for a screenplay based on the life of Pretty Boy Floyd, the Depression-era gangster, McMurtry said he would do it only if Ossana could be his co-writer. The two wrote the script, then turned it into a novel when the film was not produced. After McMurtry finished a prequel to Lonesome Dove (Dead Man’s Walk [1995]), the two co-wrote another novel, a western titled Zeke and Ned (1997), as well as a couple of teleplays and screenplays, including, hilariously enough, a film adaptation of the fifties television series "Father Knows Best." (It was never made.)

Then, in 1997, Ossana handed McMurtry a short story in The New Yorker that had been written by Annie Proulx, titled “Brokeback Mountain,” about Ennis del Mar and Jack Twist, two ranch hands who fall in love in Wyoming as teenagers in 1963 and continue their tortured affair, furtively, over the next twenty years. Envious that he hadn’t thought of the story first, McMurtry—who had known gay cowboys growing up—embraced the idea of bringing the narrative to the screen. He and Ossana secured the rights to the story from Proulx, wrote a script, and waited to see if the movie would be made.

While they waited, McMurtry knocked out some more books: another Lonesome Dove prequel (Comanche Moon [1997]), a series of four novels about a British family who travels through the American frontier in the 1830s, and Duane’s Depressed (1999), in which the former football-playing teenager from Thalia is now an oil millionaire and so bored with his life that he stops driving his pickup truck, begins walking everywhere instead, falls in love with his lesbian psychiatrist, and eventually flies to Egypt to see the pyramids.

McMurtry even wrote an odd book of history titled Oh What a Slaughter! Massacres in the American West: 1846–1890 (2005). When I asked him why he took on such a subject in the midst of everything else, he said, simply, “The massacres interested me.”

“Did you ever consider slowing down, at least a little? Take some sort of break from writing to restore your creative batteries?” I asked.

McMurtry gave me one of his stares. “Writing is what I do,” he said.

McMurtry had no expectation that "Brokeback Mountain" would ever be produced; a gay-cowboy movie was not exactly on the wish lists of Hollywood studio executives. But then famed director Ang Lee jumped on board, as did stars Heath Ledger and Jake Gyllenhaal, and when the film was released, in 2005, it provoked an unprecedented national conversation about sexuality, machismo, and the power of story. The movie earned eight Academy Award nominations, including for best picture. (It won three: for best director, best adapted screenplay, and best original score.) Just as McMurtry had strived to do in his novels, "Brokeback Mountain" took a familiar genre—cowboy life—and shattered it.

At the Academy Awards ceremony, which he attended with Ossana, McMurtry wore blue jeans with a tuxedo shirt and an Armani tuxedo jacket, a sartorial choice that got him almost as much media attention as the A-list actresses in their designer gowns. “I wanted to be comfortable, because going to the Oscars is like sitting all day in a gymnasium,” he told me. “Anything to make it less onerous, I will do.”

It might have seemed a perfect time to retire at the top, or at least scale back. McMurtry was 69, and as he himself had suggested in his 1999 collection of essays, Walter Benjamin at the Dairy Queen, most fiction writers in their sixties lose their touch. “Self-repetition, if not self-parody, are the traps that await elderly novelists,” he wrote.

In fact, some critics were already accusing McMurtry of parodying himself in his Lonesome Dove sequel and prequels. (“It turns out the person who can write the best parody of Larry McMurtry is Larry McMurtry,” wrote one.) The New York Times’ Dwight Garner panned him for writing too many books. “He writes so much that supply outstrips demand,” he snapped. “A lot of his stuff verges on being—how to put this?—typed rather than written.”

McMurtry ignored the criticism. “I’ve never written to please other people,” he told me. In the years following his Oscar, he published three short memoirs (about his life as a writer, his life as a book collector, and his life in the film business), a short biography of General George Armstrong Custer, and two more novels (When the Light Goes [2007] and Rhino Ranch [2009]) featuring Duane Moore, who still lives in Thalia, suffering from clogged arteries because of eating too many butter-basted T-bones at a steakhouse in nearby Seymour. Duane, who is often said to be McMurtry’s alter ego, tries to find love with a variety of women, ponders all the hits and misses of his life, and finally keels over dead, alone, while laying a trotline.

Then, in early 2011, McMurtry pulled a Duane-like move. He picked up the phone and called Faye Kesey, who lived on a farm in Oregon and who had not remarried since Ken’s death, in 2001. McMurtry, who told me he had always maintained “a curiosity” about Faye since meeting her during their Stanford days, arranged for Faye to fly to Texas and come to Archer City. “I had wanted to see his bookstore,” explained Faye, who today is a spry and very pretty 81-year-old with blue eyes and shoulder-length gray hair. “And I could tell Larry wanted some company. He seemed a little lonely.”

The visit quickly turned romantic. McMurtry bought a ring from a jewelry store in Wichita Falls, and in April they were married in Archer City by a justice of the peace before members of their families and a small group of McMurtry’s friends, including Ossana, Freudenheim, his former bookstore partner Marcia Carter, and the actress Diane Keaton, who has long wanted to turn Somebody’s Darling into a movie. (Cybill Shepherd couldn’t make it.) In his wedding vows, McMurtry told Faye, “I promise I will always be interesting.”

“And has he been?” I asked Faye.

“Oh, very interesting,” she said, smiling, her eyes radiant behind her glasses. “I have to say, you don’t find many people like Larry.”

Soon after the wedding, McMurtry was back to writing fiction, and in 2014 he published The Last Kind Words Saloon, a spare novel that follows the Old West icons Wyatt Earp and Doc Holliday from a saloon in the Texas settlement of Long Grass to Buffalo Bill’s Wild West show in Denver to the climactic gunfight at the OK Corral, in Tombstone, Arizona. The book made best-seller lists. In the New York Review of Books, Joyce Carol Oates gushed, “It’s as if Vladimir and Estragon of Beckett’s "Waiting for Godot" have been transformed into two aging gunslingers trading wisecracks and platitudes in an existentially barren western landscape, waiting for a redemption that never comes.”

Just as he had ignored the criticism, McMurtry ignored the praise. “Oh, the book was fine,” he said when I mentioned Oates’s review. “It could have been better.” I asked if he still agreed with his assessment in one of his memoirs, Literary Life (2009), in which he’d declared that none of his work was “really great.” He nodded. “Maybe a couple of books will last,” he said. “But the rest will end up on back shelves of bookshops. There could be worse fates.”

Was he right? Perhaps, I thought, it was true that the next generation of readers, an increasingly diverse swath of globalized and digitized consumers, would know little of McMurtry or his work. But it was also true—and I knew this as a reader myself—that McMurtry had forever shaped the way people see Texas, with all of its past, all of its stories, all of its changes. As Mark Busby, a professor of English at Texas State University and a leading McMurtry scholar, told me, “What no one can deny is that McMurtry has made Texas feel very real. His books have taught people that Texas is not just a curious part of the country but an unforgettable piece of the American experience.” He continued, “Think about it. Parents are still naming their boys Gus or Call. I named my own dog Hud.”

Several weeks after my trip to Tucson, I went to see McMurtry in Archer City, where he and Faye were spending a few days. Except for a couple of new businesses, the town looks almost exactly as it did when he was a boy: dry and dusty, the sole stoplight still blinking, the movie theater shuttered. McMurtry’s house, which was once the country club, sits a few blocks from downtown.

The house, which has floor-to-ceiling bookcases in almost every one of its fourteen rooms, the shelves filled with McMurtry’s favorite books—28,000 in all—has the air of an invitation-only private library. As Faye cheerfully put dishes away in the kitchen, McMurtry gave me a tour, pointing out some of his prized collections: novels written by Russians, novels written by poets, novels written about the Yellow Peril, travel books written in the nineteenth century by women. He stopped to show me a 1933 edition of Nathanael West’s novel, Miss Lonelyhearts, which he had found after a 25-year search. “It cost me six thousand dollars,” he said. “Now you know the real reason I keep writing.”

He put the book back on its shelf, then led the way to the dining room, where one of his Hermes 3000s sat on the table. There was a slight wobble in his walk. McMurtry, ever more frail, is slowing down: his balance, he told me, is not good, and he has had to stop driving because of his weakening eyesight. He takes blood-pressure and blood-thinner medications for his heart; he has trouble going up and down stairs. Last fall, when he received a National Humanities Medal from President Obama at a White House ceremony, he wore his New Balances instead of his favorite cowboy boots because he was worried he might slip and fall on the marble floors. “Old age comes on apace to ravage all the clime,” he said, quoting the eighteenth-century Scottish poet James Beattie. I asked him if he planned to be buried in Archer City some day. “No, I think I’ll go for cremation,” he said, matter-of-fact as usual. “I’ll have my ashes kept in the bookstore. That seems appropriate.”

We soon headed to the bookstore—Faye driving McMurtry, me following in my car. The store now takes up only two buildings; in 2012 McMurtry held a large book auction, selling off nearly three-fourths of his inventory to make it more manageable. Inside, there were only a couple of customers. “Our business these days is largely online,” he said as we strolled the aisles. “It’s sad to say, but the era when one wandered through an old bookshop is almost gone forever.”

McMurtry had one more thing to show me. We got back in our cars, and I followed as Faye drove to the little ranch outside town where McMurtry spent his early childhood. As we turned off the highway and onto a dirt road, I was struck by how much the plains around us looked like those McMurtry described 55 years ago in Horseman, Pass By. The mesquite had begun to leaf out, and new grass was carpeting the flats. As we crested a hill, the ranch country spread wide under the clear spread of sky like the opening scene in a big western movie.

We pulled up to the ranch house. Not far away, on another property, was a giant wind turbine, its blades slowly turning. McMurtry haltingly made his way across the small front yard, holding onto Faye’s arm as he stepped onto the stone porch. He opened the door, and as we walked inside, I took a breath. Each room, even the kitchen, was filled with books, neatly arranged in customized bookshelves that McMurtry had added to the house. “I thought you would like this, to see all the books in my once-bookless home,” McMurtry said.

A few minutes later, he seemed ready to return to Archer City. It was apparently time to get back to the Hermes 3000. There were, after all, two new novels to work on and a screenplay to write.

We stepped back out on the porch, and he took another look around, his gaze turning south. In the distance stood the Cross Timbers, a belt of trees that marks the lower border of the Great Plains in Texas.

“You know, people have no idea how empty the world is out here,” McMurtry said. “They don’t understand its bleakness.”

“And yet you keep coming back,” I said.

“I keep coming back,” he replied. A light breeze came up, blowing wisps of his white hair across his forehead. “I admit, I always do.” Ω

[Walter Ned "Skip" Hollandsworth was raised in Wichita Falls, Texas, and graduated with a BA (English) from Texas Christian University. He has worked as a reporter and columnist for newspapers in Dallas, and he also has worked as a television producer and documentary filmmaker. Since joining Texas Monthly in 1989, Hollandsworth has received several journalism awards, including a National Headliners Award, the national John Hancock Award for Excellence in Business and Financial Journalism, the City and Regional Magazine gold award for feature writing, the Texas Institute of Letters O. Henry award for magazine writing, and the Charles Green award for outstanding magazine writing in Texas, given by the Headliners Club of Austin. He has been a finalist four times for the National Magazine Awards, the magazine industry’s equivalent of the Pulitzer Prize, and his work has been included in such publications as Best American Crime Writing and Best American Magazine Writing.]

Copyright © 2016 Emmis Publishing, d/b/a Texas Monthly





Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 Sapper's (Fair & Balanced) Rants & Raves

Wednesday, June 29, 2016

The Jillster Explains Why You Will Feel Like A Dental Patient Who Had Two Root-Canals In A Row Later This Summer (Cleveland, July 18–21 and Philadelphia, July 25–28, 2016)

Late next month, we will be regaled by our equivalent of a "... tale told by an idiot, full of sound and fury, signifying nothing." After the smoke clears in late July 2016, the two presidential candidates will engage in what a Dumbo pundit has predicted will be a "death match." So, it appears that the nominating conventions will be the equivalent of the pre-match shenanigans of a World Wrestling Entertainment "Survivor Series." In the meantime, Professor Jill Lepore offers a survey course in US political conventions. If this is a (fair & balanced) study of political inanity, so be it.

[x New Yorker]
How To Steal An Election
By The Jillster (Jill Lepore)

TagCrowd cloud of the following piece of writing

created at TagCrowd.com

At sunrise on the day before the Republican National Convention begins, in Cleveland, a hundred women will take off their clothes and pose for the photographer Spencer Tunick outside the convention hall. Naked, they’ll be holding up big, round mirrors to the sky, to catch the light. “Women will decide the outcome of this election,” Tunick says. He insists that his installation is not a political protest. “This is a work Republican women can participate in,” he says, bipartisanly.

This year’s Conventions will be held back to back, like a doubleheader, or two root canals in a row. The week after the Republicans meet in Cleveland, the Democrats will meet in Philadelphia. First Trump, then Clinton. But, what with the anti-Trumpers and the pro-Sandersers, some people are worried that all hell might break loose, which is unusual, since people more commonly worry that the Conventions will be boring. “At first blush, the Republican National Convention at Cleveland next week promises to be a very dull show,” H. L. Mencken wrote in 1924, when the incumbent, Calvin Coolidge, was the all but assured nominee. “Some dreadful mountebank in a long-tailed coat will open. . . with a windy speech; then another mountebank will repeat the same rubbish in other words.” And, while that really is what happens, lately more than ever (since 1952, no Convention has gone past the first ballot) the Conventions are never boring, if only because of the high jinks, not to mention the low jinks. In Chicago in 1864, the Democrats installed a giant sign made of coiled gas pipe. It was supposed to read “McClellan, Our Only Hope,” but the gas jets broke and the thing just flickered and died, hopelessly. Roscoe Conkling was so sure he’d get the nod in 1876 that he picked his Vice-President and a motto—“Conkling and Hayes / Is the ticket that pays”—only to be defeated by his erstwhile running mate, ever after known as Rutherfraud B. Hayes. [Actually Hayes received the nickname in the resolution of the election of 1876.]

Until 1932, when FDR decided to show up to accept his nomination, the candidates themselves skipped the Conventions, citing modesty, a precedent set a century before by Henry Clay. Asked by letter if he would be willing to be nominated by the short-lived National Republicans, at their one and only Convention, Clay wrote back to say yes but that it was impossible for him to attend the Convention “without incurring the imputation of presumptuousness or indelicacy.” When Grover Cleveland received a telegram at the White House informing him that he had been renominated by a Democratic Convention meeting in St. Louis, he said, “Heavens, I had forgotten all about it.” Many a journalist might not have minded if the candidates had maintained the tradition of keeping away. “Interviewing a candidate is about as intimate as catching him on television,” Norman Mailer wrote from the Republican Convention in Miami in 1968, to which some GOP genius had flown in a pachyderm. “Therefore the reporter went to cover the elephant.”

It’s not all a bamboozle, especially not this election. The White House is at stake, and more, too: the state of the union. The worry, this time around, isn’t that the Conventions will be boring; it’s that they’ll be interesting, frightfully.

The Presidential-nominating Convention is an American invention. It is the product of a failure of the Constitution. Kings are born; Presidents are elected. How? This is a math problem and it’s a political problem, and it’s been solved but never resolved. The first nominating Convention was held in 1831. It was an attempt to wrest power away from something known as the legislative caucus, which was itself an attempt to wrest power away from the Electoral College. The first primary was held in 1901. It was an attempt to wrest power away from the nominating Convention. This year, there’s been a lot of talk about how the system is “rigged” by “the establishment.” It was exactly that kind of talk that got us the caucus, the Convention, and the primary, institutions built in the name of making American democracy more representative and more deliberative. But the more representative the body the less well it is able to deliberate: more democracy is very often less.

How to elect a President was vexed from the start. At the constitutional convention in Philadelphia in 1787, the men who framed the federal government made a great many compromises, but “the Convention were perplexed with no part of this plan so much as with the mode of choosing the President,” as the Pennsylvania delegate James Wilson later explained. Some delegates believed that Congress should elect the President. This allowed for popular participation in government while avoiding what Hamilton called the “excess of democracy.” But having Congress elect the President violated the principle of the separation of powers. Wilson proposed that the people elect the President directly, but Madison pointed out that the Southern states “could have no influence in the election on the score of the Negroes.” That is, the South had a lot of people, but a third of them were slaves; in a direct election, the North, which had a lot of people but very few slaves, would have had more votes. Wilson therefore suggested the Electoral College, a proposal that built on a mathematical compromise that had taken the delegates most of the summer to devise. Under the terms of the three-fifths compromise, each state was granted one representative in Congress for every thirty thousand people, except that slaves, who could not vote, counted as three-fifths of a person. Wilson’s proposal applied this formula to the election of the President: the number of each state’s electors in the Electoral College is the sum of its congressional delegation, its two senators plus its number of representatives. Substituting electors for voters conferred on the slave states a huge electoral advantage, once the first census was taken, in 1790. Virginia and Pennsylvania had roughly equivalent free populations, for instance, but Virginia, because of its slave population, had six more seats in the House than did Pennsylvania, and therefore six more electors in the Electoral College. This bargain helps to explain why the office of the President of the United States was, for thirty-two of the first thirty-six years of its existence, occupied by a slave-owning Virginian.

In the first two Presidential elections, George Washington ran unopposed. But by 1796, when Washington announced that he would not run for a third term, the polity had divided into parties, a development that the Electoral College was not designed to accommodate. One Federalist complained that he hadn’t chosen his elector “to determine for me whether John Adams or Thomas Jefferson is the fittest man for President. . . . No, I choose him to act, not to think.” To better delegate their electors, Federalists and Republicans in Congress began meeting in a caucus where they decided their party’s Presidential nominee.

Early American Presidential elections were not popular elections, not only because the vote was mainly restricted to white male property owners but also because delegates to the Electoral College were elected by state legislatures. The legislative caucus worked only as long as voters didn’t mind that they had virtually no role in electing the President, a situation that lasted for a while since, after all, most people living in the United States at the time were used to having a king. But a new generation of Americans objected to this arrangement, dubbing it “King Caucus.” “Under what authority did these men pretend to dictate their nominations?” one citizen asked in 1803. “Do we send members of Congress to cabal once every four years for president?” New states entering the union held conventions to draft state constitutions, in which they adopted more democratic arrangements. This put pressure on old states to revise their own constitutions. By 1824, eighteen out of twenty-four states were holding popular elections for delegates to the Electoral College. Between 1824 and 1828, the electorate grew from fewer than four hundred thousand people to 1.1 million. Men who had attended the constitutional convention in 1787 shook their gray-haired heads and warned that Americans had crowned a new monarch: “King Numbers.”

That king still sits on his throne. “The first principle of our system,” Andrew Jackson, of Tennessee, insisted, is “that the majority is to govern.” The Electoral College couldn’t be undone except by a constitutional amendment. But the legislative caucus could be. The first call for the beheading of King Caucus came in 1822, in the pages of the New York American. Two years later, after the press learned about a caucus meeting to be held in the House, only sixty-six out of two hundred and forty legislators were willing to appear before a disgruntled public, which flooded the galleries shouting, “Adjourn! Adjourn!” And so it did.

The Anti-Masonic Party, formed to end the reign of secret cabals, held the first Presidential-nominating Convention, in September, 1831. Unfortunately, the man chosen as the Party’s nominee turned out to be. . . a Mason. The Anti-Masons left two legacies: the practice of granting to each state delegation a number of votes equal to the size of its delegation in the Electoral College, and the rule by which a nomination requires a three-quarters vote. Other practices have not endured. Two months after the Anti-Masons met, the National Republican Party held a Convention of its own, in which it called on the states not in alphabetical order but in “geographical order,” beginning with Maine, and working down the coast, causing no small amount of consternation among the gentlemen from Alabama. The practice of holding a national Convention might not have endured if Jackson hadn’t decided that the Democratic Party ought to hold one, too. Jackson wanted to boot out his Vice-President, John C. Calhoun, who believed that states had a right to nullify federal laws, a position that Jackson opposed. Jackson and his advisers realized that if they left the nomination to the state legislatures, where Calhoun had a lot of support, they’d be stuck with him again. Jackson contrived to have the New Hampshire legislature call for a national Convention. In 1835, Jackson issued the call for a nominating Convention himself, in an extraordinary letter to the American people:

I consider the true policy of the friends of republican principles to send delegates, fresh from the people, to a general convention, for the purpose of selecting candidates for the presidency and vice-presidency; and, that to impeach that selection before it is made, or to resist it when it is fairly made, as an emanation of executive power, is to assail the virtue of the people, and, in effect, to oppose their right to govern.

The point of this Convention was to assure the nomination of Jackson’s handpicked successor, Martin Van Buren, and to allow Van Buren to contrive for his choice, Richard Johnson, to win the Vice-Presidential nomination. But Tennessee, whose support for Jackson had begun to waver, refused to send a delegation to the Convention, held in Baltimore. With fifteen electors, Tennessee had fifteen votes at the Convention. Unwilling to lose those votes, Van Buren’s convention manager went to a tavern, found a Tennessean named Edward Rucker, who just happened to be in Baltimore, and made him a one-man, fifteen-vote delegation. “Rucker” became a verb.

Populism is very often a very clever swindle. But since 1831, with only one exception—the Whigs in 1836—every major party has nominated its Presidential candidate at a Convention.

There is no end to the ruckery in the annals of American history. “Absolutely rigged,” Trump said about the nomination process in April. “I wouldn’t use the word ‘rigged,’ ” Bernie Sanders said in May. “I think it’s just a dumb process.”

The first party “platform” was adopted at a Convention in 1840, during an election that also introduced more rough-hewn lumber in the form of log cabins. (Whigs paraded them around the country, on wheels.) Platform-committee meetings are chest-thumping contests between warring clans within the parties; in exchange for conceding, defeated candidates tend to have a lot of influence over the platform. Even without having conceded, Sanders won from the DNC additional seats on the platform committee; he then named as his delegates celebrity progressives like Cornel West and Bill McKibben. RNC platform-committee delegates include the conservatives Tony Perkins, the head of the Family Research Council, and David Barton, a Texas evangelical and amateur historian who has lectured for Glenn Beck’s online university; both were supporters of Ted Cruz. This year, the GOP is also crowd-sourcing the committee’s work at platform.gop, asking anyone who visits the site to rank issues about, for instance, the Constitution: Which is more important to you, human life or the Second Amendment?

In 1844, when the incumbent President, John Tyler, found himself without a party, he called for a third-party Convention to nominate him, in order to persuade the Democrats to nominate him at their own Convention. (These and other escapades are recounted by Stan Haynes, the most exhaustive chronicler of the Conventions, in a series of invaluable books.) Tyler campaigned on a promise to annex Texas. Two weeks before the Democratic Convention was to begin, in Baltimore, Jackson called a meeting. Jackson said he wanted “an annexation man, and from the Southwest.” James K. Polk, who was unknown outside Tennessee, became that man. (“I wish I could slay a Mexican,” Henry Clay said four years later, when the names on the ballot were mainly those of generals who had fought in the Mexican-American War.)

One lesson of American Presidential history: You can’t beat somebody with nobody. Desperate, late-in-the-day attempts to draft into the race, say, Mitt Romney are unusual at this point in American history. But running a dark horse was a minor American art form well into the twentieth century. George Bancroft finagled Polk’s nomination by making sure that Polk’s name wasn’t mentioned until the third day of the Convention. “My name must in no event be used until all efforts to harmonize upon one of the candidates already before the public shall have failed,” Franklin Pierce warned when he was the dark horse of the Democratic Convention in 1852. James Garfield, a Republican delegate, made such a good speech, nominating his fellow-Ohioan the uninspiring John Sherman, that Conkling, a New York delegate, handed Garfield a note that read, “New York requests that Ohio’s real candidate and dark horse come forward.” Garfield’s nomination was masterminded by a Philadelphia banker, who seated Garfield supporters at strategic sites around the hall so that, from his seat on the stage, he could cue them to greet Garfield with perfectly timed ovations.

“Every attempt to abridge the privilege of becoming citizens. . . ought to be resisted,” the Democratic Party pledged, in 1856, countering the Know-Nothings, whose motto was “Americans Must Rule America,” and whose platform consisted of a resolution discouraging the election of anyone not born in the United States to any office, of any kind. That wave of nativism passed, only to be replaced by efforts to prohibit Chinese immigration. “It is the immediate duty of congress fully to investigate the effects of the immigration and importation of Mongolians on the moral and material interests of the country,” the Republican National Convention resolved in 1876.

Much skulduggery concerns the credentials of delegates. “Why didn’t you nominate Rufus Choate?” began a joke told about the old men who’d been rounded up to serve as delegates at a Convention. (Yes, Choate was dead, but so recently!) Then there’s more ordinary betrayal. In 1876, when the Democrats met in St. Louis—the first time that a Convention was held west of the Mississippi—a delegation opposed to the nomination of the New Yorker Samuel Tilden hung a giant banner from the balcony of the Lindell Hotel. It read “The City of New York, the Largest Democratic City in the Union, Uncompromisingly Opposed to the Nomination of Samuel J. Tilden for the Presidency Because He Cannot Carry the State of New York.” So much for the favorite son.

“We are united,” Henry Clay said, halfheartedly, at one of the Conventions in which he failed to win the nomination. In 1860, at a Democratic Convention held in Baltimore—the second Democratic gathering held that year, since the Southern delegates bolted from the first one—an American flag was adorned with the motto “We Will Support the Nominee.” That Convention required delegates to take a loyalty pledge: “Every person occupying a seat in this convention is bound in good honor and good faith to abide by the action of this convention, and support its nominee.” This happened again in 1948, when Southerners bolted from the Democratic Convention over civil rights, and held their own Convention, as the Dixiecrat Party, whose platform included this statement: “We stand for the segregation of the races and the racial integrity of each race.” After that, Democrats called for delegates to take a loyalty pledge. The Dixiecrat defection also contributed to the Democrats’ adoption, in 1956, of a bonus system, awarding extra votes to delegates from states that had voted for the Party nominee in the previous election.

These traditions are why Trump was asked, at the first GOP debate of this primary season, whether he would support the eventual Republican nominee. They’re also why so many Democrats lost patience with Sanders for remaining in the race. (Trump says that Sanders is waiting for “the FBI Convention,” which is Trump’s way of suggesting that Clinton will be indicted before the Democrats meet in Philadelphia.) Second-placers often hanker for an old-fashioned, contested Convention. For a while, Trump wanted one, too, but, when Cruz stepped down, Trump changed his mind: no one wants to contest what’s already won. At that point, the Indiana attorney Joshua Claybourn gave up his seat as a GOP delegate. “Party rules would require I vote for Donald Trump,” Claybourn explained. “I choose not to let that happen.”

The rise of the primary was a triumph for Progressive reformers, who believed that primaries would make elections more accountable to the will of the people. That didn’t quite come to pass. Instead, primaries became part of the Jim Crow-era disenfranchisement of newer members of the electorate. Frederick Douglass addressed Republicans at a Convention in Cincinnati in 1876, asking, “The question now is, Do you mean to make good to us the promises in your constitution?” Sarah Spencer, of the National Woman Suffrage Association, was less well received at that Convention, which marked the centennial of the Declaration of Independence. “In this bright new century, let me ask you to win to your side the women of the United States,” Spencer said. She was hissed. In 1880, Blanche K. Bruce—a former slave, a delegate from Mississippi, and a U.S. senator—served as an honorary vice-president of the Republican Convention, and wielded the gavel.

The end of Reconstruction saw the rise of the secret ballot, which, by effectively introducing a literacy requirement, disenfranchised black men. If the Emancipation Proclamation ended the electoral advantage granted to Southern whites by the three-fifths clause, the secret ballot restored it. In Louisiana, black-voter registration dropped from 130,000 in 1898 to 5,300 in 1908 to 730 in 1910. But the real racial recount came with the rise of the primaries; the reform began to gain strength in 1905. The election of 1912 was the first in which a significant number of delegates to the nominating Conventions were elected in state primaries, as Geoffrey Cowan writes in Let the People Rule (2016), a book that takes its title from Theodore Roosevelt’s campaign slogan. Roosevelt wanted to wrest the Republican nomination from the incumbent President, William Taft, and saw the primaries as his only chance. “The great fundamental issue now before the Republican Party and before our people can be stated briefly,” he said. “It is: Are the American people fit to govern themselves, to rule themselves, to control themselves? I believe they are. My opponents do not.”

Thirteen states held primaries; Roosevelt won nine. Still, winning the Convention was another matter, since the primaries weren’t binding. By 1912, blacks had been so wholly disenfranchised in the South, and the South was so wholly Democratic, that most of the Southern delegates to the Republican Convention were black men who had been appointed to Party offices by the Taft Administration. Roosevelt needed their votes and tried to court them. “I like the Negro race,” he said in a speech at an AME church, the day before the Convention. But the next day the New York Times reported on affidavits alleging that Roosevelt’s campaign had attempted to bribe black delegates. Roosevelt lost the nomination to Taft. He then formed the Progressive Party, whose Convention refused to seat black delegates. “This is strictly a white man’s party,” said one of Roosevelt’s supporters, a leader of what became known as the Lily Whites. In the general election, Roosevelt and Taft split the Republican vote, allowing Woodrow Wilson to gain the Oval Office, where, as W. E. B. Du Bois remarked, he introduced “the greatest flood of bills proposing discriminatory legislation against Negroes that has ever been introduced into an American Congress.”

Party leaders ignored primaries for as long as they could. Beginning in the nineteen-thirties, they instead used public-opinion polls to gauge the prospects of their candidates. Candidates who sought out primaries tended to be weak ones. In 1952, Estes Kefauver entered and won twelve of fifteen primaries; it didn’t matter. At the Democratic Convention, he lost on the third ballot, to Adlai Stevenson, who hadn’t run in a single primary. That same year, Robert Taft won six primaries to Dwight Eisenhower’s five. It didn’t matter; at the Republican Convention, the Party went for Eisenhower, who was leading in the polls. John F. Kennedy needed to win primaries to demonstrate to the Party that voters didn’t mind that he was Catholic. Barry Goldwater bypassed the primaries but won the nomination because the delegates to the 1964 Convention fell for him. “This Nation and its people are freedom’s model in a searching world,” he said, accepting the nomination. Another lesson of American Presidential history: Beware of candidates who flatter the people.

Nominating Conventions are extra-legal, and attempted reforms have often been deemed unconstitutional. The rules set by each Convention are essentially peace treaties negotiated between the parties and the voters. It falls to both sides to accept the terms of the peace.

“The invitation to violence arises because partisanship in its most intense forms contests the very basis of a political community,” the political scientist Russell Muirhead has observed. The basis of that community, he argues, is a trio of political settlements, each achieved by violence: the rejection of monarchic rule through the acceptance of the idea of the consent of the governed; the rejection of religious intolerance through the acceptance of freedom of conscience; and the rejection of slavery through the acceptance of political equality. This election season, all three of those fundamental settlements have become, to varying degrees, unsettled. “The will of the people is crap,” the influential conservative Erick Erickson wrote, about Trump’s primary victories. Trump has called for a religious test for immigrants, in order to ban Muslims. And the argument of the Black Lives Matter movement is that political equality was never settled in the first place.

The protests at the Democratic Convention in Chicago in 1968 resulted in a change in the balance of power between the primaries and the Conventions: before 1968, primaries hardly mattered; since 1968, the Conventions have hardly mattered. A report issued in 1968 predicted that “instantaneous polls of the entire electorate” conducted by “central computers from every home” would make nominating Conventions obsolete, which has, in fact, happened. That’s the de-facto change, but the de-jure change is that the primaries became binding.

After the chaos of 1968, political reformers called for the abolition of the nominating Convention, to be replaced by a national primary, and the American Bar Association called for the abolition of the Electoral College, to be replaced by direct, popular election. These proposals, which had been made before and have been made since, have a ready appeal. The nominating Convention is a messy and often ugly accident of history. “No American political institution is more visible than the convention, or more often visibly shoddy,” the constitutional scholar Alexander Bickel admitted. But changing the structure of government carries its own dangers, Bickel insisted: “The sudden abandonment of institutions is an act that reverberates in ways no one can predict, and many come to regret. There may be a time when societies can digest radical structural change, when they are young and pliant, relatively small, containable, and readily understandable; when men can watch the scenery shift without losing their sense of direction. We are not such a society.”

The loss of direction that Bickel warned of has come to pass, even without radical change. Instead, there’s been incremental change. The rules have changed, and changed, and changed. The parties change the rules when they lose, with an eye toward winning the next time around. There’s no grand plan; there’s a plan to win in four years’ time. The rule changes since 1968 have made the primaries more binding, notwithstanding the argument that they violate the 1965 Voting Rights Act (since the course of events is disproportionately determined by the very nearly all-white states of New Hampshire and Iowa). The system, as it stands, rewards political extremism, exacerbates the influence of money in elections, amplifies the distorting effects of polls, and contributes to political polarization. Debatable, but often asserted, is that it also produces poor candidates and ineffective Presidents.

Since 1968, no one in either party has successfully defeated at the Convention the candidate who won a plurality of the primaries and the caucuses. In 1972, George McGovern, who’d chaired the Democratic commission that rewrote the Party’s delegate-selection rules, won its nomination despite an “Anybody but McGovern” challenge at the Convention, in Miami. McGovern lost to Nixon in a landslide: he carried just one state. In 1976, at the GOP Convention, in Kansas City, Ronald Reagan challenged Gerald Ford and, very narrowly, lost. Jimmy Carter, who’d won a lot of primaries, won the Democratic nomination and even the election, but after his failed Presidency many Democrats regretted binding their delegates to the primaries. In 1980, at the Democratic National Convention, in New York City, Ted Kennedy tried to challenge Carter but was defeated by the rules. That’s why, in 1984, the DNC invented superdelegates, high-status Party officials who are pledged to no one candidate. This year, a lot of Republicans are regretting binding their delegates to the primaries. The rules committee meets the week before the Convention. Hundreds of anti-Trump Republicans have formed an organization called Free the Delegates and begun plotting a strategy to block his nomination by adding a “conscience clause” to the rules, unbinding the delegates. Paul Ryan said that he wouldn’t object: “It’s not my job to tell delegates what to do.” This tactic has been tried before. A savvy souvenir collector could even hawk on the streets of Cleveland the buttons that Kennedy supporters wore in 1980, which read “FREE THE DELEGATES.”

Mencken said that going to a Convention was something between attending a revival and watching a hanging. Going to this year’s Conventions could feel more like getting trapped in a forest fire. The Cleveland Police Department has stocked up on riot gear. Protesters are expected at both Conventions, in droves, if, generally, in clothes. Much of the sense of foreboding is a production of the press, and especially of Twitter, each Tweet another match lit on the pyre of the republic. But part of the foreboding is founded. Trump renounced violence only after inciting it. “It goes without saying that I condemn any and all forms of violence,” Sanders said in a statement that included a lot of “but”s.

No nomination is ever entirely uncontested; the only question is what form the contest will take—sound or fury. The gavel used at the 1880 Republican Convention had a handle made of cane grown at Mount Vernon and a head made of wood taken from the doorway of Abraham Lincoln’s house in Springfield. American elections are makeshift. Another gavel will rap in Cleveland, on July 18th, calling the Convention to order. The people remain as unruly as ever. Ω

[Jill Lepore is the David Woods Kemper '41 Professor of American History at Harvard University as well as the chair of the History and Literature Program. She also is a staff writer at The New Yorker. Her latest books are The Story of America: Essays on Origins (2012), Book of Ages: The Life and Opinions of Jane Franklin (2013), and The Secret History of Wonder Woman (2014). Lepore earned her BA (English) from Tufts University, an MA (American culture) from the University of Michigan, and a PhD (American studies) from Yale University.]

Copyright © 2016 The New Yorker/Condé Nast Digital



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 Sapper's (Fair & Balanced) Rants & Raves