Friday, September 30, 2016

Amid The Sound & Fury Of 2016, Here Is An Alternative View Of POTUS 44's Legacy

Harvard professor James Kloppenberg has written an elegant summation of the two terms of POTUS 44. Little more can be added. If this is (fair & balanced) graceful historical writing, so be it.

[x CHE]
How Obama Sees America
By James T. Kloppenberg


As the presidency of Barack Obama reaches its end, assessments of his legacy are bound to proliferate. Historians, of course, resist the rush to judgment; such evaluations require the passage of time and the slow emergence of perspective. Even so, certain dimensions of the legacy of Obama’s presidency have already become clear.

Obama did not spring full-blown from the mind of Zeus. This man of African and American heritage quite consciously fashioned himself and his political program from materials he inherited from the American past. As he made abundantly clear in The Audacity of Hope (2006), he shares the convictions of many social democrats who preceded him. First, he believes in what he calls "ordered liberty," a conception of freedom that involves not only the absence of illegitimate restraint but also — and equally important — both an awareness of the need to internalize moral and civic laws, and a commitment to securing the conditions necessary for the exercise of that liberty. Second, he believes in equality — not merely the empty and formal equality of opportunity that all Americans are said to enjoy, but social and economic equality of the kind that most white Americans experienced from the mid-1940s through the early 1970s, a period of rapid economic growth and low inequality.

Third, Obama believes that the tradition of interpreting the Constitution, which he describes as a "nation arguing with its conscience," reflects a solid commitment to the idea that law should be alive rather than fixed — that we must continue to adapt our laws in response to our changing convictions. Fourth, Obama believes in government regulation of the economy, a graduated income tax, and a robust safety net, all of which are necessary to advance the common good and protect against the emergence of an aristocracy of wealth inimical to a democracy. Finally, he believes in securing, at last, the promise of equal treatment for all Americans contained in the civil-rights legislation of the mid-1960s.

But there is a difference that separates Obama from his predecessors. Whereas earlier generations of Americans committed to similar ideals and programs usually considered them either consistent with God’s will or mandated by reason, Obama understands that they are simply part of our cultural and historical inheritance. They are what we have decided that we value, and they are no less precious to us as a result. That claim is contentious. It is also complicated.

As the first American president to come of age after the cultural and intellectual revolutions of the 1960s, Obama showed, not only in his memoir Dreams From My Father (1995, 2004) but in much of what he has written, that he realizes there are no universal, timeless, and fixed truths that all humans must embrace. Instead, like the late 19th- and early 20th-century American pragmatist philosophers William James and John Dewey, and like cultural anthropologists (such as Obama’s mother), Obama understands that all knowledge is particular, historically situated, and provisional. In contrast to the postwar generation of Americans who embraced the United Nations and its Universal Declaration of Human Rights, Obama belongs to a chastened, skeptical generation that might share such ideals but doubts they can be shown to rest on the bedrock of divine law or rationality or applied without difficulty to every society. Instead they are contingent and historically conditioned cultural products, hypotheses to be tested in practice rather than doctrines to be affirmed as sacred.

The evidence of Obama’s commitments to this cluster of ideas — which you might call "anti-foundationalist" or, to use the older term associated with the philosophical tradition of James and Dewey, "pragmatist" — is scattered throughout Obama’s writings and speeches. Among the consequences of this sensibility, consistent with an awareness of the limits of one’s knowledge, is an awareness that all perspectives are partial, and all of our beliefs should be scrutinized according to the best available evidence.

To those for whom such an approach to politics is synonymous with spinelessness, and those who believe that all success comes from unyielding adherence to unchanging principles, pragmatists will always appear too quick to compromise and too cowardly to stick to their guns. The United States contains many people, on the left and the right, who share that view. Obama does not.

Almost everyone who has known Obama agrees that an openness to differing perspectives is among the defining features of his character. During his presidency, it has manifested itself repeatedly. His willingness to compromise with conservatives has left him vulnerable to critics (though I believe, in time, he will be counted among the most effective reformers of the last century and a half).

Those who want to grasp President Obama’s contribution to American cultural history should simply read or, even better, listen to his words. One might start with his first inaugural and end with his speech at the Democratic National Convention in Philadelphia this summer, or review his eight State of the Union addresses. But I think four speeches, from the last two years of his presidency, stand out as emblematic of his defining qualities.

The first came in Selma, AL, in March 2015, 50 years after the police attacked marchers on the Edmund Pettus Bridge. Obama observed that Selma was among the places "where this nation’s destiny has been decided," when "the stain of slavery and anguish of civil war; the yoke of segregation and tyranny of Jim Crow; the death of four little girls in Birmingham; and the dream of a Baptist preacher" all collided on that bridge. That "clash of wills" helped determine that "the true meaning of America" would be "an inclusive America." Although victory was neither "preordained" nor "complete," the marchers "proved that nonviolent change is possible, that love and hope can conquer hate" if enough ordinary people come together to shape the future. "What greater expression of faith in the American experiment than this, what greater form of patriotism is there than the belief that America is not yet finished, that we are strong enough to be self critical, that each successive generation can look upon our imperfections and decide that it is in our power to remake this nation to more closely align with our highest ideals?" The marchers in Selma helped open "the doors of opportunity" for all Americans, and reminded us that America is "a constant work in progress" and that "our work is never done."

Obama then struck a note he had been sounding since the spring of 2008, when he gave his famous "race speech" in Philadelphia in the middle of the presidential campaign. He rejected the notion that "racial division is inherent to America," adding, "If you think nothing’s changed in the past 50 years, ask somebody who lived through the Selma or Chicago or Los Angeles of the 1950s. Ask the female CEO … [or] your gay friend." Although "this nation’s racial history still casts its long shadow upon us," he insisted, the changes are real.

A few months later, in June, President Obama delivered a speech in Charleston, SC, in memory of the pastor and eight members of the Emanuel AME Church congregation murdered by a white supremacist. The pastor, the Reverend [Mr.] Clementa Pinckney, "embodied the idea that our Christian faith demands deeds and not just words" and "that to put our faith in action is more than individual salvation, it’s about our collective salvation."

Obama has repeatedly shown that he believes the Judeo-Christian tradition remains a vibrant source of moral standards as well as a useful guide for civic action, and his account of the history of the Mother Emanuel church, once destroyed because it was founded by a foe of slavery, underscored that point. Black churches harbored runaway slaves and nurtured civil-rights workers; they "have been, and continue to be, community centers where we organize for jobs and justice; places of scholarship and network; places where children are loved and fed and kept out of harm’s way." Black churches have also been targets, and the most recent murder descended from a long history of violence used "to terrorize and oppress." Such acts were intended to "incite fear and recrimination," thereby deepening racial divisions. But the murderer could not have expected that "the families of the fallen would respond when they saw him in court — in the midst of unspeakable grief, with words of forgiveness."

Searching for some significance in the slaughter, Obama focused on the South Carolina governor’s call for the removal of the Confederate flag from state Capitol grounds. That act by Nikki Haley represented "one step in an honest accounting of America’s history" and registered "the amazing changes that have transformed this state and this country for the better because of the work of so many people of goodwill, people of all races striving to form a more perfect union." Reverend Pinckney understood, President Obama concluded, that "justice grows out of recognition of ourselves in each other," that any person’s liberty depends on the liberty of others, and that "the path of grace involves an open mind — but, more importantly, an open heart." Near the end of his speech, he began, at first tentatively, and then with growing confidence, to sing the hymn "Amazing Grace," and the congregation joined him. I doubt those who heard it will forget it.

In February, Obama appeared at the state legislature of Illinois, where his political career began. He began by joking with his former colleagues about his mistakes as a fledgling legislator. Then he turned serious. Among the few regrets of his presidency, he confessed, was "my inability to reduce the polarization and meanness in our politics." He had been unable to "translate" to Washington what had been possible in Springfield. When citizens realize that the consequence of hyperpartisanship is stalemate, it "makes them cynical. And when that happens, more powerful and extreme voices fill the void." He continued: "When either side makes blanket promises to their base that it can’t possibly meet," such as cutting taxes but not services, or waging war without requiring sacrifices, or "bashing" either unions or corporations without "acknowledging that both workers and business make our economy run — that kind of politics means that the supporters will be perennially disappointed." Obama insisted that, without abandoning "our most deeply held principles," it was still possible "to forge compromises in pursuit of a larger goal."

Those ideas were amplified in his commencement address at Howard University in May. Speaking to an audience attuned to the anger recently sparked by police shootings, Obama acknowledged that African-Americans had plenty of reasons to be outraged. In typical fashion, though, he asked the graduates and their families to see other sides of American life. "We must expand our moral imaginations," he said, to encompass "the middle-aged white guy who you may think has all the advantages." In recent decades, that man’s world had been "upended by economic and cultural and technological change" that he felt "powerless to stop. … You got to get in his head, too." Democracy only works, the president insisted, when people respect those who disagree with them. When we shout, we cannot hear, and change "requires listening to those with whom you disagree, and being prepared to compromise." Certainty is not enough; in fact, it’s counterproductive. "If you think that the only way forward is to be as uncompromising as possible, you will feel good about yourself, you will enjoy a certain moral purity, but you’re not going to get what you want."

As he has done throughout his career, Obama has tried to embody that counsel himself. In recent years, the understandable fury over police shootings has prompted some in the Black Lives Matter movement to insist that anger and intransigence are the only appropriate responses to a system stacked against them. But as the young African-American political theorist Brandon M. Terry, an assistant professor at Harvard, wrote recently in Dissent, the increasing militancy of some black writers and activists has become part of the problem. Terry advises developing "an ethos of humility and self-criticism that, over time, will generate more powerful ideas, arguments, and hopefully, coalitions. Trust and respect — and substantive political power — will come only from a mutually enriching process of engaging with and arguing over needs (like safety, income, and education) and values (that is, the ethics of punishment, ideals of masculinity, nativism, and so on) as well as policies."

At a moment when most opinion polls find, contrary to predictions eight years ago, that blacks and whites consider race relations worse now than they were when Obama was elected, it seems clear that his most important long-term contribution to U.S. history would be to persuade Americans, against all odds, to adopt that ethic of reciprocity — an unfinished mission. Democracy in America remains a work in progress, a quest to make one out of many. The nation will move in that direction, however, only if we can recover the expansive ideals, the rejection of dogmatism, and the commitment to finding common ground that propelled Barack Obama to national prominence in the first place. Ω

[James T. Kloppenberg is the Charles Warren Professor of American History at Harvard University. His recent books include Toward Democracy: The Struggle for Self-Rule in European and American Thought (2016) and Reading Obama: Dreams, Hope, and the American Political Tradition (2010). Kloppenberg received an AB, summa cum laude (history) from Dartmouth College and both an MA and PhD (history) from Stanford University.]

Copyright © 2016 The Chronicle of Higher Education

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 Sapper's (Fair & Balanced) Rants & Raves

Thursday, September 29, 2016

J.D. Vance Is Right On — ALL Of Us (Without Exception) Are Marked By The Stain Of Hypocrisy

J. D. Vance, a California attorney, hit this summer's best-seller lists with his memoir of growing up in Middletown, a factory town in southwest Ohio. Vance grew up in a family that migrated from Appalachian Kentucky to better-paying work (for a time) in Middletown. He graduated from Middletown High School and enlisted in the US Marine Corps to serve in a non-combat deployment in Iraq (2003-2007). Upon his discharge from the Marines, Vance enrolled in The Ohio State University and — after graduation — was admitted to the Law School of Yale University. In today's essay, Vance demonstrates his loyalty to "them what brung him" — working-class folk who might be called "white trash" in some quarters. If this is a (fair & balanced) conversation about race and class in the United States, so be it.

[x NY Fishwrap]
When It Comes To Baskets, We’re All Deplorable
By J. D. Vance


It was the awkward comment heard round the world. At a fund-raiser earlier this month, the Democratic presidential nominee, Hillary Clinton, divided the supporters of her Republican opponent Donald J. Trump into two even groups. One consisted of good, if alienated and dispossessed, people. But the other half goes into a “basket of deplorables,” she said. “The racist, sexist, homophobic, xenophobic, Islamophobic — you name it.”

The ensuing reaction to her comments is a case study in everything wrong with our political discourse. Mr. Trump — who still hasn’t apologized for suggesting that a disproportionate share of Mexican immigrants are rapists and criminals — demanded an apology. Meanwhile, many on the left came to her defense: The remark might have been politically inept, many said, but it was true.

These commentators often base their arguments on polls that paint many Republicans in an unflattering light: About one-third of conservatives believe that Barack Obama is a Muslim, and more than half doubt whether he was born in the United States. According to one Reuters poll, about half of Mr. Trump’s supporters say that blacks are “more violent” than whites, while approximately 40 percent see blacks as “lazier” than other races.

These views are undoubtedly deplorable, and we all have a responsibility to confront them. But if Mrs. Clinton had said that half of Mr. Trump’s supporters hold some prejudicial views and left it there, we probably wouldn’t be talking about the comment today. Her sin was to collapse millions of people — from former Klansmen like David Duke to a struggling coal miner with some unacceptable opinions — into the same group of social outcasts.

It’s difficult in the abstract to appreciate that those with morally objectionable viewpoints can still be good people. This perhaps explains why Mrs. Clinton showed considerably less charity than did Mr. Obama as a candidate in a widely praised 2008 speech on race. In one particularly personal passage, he spoke about his white grandmother — an imperfect, but fundamentally good, woman, “a woman who helped raise me, a woman who sacrificed again and again for me, a woman who loves me as much as she loves anything in this world, but a woman who once confessed her fear of black men who passed by her on the street, and who on more than one occasion has uttered racial or ethnic stereotypes that made me cringe.”

If a pollster had called Mr. Obama’s grandmother and asked her questions about race, religion and sexuality, she almost certainly would have proffered at least one prejudicial view. The data tells us that she wouldn’t be alone. In a recent poll, about 40 percent of Democratic voters supported temporarily barring Muslims from entering the country. Large shares of black voters express some uneasiness with homosexual behavior, an opinion common among religious people of all races but undoubtedly unwelcome in cosmopolitan elite circles of the Democratic Party. The same poll that found that 40 percent of Mr. Trump’s supporters viewed blacks as lazier revealed that 25 percent of Mrs. Clinton’s supporters believed the same thing. Perhaps these people should also join Mrs. Clinton’s deplorable basket.

There’s no reason to limit basket-worthiness to those with explicit prejudices. For decades, scholars have studied the ways in which implicit biases affect how we perceive other people in this multiethnic society of ours. The data consistently shows that about 90 percent of us possess some implicit prejudices — and, unsurprisingly, people typically favor their own group. Layer on top of that the many people unwilling to speak about their prejudices with a pollster, and a picture emerges of a nation where a significant majority of the country harbors some type of bias.

There are many ways to confront the people of that nation in all its complexity. We can ignore that these biases exist, and pretend that our uniquely diverse society need never address the difficult questions posed by that diversity. This is the path chosen by far too many of my fellow conservatives.

We can deem a significant chunk of our populace unrepentant bigots, which appears to be the strategy of Mrs. Clinton and much of the left.

Or we can recognize that most of us fall into another basket altogether: One where prejudice — even implicit — coexists with incredible compassion and decency. In that basket is the black preacher who may view homosexuality as a little icky even as he lovingly ministers to struggling gay members of his church. The adoptive parent of a child born in Asia, who pours her heart and soul into her child’s well-being even as she tells a pollster that she doesn’t much care about America’s experience with Japanese internment. And in that basket is a white grandmother who speaks ill of black people even as she gives her beloved African-American grandson the emotional support and love that enable him to become the president of all Americans.

We can and should recognize the bad in that basket even as we celebrate the good. We must have the courage to confront dreadful views even in the people we love the most. But that’s difficult to do when we cast large segments of our fellow citizens into a basket to be condemned and disparaged, judging them even as we ignore that many of their deplorable traits exist in us, too. Ω

[J. D. Vance is an attorney with a global investment firm in San Francisco. His first book is Hillbilly Elegy: A Memoir of a Family and Culture in Crisis (2016). Vance received a BA, summa cum laude (political science and philosophy) from The Ohio State University and a JD from the Law School of Yale University.]

Copyright © 2016 The New York Times Company

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 Sapper's (Fair & Balanced) Rants & Raves

Wednesday, September 28, 2016

The Joy (& Heartbreak) Of Childhood Reading

Adam Gidwitz, author of several children's books, took this blogger back to a childhood that was notable for one thing — reading. According to family lore, this blogger spent hours as a toddler — in the care of his maternal grandmother during WWII — sitting on the living-room sofa with a magazine. Supposedly the blogger looked at each and every page while lacking the ability to read. Looking... for what? Later in his childhood, when the blogger was tasked with spreading newspapers on the floor to protect against spillage, the kneeling blogger would be reading the newspaper pages as he spread them on the floor. Without fail, the last thing that the blogger does at night is read in bed until his eyes grow weary. If this is (fair & balanced) self-diagnosis of compulsive reading, so be it.

[x Aeon]
Books For Life
By Adam Gidwitz


How does a book get on The New York Times bestsellers lists? For those outside the publishing industry, the question seems tautological. You get on a bestseller list by being among the top 10 bestselling books in your category. Obviously.

With certain caveats, that’s true. (Among those caveats — you can’t be a perennial bestseller, like the Bible: it would be number one on the list every week.) But that’s like answering the question: ‘How do you get to Carnegie Hall?’ with: "Take the Q to 57th Street." Yeah, sure. But I want to know how you get on stage.

So how does a book achieve a level of sales that grants access to the list? Well, there are certainly tricks that publishers and authors use to boost sales at just the right moment, or in just the right way, to register on the bestseller lists. There are many, many books that were on a list for one week or two weeks and then dropped off, never to return again. They benefited, in all likelihood, from one of those tricks.

But there are other books that climb on to a list and seem to just stay there. How does that happen? That is no publisher’s trick. Neither savvy marketing, nor blanket marketing, nor really any kind of marketing can produce that kind of success. Books stay on a bestseller list for months at a time because people actually like them. They benefit from the exponential power of word of mouth. Albert Einstein reportedly said that compound interest is the most powerful force in the Universe. A publisher would disagree. Word of mouth is the most powerful force in the Universe, because word of mouth also benefits from an exponential model of growth: one person tells three people how good a book is, and they each tell three people, who each tell three more, and pretty soon that book is ensconced so firmly on a bestseller list that the list might as well be etched on stone tablets. That’s how you really get on the bestseller list.

But how do you write a book like that? No one knows.

Well, maybe the airport thriller-writers of the world know; they seem to produce bestseller after bestseller. But they write books that are the literary equivalent of the Candy Crush Saga video game, providing tiny dopamine hits with every swipe of your finger (or turn of the page, if you still read on paper). Just as few people say that they love Candy Crush, very few people say that they love such books. Doing something over and over doesn’t necessarily mean you love it.

For those of us who strive to write novels that are unique, and literary, and still bestsellers — books that people talk about because they love them — well, how to consistently do that is still very much a mystery.

You know what else is a mystery? My daughter. When she was born, she was a cipher. An unreadable blob of so-soft-it’s-almost-edible flesh. When she was about four months old, though, she became a very effective communicator. When she was hungry, she’d shriek like a banshee. When she was tired, she also shrieked. When she was bored or physically uncomfortable, she shrieked. She wasn’t crying. She was shrieking, and it was louder, more high-pitched and more sudden than any other sound I’d ever heard a human make. It was painful to the ears and the heartstrings, which made it a remarkably effective form of communication.

She shrieked to tell us just four things — fatigue, hunger, discomfort, boredom. So I was left wondering a great deal about her inner life. When I hold her in front of a mirror, what does she think? When she is gazing into space, how is she processing all that she has been exposed to? And when she shrieks particularly loudly, is she more upset? Or is she just experimenting with varying her mode of communication?

I don’t think I’ll ever know the answers to these questions. But, luckily, her psyche will become more and more transparent as each month goes by, because in each month she will become more and more interested in books. Our selection of books, from earliest childhood through to the ends of our lives, provides a window into our secret souls. And children, who (necessarily, for survival purposes) are most in touch with their desires and drives, tell us most about their inner lives when they tell us which books they want to read.

Sigmund Freud said a lot of crazy things, but one of his most compelling insights was that the mind is like the city of Rome. Each age has its own architecture, its own monuments, built on top of those from the previous ages. But instead of knocking down those monuments to an older time and replacing them, the mind preserves each landmark. Some, like the Colosseum, are more obvious, while others are hidden in the shadows of Palatine Hill. Even more completely than Rome, each adult keeps the landscape of her childhood intact. If you want to understand that childhood landscape, the foundations on which a person’s life is built, ask her what her favourite books were as a child.

As a young boy, I loved The Carrot Seed (1945, 2004), written by Ruth Krauss and illustrated by Crockett Johnson. The text is so simple I can quote it in its entirety:

A little boy planted a carrot seed.

His mother said: ‘I’m afraid it won’t come up.’

His father said: ‘I’m afraid it won’t come up.’

And his big brother said: ‘It won’t come up.’

Every day the little boy pulled up the weeds around the seed and sprinkled the ground with water.

But nothing came up.

And nothing came up.

Everyone kept saying it wouldn’t come up.

But he still pulled up the weeds around it every day.

And sprinkled the ground with water.

And then one day

A carrot came up.

Just as the little boy had known it would.

I was obsessed with this book as a young child. I demanded it over and over, more than any other book. It’s not hard to see why. The writing is simple and subtle. The brother’s curt ‘It won’t come up’ cuts deep, for example. Since I didn’t have a brother until I was nearly five years old, I remember associating that line with my father. In my psyche, he got two putdowns, the second sharper than the first.

The illustrations are no less brilliant than the text. When the carrot finally does sprout, its green fronds are bigger than the boy protagonist, and he carts the carrot away in a wheelbarrow because it is bigger than he is, and (if the laws of physics are obeyed in picture books) must weigh at least 50 pounds. Not only was his family wrong to doubt him, his triumph is supernaturally enormous.

I will skip the Freudian interpretation of the carrot as a penis, though in shape, colour and placement it’s an easy argument to make — especially since the book came out of the culture of publishing in New York City in the same year that Alfred Hitchcock’s Freudian ode "Spellbound" was released.

More interesting than the phallic carrot, though, is that the family doesn’t reappear to see the boy’s triumph. Unlike in popular US films, where all the doubters have to watch the hero’s victory and, despite everything, stand and applaud (cf every movie that ends with either a sports event or a prom), the protagonist of The Carrot Seed doesn’t need his family to tell him that the carrot came up. He knows it did. It’s enormous, and vaguely purple, and is way bigger than his dad’s... carrot. If you know what I mean.

So what does it say about me that I loved this book as a child enough to have kept my personal copy at my elbow through college, my first job, my transition to my second career (writing), and now into parenthood?

My father, despite being a wise and kind man, seemed incapable of not competing with me, his eldest child. My mother says this started pretty much from birth, which was the moment that his preeminence in the household was threatened. I loved my father totally, and he loved me too. We played together. He encouraged me in sports — which weren’t one of his strengths. But any new knowledge I acquired and shared had to be either contradicted or upstaged by him. My little brother, once he came along, was always ‘Right, in a way’. My answers, on the other hand, never seemed good enough. Even, to my fury, when I was repeating verbatim facts that my father had taught me earlier.

When my brother was old enough, the three ‘boys’ of the house would play Monopoly. (My mother refused to play with us, because she was always tempted to give my little brother money, which my father and I found unacceptable). The only viable long-term strategy in Monopoly is to buy every property one lands on, to prevent one’s opponents from getting monopolies. Then, once all the properties have been divvied up, if no one has a monopoly, the game will either go on forever, or the players will have to agree to a mutually acceptable trade.

But we three boys were so afraid of losing that we would negotiate over our trade for 30 minutes, 40 minutes, an hour. Often, the negotiations ended with either my brother or me in tears. Soon, the stalemates became so excruciating that we agreed to stop playing Monopoly altogether. Years later, when I was 15 and my father was 56, we were driving in his car. I said to my father: ‘We should all play Monopoly again.’ He responded: ‘What’s the point? It’s impossible for anybody to win.’ I said: ‘The point is to be together.’ He stared through the windshield for a moment. I waited for his response. Finally, he said: ‘You know, that never occurred to me.’

I have a close friend whose favourite book as a child was The Runaway Bunny (1942, 2005) by Margaret Wise Brown, illustrated by Clement Hurd. Its first page reads:

Once there was a little bunny who wanted to run away. So he said to his mother: "I am running away."

"If you run away," said his mother, "I will run after you. For you are my little bunny."

On each subsequent page, the little bunny fantasises about different ways in which he could transform himself and escape his mother. But, like a game of rock-paper-scissors, for each transformation the bunny proposes, his mother has a counter. "If you run after me... I will become a fish in a trout stream and I will swim away from you." "If you become a fish in a trout stream," says his mother, "I will become a fisherman and fish for you."

On and on this game goes, until ultimately the bunny proposes turning into a little boy and running into a house. "If you become a little boy and run into a house… I will become your mother and catch you in my arms and hug you." At which point the bunny replies: "Aw shucks... I might just as well stay where I am and be your little bunny."

I never thought much about the meaning of The Runaway Bunny until I learned that it was my friend’s absolute, bar-none, ask-for-it-every-night favourite book as a child, at which point I burst out laughing. No book could suit him more. Where his mother was concerned, he was always a rebel. Once, when he was sent to his room as a young boy, he stood at the top of the stairs and shouted at his mother: "I have a penis and you don’t!" As a young adolescent, well before he was legally allowed to drive, he "borrowed" his parents’ car in the middle of the night and drove from the suburbs into New York City.

On a recent Mother’s Day, he gave his mother a card that said: "I don’t have to give you anything because I know you’ll always love me." His mother burst into grateful tears. Whatever you think of that card as a Mother’s Day gift (had I been his mother, I would have been tempted to slap him), the mother-son relationship in The Runaway Bunny describes the dynamic between my friend and his mother pretty well. The book is a map, incomplete of course, of his relationship with his mother for years.

To be clear, I am not claiming that someone’s entire personality can be explained by looking at her childhood library. But often we can, cautiously, gain insight into someone’s personality by analysing the books she loved as a child. With even more caution, we can expand the scope of this analysis to point toward an answer to the question: How does a book get on, and stay on, a bestseller list? What makes thousands of people love a book enough to initiate that magical process of word-of-mouth, catapulting it onto the list and keeping it there? Perhaps it is the book that speaks to the inner psychic landscapes of large swaths of the American public.

I don’t have the figures to prove it, but I would guess that the most popular children’s story in the world is Cinderella. Adaptations and retellings of it are ubiquitous: if I tried to list its adaptations in film and literature just over the past decade, I might just break the internet. Cinderella has iterations in almost every culture, from Ancient Egypt and China to 18th-century France.

This should come as no surprise. The story of Cinderella is basically that of a child unnoticed and unvalued by peers and parent-figures. Her "real" parent-figure (in the French version it’s her fairy godmother; in Grimm, it’s the spirit of her dead mother) shows up and enables her to unlock her latent worth, proving the naysayers wrong and allowing her to achieve the greatness she deserves.

Most children feel undervalued sometimes. Many feel impotent and unnoticed. And plenty believe that, if only they were seen clearly, or if only they had an opportunity, they could prove that they are more valuable, worthwhile, beautiful, talented or strong than anyone knew. Everyone, at some point in her life, has felt like Cinderella. And many people feel like Cinderella all the time.

So some people, mostly little girls (thanks to the gendered way that the story is usually told), will identify Cinderella as their favourite story. But many people won’t. Instead, they’ll mention Harry Potter, or 'Star Wars,' or any of the dozens and dozens of Cinderella stories that dominate our bestseller lists and box offices.

Let’s look at Harry, from J K Rowling’s Harry Potter and the Sorcerer's Stone (1997). We know from the outset that he is ‘the boy who lived’, who survived an attack of the darkest magic from the world’s greatest dark wizard and somehow managed, as an infant, to vanquish that wizard. So he’s special. Very special. But no one knows it, because he’s being raised by an ignorant aunt and uncle, along with their brutish son (cf stepmother and stepsisters). But soon, someone comes to rescue him, to take him to the place he’s always meant to be — Hogwarts School of Witchcraft and Wizardry.

There is a wonderful passage in which the half-giant Hagrid, who is rescuing Harry from his horrible aunt Petunia and uncle Vernon Dursley, educates Harry about himself:

‘Do you mean ter tell me,’ [Hagrid] growled at the Dursleys, ‘that this boy — this boy! — knows nothin’ abou’ — about ANYTHING?’

Harry thought this was going a bit far. He had been to school, after all, and his marks weren’t bad.

‘I know some things,’ he said. ‘I can, you know, do maths and stuff.’

But Hagrid simply waved his hand and said: "About our world, I mean. Your world. My world. Yer parents’ world."

‘What world?’

Hagrid looked as though he was about to explode.

‘DURSLEY!’ he boomed.

Uncle Vernon, who had gone very pale, whispered something that sounded like ‘Mimblewimble.’

Hagrid stared wildly at Harry.

"But yeh must know about yer mum and dad," he said. "I mean, they’re famous. You’re famous."

"What? My — my mum and dad weren’t famous, were they?"

"Yeh don’ know… yeh don’ know…" Hagrid ran his fingers through his hair, fixing Harry with a bewildered stare.

"Yeh don’ know what yeh are?" he said finally.

Uncle Vernon suddenly found his voice.

"Stop!" he commanded, "stop right there, sir! I forbid you to tell the boy anything!"

A braver man than Vernon Dursley would have quailed under the furious look Hagrid now gave him; when Hagrid spoke, his every syllable trembled with rage.

"You never told him?... You kept it from him all these years?"

"Kept what from me?" said Harry eagerly.

"STOP! I FORBID YOU!" yelled Uncle Vernon in panic.

Aunt Petunia gave a gasp of horror.

"Ah, go boil yer heads, both of yeh," said Hagrid. "Harry — yer a wizard."

When I first read this passage, I was tutoring a third-grade boy in East Harlem, trying to get him interested in books. Harry Potter had yet to become the phenomenon, at least in the United States, that it would become. I picked it from a bookshelf and read him the first chapter. He wasn’t interested. But after he left, I kept reading, and when I got to this passage, I cried. Tears were streaming down my face in the tiny library of the East Harlem tutorial programme.

The passage still makes me cry. It is Rowling at her best, confirming the promise of Cinderella, confirming the unrecognised (but subconsciously felt) greatness inside the child. Rowling is a genius, and her books will one day be in the "perennial bestseller" class with the Bible, because she tells the Cinderella story so well.

In the Dursley house, Harry is oppressed by his aunt, uncle and cousin’s cruelty, just as Cinderella is by the cruelty of her stepmother and stepsisters. But in a brilliant adaptation of the Cinderella trope, Harry is also oppressed by the Dursleys’ normality. Harry Potter and the Sorcerer’s Stone opens with the line: "Mr and Mrs Dursley, of Number Four Privet Drive, were proud to say that they were perfectly normal, thank you very much." It is the Dursleys’ bloody-minded devotion to all things normal and conventional that makes them hate Harry so much. He is unable to conform because he is special — his magical powers keep manifesting, inadvertently and even unconsciously, driving Vernon and Petunia crazy, and prompting them to punish him with increasingly harsh measures.

Failure to conform is hated. Specialness is hated. Failure to conform and specialness become one. This is the magical adaptation of Harry Potter to the modern world. When you see an adult who adores Harry Potter, who proudly tells you what Hogwarts house she is in, and explains to you the method for determining your own, you are likely speaking to someone who has felt oppressed by the conventionality of her world, and whose Cinderella fantasy is not transforming from an overlooked child into a princess, but rather transforming from a social outsider into a wizard. This is part of the deep psychic appeal of Harry Potter.

Another part of the appeal is the role of parents. In many versions of the Cinderella story, her fairy godmother gives her the clothes she needs to wear to the ball. In the Brothers Grimm version, the magical helpmate is the spirit of Cinderella’s real mother, embodied in a tree growing out of her mother’s grave. In either case, there is a superficial parent figure (the stepmother), who does not see the true value of the child, and a "real" parent figure, who does. Harry’s superficial parent figures are, obviously, the Dursleys. Harry has many "real" parent figures through the course of the series, from his stern stand-in mother Professor McGonagall to his inspiring but distant stand-in father, Albus Dumbledore. But many of the most potent emotional payoffs in the series are when Harry’s real parents return in spirit, like Cinderella’s mother: to protect him, to reveal gifts they gave him long ago or, most movingly of all, to be proud of him. The Harry Potter series is not just for social misfits. It’s for anyone who longs for a parent’s recognition and love.

Cinderella stories appeal to people of all ages. But, as we grow, we need more story types. New psychic pressures come to bear and, to cope with them, we crave new forms of tales. Taking a look at the bestseller list for the past five years reveals that teenagers, or "Young Adults" as they’re called in the biz, gobble up dystopian fiction. Veronica Roth’s Divergent series (2011-13) has done battle with James Dashner’s Maze Runner series (2009-16) at the top of a list which has seen The 5th Wave (2013-16), Legend (2011-13) and others periodically dethrone them. But regardless of which title is on top this week, the bestselling series list has been dominated by dystopian fantasies.

To some degree, dystopian novels have been successful for the better part of the past century. Perhaps the most widely read and beloved is 1984 (1949, 1991) by George Orwell. Aldous Huxley predicted the future more accurately in Brave New World (1932, 2013) – where an authoritarian control structure is rendered unnecessary by the population’s desire to self-sedate (he predicted a drug called "soma"; turns out it was YouTube). But Orwell has captivated and inspired untold millions, even after the fall of his bogeyman, the Soviet Union. His ideas have infiltrated the language. "Orwellian" has become synonymous with dystopian authoritarianism. I have twice witnessed the effect that 1984 can have on a high-school class: first as a student myself and then again as a teacher. Its impact goes beyond political critique. It hits at the psychic level.

You can see why. 1984 is the story of a man, Winston, who is hemmed in on all sides by an authoritarian power. His life is observed and constrained. His thought is observed and constrained. The power structure is trying to teach him to replace the language of his upbringing and the literature he loves with a language that they have contrived in order to control him. What teenager does not feel each of these things? Parents observe and constrain their lives. Peers observe and constrain their thoughts. Teachers try to replace the language of their youth with "appropriate" and "educated" language that operates by rules which strengthen the power of the teachers themselves.

Winston’s rebellion is private and introverted — trying to steal moments away from the telescreens to write in his secret diary, like a teenager putting on her headphones and blasting her music behind a locked bedroom door. Until, that is, Winston begins an affair with Julia. It is only through romance that his rebellion comes to vivid, dangerous life. How completely adolescent! Only when a boyfriend or girlfriend shows up can real independence begin. 1984 is effective as a teen novel not because of its political message, but because it dramatises the internal psychic struggle of growing up.

What makes 1984 immortal is that it ends with a real insight into the futility of teenage rebellion. Every teen becomes an adult, joins the machine, and comes to love Big Brother. Teens hate this fact — many of them hate the ending — but it stays with them, because they know it’s true. In 1984’s sibling, Animal Farm (1945), the animals with whom we identify see their ultimate betrayal: the pigs standing like their former human masters in their former masters’ house. But in 1984 it is Winston himself who gives in at the end. He does love Big Brother, because Big Brother, also known as adult society, has manipulated language, power and thought to perpetuate its system. By being born into it, we have already lost. Depressing, perhaps cynical — but teens see their very lives scribbled on the pages of 1984. It’s more fun to read a dystopian novel with a happy ending. But the novel that you pass on to your children is the one that first taught you the truth.

When a child asks for the same book three hundred times, she is telling her parents what she needs to learn, what she needs to come to terms with. Adults do the same thing. Books are psychologists, using imagination therapy to elicit secrets that their readers did not know they kept. A child does not realise what he reveals when he names this doll "mummy," that one "daddy," and then has them shout at one another. Nor do we tend to realise what we are revealing about ourselves when we push a book into the hands of three friends. Maybe the bestseller list, stripped of the fly-by-night entries and dopamine drips, is a snapshot of the national psyche. It might be telling us what we need to learn, what we are coming to terms with. More certainly, I know one thing: I hope my daughter doesn’t love The Carrot Seed as much as I did. Ω

[Adam Gidwitz is the author of the bestselling children's books A Tale Dark and Grimm (2010), In a Glass Grimmly (2012), and The Grimm Conclusion (2012). More recently, he has written The Inquisitor's Tale: Or, The Three Magical Children and Their Holy Dog (2016). Gidwitz received a BA (English literature) from Columbia University and spent his junior year abroad in the University's Oxford/Cambridge Scholars program.]

Copyright © 2016 Aeon Media Group

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 Sapper's (Fair & Balanced) Rants & Raves

Tuesday, September 27, 2016

Today, Hot Stuff — With A Political Application

In the day after the first presidential debate of 2016, this blog offers a muy caliente alternative to the blather of the talking heads. As Harry S Truman (HST) famously said, "If you can't stand the heat, stay out of the kitchen." Of course, he was talking about air temperature, not the ingestion of a chili pepper. In fact, this essay inspired the blogger to imagine the perfect medium for the Stupid (GOP) nominee for POTUS 45 to demonstrate his appreciation of waterboarding. The fan of waterboarding would be able to demonstrate his reality-TV chops by allowing himself to be strapped to a chair with his head tilted backward, mouth open. A wet towel would be placed over the macho man's face, and then a mixture of one gallon of water and one liter of an alcohol extraction of capsaicin oil would be poured into the open throat of the waterboarding fan. Now, that would be TV that this blogger would watch and record to play over and over. If this is a (fair & balanced) politico-gastro fantasy, so be it.

[x Aeon]
Hot Stuff
By Kendra Pierre-Louis

TagCrowd cloud of the following piece of writing

created at

On drizzly gray Sundays, after a mid-morning stroll has left my bones damp, I nestle under the covers and curl into a ball. Laptop on bedspread, I wrap myself in the screen’s azure glow and scroll through images of the vox populi in search of warmth, flavour and spice. In short, I spend hours watching YouTube videos of people eating chili peppers.

The standard form is this: the vlogger, glass of milk within easy reach, holds a single dried pepper pod to the camera, before taking a deep preparatory breath and gingerly placing it in their mouth. The chili being consumed is usually brown, the size of a shrunken peach pit. It looks harmless enough. Yet whether it’s a ghost pepper, a naga viper, a trinidad moruga scorpion or a carolina reaper, it’s almost always one of the world’s hottest chilis. It will have busted the once-mythical limit of 1 million Scoville Heat Units (SHU) – the curious measure developed in 1912 by Wilbur Scoville, an American pharmacist.

Scoville would take a measured amount of dried pepper and make an alcohol extraction of its capsaicin oil (capsaicin gives chili peppers their bite). That extraction would be added incrementally to a sugar-water solution, until an assembled panel of five tasters could no longer detect the pepper’s heat. A measurement of 1.5 SHU is roughly equivalent to one part per million of chili heat. These days, high-performance liquid chromatography adds a heightened level of precision, but the basic principle remains the same. Your standard bell pepper has a Scoville rating of zero; the common jalapeño, 10,000 SHU; a habanero around 300,000 SHU.
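The dilution logic behind Scoville's method can be sketched in a few lines of Python. This is purely an illustration using the essay's own figures (the 1.5 SHU per part-per-million rule of thumb and the ratings quoted above); the function names are my own, not any standard library's:

```python
# By definition, a pepper's Scoville rating corresponds to the dilution
# factor -- parts of sugar-water per part of pepper extract -- at which
# a tasting panel can no longer detect heat. The essay's rule of thumb
# is that 1.5 SHU equals roughly one part per million of chili heat.

def shu_to_ppm(shu):
    """Approximate parts-per-million of chili heat for a Scoville rating."""
    return shu / 1.5

def dilution_at_threshold(shu):
    """Parts of sugar-water per part of extract at the detection threshold."""
    return shu

for name, shu in [("jalapeno", 10_000), ("habanero", 300_000)]:
    print(f"{name}: {shu:,} SHU ~ {shu_to_ppm(shu):,.0f} ppm "
          f"(dilution 1:{dilution_at_threshold(shu):,})")
```

On these numbers, a habanero's heat is still detectable after its extract is diluted in 300,000 parts of sugar-water, which makes the million-plus ratings of the YouTube peppers below easier to appreciate.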

The peppers that the YouTubers love to eat are all roughly 100 times hotter than a jalapeño. The bhut jolokia, or ghost pepper, from northeast India, was officially the first pepper to hit 1 million SHU early this century. Seven additional peppers have since broken the 1 million SHU barrier; the hottest, the carolina reaper, at 2.2 million SHU, is poised this year to be supplanted by HP56, a pepper that registers 3 million SHU. It was developed, as peppers usually are, through standard hybridisation techniques. And what can’t be accomplished through horticulture can be achieved through cookery: hot sauces sold to so-called "pepperheads" can hit 16 million SHU.

While I don’t remember the first time I tasted a chili pepper (I’m fairly certain my mother slipped some in my baby bottle – at least, my taste for it was early acquired), I do remember the first time I couldn’t have it: on a high-school exchange in Nicaragua. Central American cuisine, though delicious, is not especially spicy, and unlike Hillary Clinton I’d failed to pack hot sauce. Toiling through my nacatamales, vigorón and gallo pinto, I longed for a touch of a scotch bonnet, a sprinkling of jalapeño.

On YouTube, the pepperhead – a gangly 12-year-old with Harry Potter glasses, a 20-something woman in a cat T-shirt, or a bro in a backwards baseball cap – begins to chew. There’s a pause as the brain registers the sensation, then an eruption of expletives, tears, occasionally vomiting. The milk, a supposed tempering agent, never works, whether it’s chugged, swirled around like mouthwash, or poured over the body. ‘It tastes like a thousand burning needles in my mouth,’ exclaims the pepperhead, before losing the ability to speak altogether.

Why would anyone do this to themselves?

Chili peppers (Capsicum annuum), which bear no relation to black pepper (Piper nigrum), have been spicing up Mexican food since at least 7500 BCE. India might now be a sizeable exporter of the world’s chili peppers, but these peppers did not exist outside Central East Mexico (an area that includes modern New Mexico) before the "discovery of the Americas." The Columbian Exchange, the post-1492 exchange of plants and animals between the Old World and New for the first time after millions of years of separation, sent chili peppers skittering across the globe in an early culinary fusion. Today’s Eastern varietals are descendants of those early North American peppers.

Why a product with all the bite but none of the power of, say, gunpowder would make it into the daily meals of people living in countries as diverse as Ethiopia, Bangladesh and Fiji has long left psychology researchers with blander palates scratching their heads. Capsaicin, which incidentally is a vanilloid (a group that includes the compounds that give vanilla its sweet aroma), works by binding to a receptor on the cells that detect temperature and, separately, to cells that signal pain. Piperine, the compound that gives black pepper its bite, and allyl isothiocyanate, the chemical that gives mustard and radishes (including wasabi) their kick, work on similar receptors. We register chili as hot because those receptors, VR1, are usually triggered only by putting foods hotter than 110°F (43°C) into our mouths or by coating them with acid.

The cookery writer Julia Child once claimed that eating too many chilis could burn off your taste buds. It can’t. But sometimes eating chili or, worse, rubbing your eyes or your nethers with capsaicin-coated hands, can make you wish it would sear off your pain receptors.

Peppers likely developed capsaicin to keep mammals – with our flat, seed-destroying teeth – away from the fruit. Sweet peppers, it’s thought, hope to dissuade by mimicry: if they look enough like hot peppers, mammals won’t risk eating them. Birds, which generally lack teeth and allow the seeds to pass through their digestive tract intact, can’t detect capsaicin and thus can easily eat chili peppers. This allows for wide dispersal as the birds release the seeds when they defecate.

Mostly this investment in capsaicin seems to have worked. Paul Bosland, director of the Chile Pepper Institute at New Mexico State University which breeds and researches chili peppers, told me that mice will avoid eating chili peppers if they have other options. Paul Rozin, a professor of psychology at the University of Pennsylvania, who has studied dogs in Mexico that are frequently fed the chili-infused leftovers of their human owners, has observed the same thing. I don’t believe, however, that the pepper fully anticipated Homo sapiens. As someone who has eaten rotted fermented shark, durian fruit with its redolent gasoline-like aroma, and the slimy mass of eyes and tentacles that comprise conch, I’m unsure what manner of beast or botanic we humans would be unwilling to eat. And yet, even through this lens, our tolerance of chili peppers is unique.

According to Rozin, disgust is ‘acquired’. We learn to hate decaying foods, ‘odd meats’ or stinky things. It’s why many cultures tolerate spoiled milk in the form of cheese, but not in the clotted form it takes when aged in a milk carton at the back of a fridge. It’s also why many Asian cultures view both cheese and spoiled milk with disgust: they have not crafted a cheese exception. On the other hand, we don’t have to learn that irritating foods – such as chili peppers – should be spit out. Young children below the age of five generally dislike chili peppers. We have to learn to like them.

Why so many of us willingly eat a food that elicits a sensation akin to poking our tongue into a fire is a curious question. As a rule, people in hotter countries, or hotter parts of the same country, consume more chili than those in cooler climes: in the United States, Louisiana’s jambalaya is spicy, while New England’s clam chowder is not. Food can be preserved in cold climates just by leaving it outdoors, in nature’s refrigerator, but hotter climates require more extreme methods. Chili peppers, powered by antimicrobials, kill many of the microbes that spoil food in warmer climates. Over time, people learned that foods laced with chili peppers were less likely to send them running to the toilet. Though, in many people, capsaicin causes ‘a ring of fire’ as it exits the anal sphincter, so chili peppers, in effect, substitute one sort of violent eruption for another.

While chili peppers are antimicrobial, garlic, onion, oregano, cloves and even the demure bay leaf all pack a stronger microbe-killing punch, and also go easier on the tongue. And most of these spices are also found in chili-containing cuisines. Similarly, chilis’ absence in colder climates is easily explained by the fact that they’re warm-weather plants that evolved in desert climates. Throw them into cooler climates, or put a little too much nitrogen in their soil, and you’ll get a very lovely pepper plant – but no fruit. Perhaps most critically, as our YouTube brethren show, chili pepper consumption has not declined with the advent of refrigeration – it’s increased. Since 2000, hot-sauce consumption in the US has risen by 150 per cent.

In my case, something almost magical happens when I bite into a chili-spiced dish. There’s the release of heat, a rising cascade of pain, but with it comes a forced awareness of my body. My nostrils clear, I start to sweat a little, and I develop a laser-like focus on what’s going on inside of my mouth: flavours become more intense. Most significantly, the body’s natural processes trigger internal opiates in the face of pain, which suggests that chili-pepper eaters are essentially drugging themselves.

There’s a macho, mostly male, do-or-die culture around eating the world’s hottest peppers. Adam Richman, the likeable host of the US TV show "Man v Food," routinely undertook "heat challenges," subjecting himself to the spiciest foods. In November 2012, the chef Arif Ali passed out and was hospitalised after eating chicken wings prepared in a sauce containing bhut jolokia chilis for a "Man v Food"-style challenge at a restaurant in London. In England, the Clifton Chilli Club chili-eating contest features 17 rounds in which participants take on increasingly hotter peppers. Similar competitions exist in Nagaland, India (homeland of the bhut jolokia) as well as in Scotland, Thailand and the US. The habitual eating of chilis renders other foods bland, which is why pepperheads need ever hotter varieties to get the same kick, routinely consuming peppers that would leave the rest of us drenched in sweat.

Machismo aside, most people don’t chase the higher kick — we get to like a higher level of heat, but then stay there. Also, we don’t experience withdrawal symptoms. Rozin says that the chili pepper "doesn’t have the usual addictive properties": you might miss its heat, but you can function without it. In this sense, chilis are only as corrupting to the palate as salt or sugar. Most importantly, animals share similar biological symptoms and get addicted to the same drugs we get addicted to. If humans were becoming addicted to chili peppers, "then the animals in Mexico would also like it, because they have that system and they eat hot peppers all the time," says Rozin. ‘But they don’t like it, which makes me think that there’s something more human about it.’

Rozin thinks that our ability to enjoy experiences that logic says we shouldn’t is an example of a hedonic reversal — the word hedonic coming from the Ancient Greek hēdonikos, meaning pleasure. But the term I prefer is benign masochism. It’s a definition that Rozin extends to a number of human pleasures: for example, skydiving and rollercoaster riding. If you enjoy the bitterness of tobacco, coffee or dark chocolate, that’s benign masochism. If your idea of a swell time is a good cry over a sad movie, or a good scream at a horror flick, that too is benign masochism. It embraces a level of meta-awareness: you feel the pain, but with the distance of knowing it’s not truly harmful. It is a form of enjoying mind over body.

As far as we know, hedonic reversal is a distinctly human form of enjoyment, related in part to the fact that feelings of aversion and pleasure overlap closely in our brains, releasing similar, upbeat, chemical payloads. People who enjoy chili peppers also tend to be more sensation-seeking – though not as much as people who are into fear.

Why we enjoy watching people eat chilis is another matter. Rozin again: ‘When you watch a horror movie, you’re really scared. It’s not quite the same in this case. But you’d have to feel, get some signal yourself, that there’s something negative.’ Perhaps it’s a case of ‘projected benign masochism’ or ‘benign sadism’. Either way, given danger without harm, gained for a handful of change in the comfort of your own kitchen, my response to the question of why anyone would eat a chili pepper is: why would anyone not? Ω

[Kendra Pierre-Louis is a journalist and photographer whose writing includes Green Washed: Why We Can't Buy Our Way to a Green Planet (2012). She also has written articles for The Washington Post, Newsweek, In These Times, Modern Farmer, and Slate. Pierre-Louis received a BA (economics) from Cornell University as well as an MA (sustainable development) from the School for International Training (SIT).]

Copyright © 2016 Aeon Media Group

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 Sapper's (Fair & Balanced) Rants & Raves

Monday, September 26, 2016

Here's A Snapshot Of Today's Political Discourse Courtesy Of Tom Tomorrow (Dan Perkins)

In today's 'toon, Tom Tomorrow (Dan Perkins) creates an imaginary colloquy between Sparky the Wonder Penguin (wearing Inuit-style goggles to ward off bull$hit blindness) and a prototypical Trumpster. Sparky speaks solo to the stolid Trumpster for the first 4 panels of the 'toon and provides an exhaustive inventory of the fatal flaws in the Stupid (GOP) candidate for POTUS 45. In the next-to-last panel, Sparky awaits the Trumpster's response, and finally, in the sixth panel, the Trumpster comes out of his self-induced coma and offers a lame apology for not listening to Sparky's list of the multitude of disqualifying traits in the Stupid (GOP) candidate because the Trumpster was lost in a reverie of the coming renaissance of US greatness. And Sparky replies by damning the Trumpster with faint praise. If this is a (fair & balanced) portrayal of US politics in September 2016, so be it.

[x This Modern World]
The Unpersuadable
By Tom Tomorrow (Dan Perkins)

Tom Tomorrow/Dan Perkins

[Dan Perkins is an editorial cartoonist better known by the pen name "Tom Tomorrow". His weekly comic strip, "This Modern World," which comments on current events from a strong liberal perspective, appears regularly in approximately 150 papers across the U.S., as well as on Daily Kos. The strip debuted in 1990 in SF Weekly. Perkins, a longtime resident of Brooklyn, New York, currently lives in Connecticut. He received the Robert F. Kennedy Award for Excellence in Journalism in both 1998 and 2002. When he is not working on projects related to his comic strip, Perkins writes a daily political blog, also entitled "This Modern World," which he began in December 2001. More recently, Dan Perkins, pen name Tom Tomorrow, was named the winner of the 2013 Herblock Prize for editorial cartooning. Even more recently, Dan Perkins was a runner-up for the 2015 Pulitzer Prize for Editorial Cartooning.]

Copyright © 2016 Tom Tomorrow (Dan Perkins)

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 Sapper's (Fair & Balanced) Rants & Raves