Tuesday, May 23, 2017

Forget Rubbing Your Thumb And Forefinger Together To Keep The Elephants Away — Everyone Knows That A Daily Blog Entry Is the ONLY Effective Elephant-Repellent On The Market

Obviously, Eags (Timothy Egan), who wears a 46,XY on his genetic jersey, breaks this blog's string of distaff contributors. However, Eags devotes his column to an amazing woman who recently celebrated her 100th birthday at a dinner with friends (and Eags). So, in a way, the week of celebration of carriers of the 46,XX karyotype is not disrupted. If this is a (fair & balanced) tribute to not merely enduring, but prevailing, so be it.

[x NY Fishwrap]
On Turning 100
By Eags (Timothy Egan)


[TagCrowd cloud of the following piece of writing, created at TagCrowd.com]

I went to a birthday party the other day for someone who has lived through the flu pandemic of 1918, World War I, the Great Depression, World War II, the nuclear jitters of the Cold War and the disastrous first four months of Donald Trump.

Alma Balter, at age 100, takes in that century of life with a shrug and looks around the table at who’s not going to finish their dessert. She is no finicky eater. When we go out to dinner, everyone else orders a light pasta or just a salad.

“I’ll have the porterhouse,” she says. That is, if the ribs aren’t available. I’ve seen a hefty slab, smothered in barbecue sauce, disappear at her end as if she were hosting a conqueror’s feast in “Game of Thrones.” The waiter usually pauses. The smaller portion, ma’am?

“No, the 12-ouncer is fine.”

What is she going to do later — farm labor, lifting 50-pound hay bales or moving granite stones? She burns most of her calories, it turns out, with a bridge game.

Alma is one of more than 70,000 people in the United States who are alive today having made it to triple digits — a growing demographic. She shares an apartment complex, and many a meal, with Holocaust survivors, widows of Nazi-killing war heroes and people who knew Jackie Robinson when he played four sports at UCLA, a few blocks from her home in Westwood [CA]. At present, her health is fine, as is the aforementioned appetite.

When you go to a 100th birthday party — my first — people always want to know the secret to long life. Last month, Emma Morano died at the age of 117 at her home in Italy. She was, for a time, the world’s oldest human. Her secret was not something cardiologists would recommend: She ate three eggs a day, two of them raw, and was a regular consumer of hazelnut cookies chased by home-spiked grappa. The drink, for those who’ve never been in a road emergency on the Italian autostrada, might work as fuel for your Fiat in a pinch.

Given that you can break the rules of health-obsessives and still live longer than friends who eat like squirrels, I’m more interested in the time that these folks have passed, rather than how they got there.

Alma was born during Woodrow Wilson’s presidency, the same year and month that John F. Kennedy came into the world. One in 10 American babies would not live to see their first birthday. Life expectancy at birth was 54. The United States had just entered the First World War, that senseless clash that took the lives of about 17 million people and upended centuries-old empires.

The first miracle of her life was surviving to the age of 2 — during the flu pandemic. It killed more people than the war, 20 to 40 million worldwide, probably more than died during the bubonic plague of the 14th century. By the time the flu subsided, at the dawn of the Roaring Twenties, women could not legally drink alcohol, along with everyone else. But finally they could vote.

During the Great Depression, when one in four American adults were out of work and many homes still did not have indoor plumbing, the birthrate plummeted, as did hope. And then came World War II, which killed upward of 60 million people — about 3 percent of the global population.

To come of age in the middle part of what may have been the bloodiest century in history requires you to remember that one should never let yesterday use up too much of today.

It would get scarier still. Since the 1950s, nuclear war — the lights-out, end-it-all global annihilation — has hung over humanity. But then we got smartphones, all the world’s knowledge in the palm of our hands. And for the first time, really old age was not uncommon.

Trump presents a special problem for those in the ultra-golden years. They can remember a lot of presidents who are now turning over in their graves. He is singularly, historically, epically awful — scary in his willful incompetence. And yet, given what a centenarian has seen, you know that he, too, will pass, and we will survive.

“I know how lucky I am,” Roger Angell wrote in a 2014 essay in his longtime literary home, The New Yorker. “Decline and disaster impend, but my thoughts don’t linger there.” He was 93 and lucid enough that he could not forget Keats, Dick Cheney or what was waiting for him at the dry cleaner.

On her birthday, Alma noted that one of her relatives lived to be 106 — implying that many more helpings of thick beef slabs, and whiskey sours at happy hour, were ahead. She told a joke about a man who made a scratchy sound by rubbing two fingers together, as a way to keep the elephants away. No, he was told — you’re crazy! Why keep doing it?

“It’s worked so far.” # # #

[Timothy Egan writes "Outposts," a column at the NY Fishwrap online. Egan — winner of both a Pulitzer Prize in 2001 as a member of a team of reporters who wrote the series "How Race Is Lived in America" and a National Book Award (The Worst Hard Time in 2006) — graduated from the University of Washington with a degree in journalism, and was awarded an honorary doctorate of humane letters by Whitman College in 2000 for his environmental writings. Egan's most recent book is The Big Burn: Teddy Roosevelt and the Fire that Saved America (2009).]

Copyright © 2017 The New York Times Company



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2017 Sapper's (Fair & Balanced) Rants & Raves

Monday, May 22, 2017

Watergate & Russiagate Illustrate The Truism That History Doesn't Repeat Itself, But It Does Provide A Monotonous Background Hum

Tom Tomorrow (Dan Perkins) betrayed a smidgen of crap-fatigue in the e-mail that accompanied today's 'toon, which focuses on the parallels between Watergate (1970s monochromatic panels) and Russiagate (2017 full-color panels). In that e-mail, Tom (Dan) wrote:

Some weeks, I just don't have much more to add. More next time, hopefully!

Dan (aka Tom)

Like Tom (Dan), this blogger is rotten sick of the entire melodrama playing out in DC. Spirits were not lifted by the so-called presidential address delivered yesterday to the assembled heads of state in the Middle East. The orator, reading off the teleprompter screen, sounded like a 4th grader speaking to his class. Yawn. And there you have it, dear reader. If this is (fair & balanced) political ennui, so be it.

[x TMW]
Then And Now
By Tom Tomorrow (Dan Perkins)

Tom Tomorrow/Dan Perkins

[Dan Perkins is an editorial cartoonist better known by the pen name "Tom Tomorrow." His weekly comic strip, "This Modern World," which comments on current events from a strong liberal perspective, appears regularly in approximately 150 papers across the U.S., as well as on Daily Kos. The strip debuted in 1990 in SF Weekly. Perkins, a long time resident of Brooklyn, New York, currently lives in Connecticut. He received the Robert F. Kennedy Award for Excellence in Journalism in both 1998 and 2002. When he is not working on projects related to his comic strip, Perkins writes a daily political blog, also entitled "This Modern World," which he began in December 2001. More recently, Dan Perkins, pen name Tom Tomorrow, was named the winner of the 2013 Herblock Prize for editorial cartooning. Even more recently, Dan Perkins was a runner-up for the 2015 Pulitzer Prize for Editorial Cartooning.]


Copyright © 2017 This Modern World/Tom Tomorrow (Dan Perkins)



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2017 Sapper's (Fair & Balanced) Rants & Raves

Sunday, May 21, 2017

No, This "Lights-Out" Essay In the MIddle East Has Nothing To Do With Current Speeches & Photo-Ops In The Region

Running in the background as this post is being prepared is the live broadcast of the Idiot-in-Chief speaking to an assemblage of Middle East heads of state in Riyadh, Saudi Arabia. Amazingly, the power has not failed during the speech (thus far), but the inane remarks — read with 4th-grade eloquence from the teleprompter screen — were delivered at incredibly low wattage (blah, blah, blah). However, that aside, the march of women in this blog continues with a report about a little-noticed crisis throughout the Middle East. If this is a (fair & balanced) shock, so be it.

[x New Yorker]
The Lights Are Going Out In The Middle East
By Robin Wright


[TagCrowd cloud of the following piece of writing, created at TagCrowd.com]

Six months ago, I was in the National Museum in Beirut, marvelling at two Phoenician sarcophagi among the treasures from ancient Middle Eastern civilizations, when the lights suddenly went out. A few days later, I was in the Bekaa Valley, whose towns hadn’t had power for half the day, as on many days. More recently, I was in oil-rich Iraq, where electricity was intermittent, at best. “One day we’ll have twelve hours. The next day no power at all,” Aras Maman, a journalist, told me, after the power went off in the restaurant where we were waiting for lunch. In Egypt, the government has appealed to the public to cut back on the use of light bulbs and appliances and to turn off air-conditioning even in sweltering heat to prevent wider outages. Parts of Libya, which has the largest oil reserves in Africa, have gone weeks without power this year. In the Gaza Strip, two million Palestinians get only two to four hours of electricity a day, after yet another cutback in April.

The world’s most volatile region faces a challenge that doesn’t involve guns, militias, warlords, or bloodshed, yet is also destroying societies. The Middle East, though energy-rich, no longer has enough electricity. From Beirut to Baghdad, tens of millions of people now suffer daily outages, with a crippling impact on businesses, schools, health care, and other basic services, including running water and sewerage. Little works without electricity.

“The social, economic and political consequences of this impending energy crisis should not be underestimated,” the UN special coördinator for the Middle East peace process, Nickolay Mladenov, warned last month, about the Gaza crisis. The same applies across the region.

Public fury over rampant outages has sparked protests. In January, in one of the largest demonstrations since Hamas took control in Gaza a decade ago, ten thousand Palestinians, angered by the lack of power during a frigid winter, hurled stones and set tires ablaze outside the electricity company. Iraq has the world’s fifth-largest oil reserves, but, during the past two years, repeated anti-government demonstrations have erupted over blackouts that are rarely announced in advance and are of indefinite duration. It’s one issue that unites fractious Sunnis in the west, Shiites in the arid south, and Kurds in the mountainous north. In the midst of Yemen’s complex war, hundreds dared to take to the streets of Aden in February to protest prolonged outages. In Syria, supporters of President Bashar al-Assad in Latakia, the dynasty’s main stronghold, who had remained loyal for six years of civil war, drew the line over electricity. They staged a protest in January over a cutback to only one hour of power a day.

Over the past eight months, I’ve been struck by people talking less about the prospects of peace, the dangers of ISIS, or President Trump’s intentions in the Middle East than about their own exhaustion from the trials of daily life. Families recounted groggily getting up in the middle of the night when power abruptly comes on in order to do laundry, carry out business transactions on computers, charge phones, or just bathe and flush toilets, until electricity, just as unpredictably, goes off again. Some families have stopped taking elevators; their terrified children have been stuck too often between floors. Students complained of freezing classrooms in winter, trying to study or write papers without computers, and reading at night by candlelight. The challenges will soon increase as the demands for power—and air-conditioning—surge and summer temperatures reach a hundred and twenty-five degrees.

The reasons for these outages vary. With the exception of the Gulf states, infrastructure is old or inadequate in many of the twenty-three Arab countries. The region’s disparate wars, past and present, have damaged or destroyed electrical grids. Some governments, even in Iraq, can’t afford the cost of fuelling plants around the clock. Epic corruption has compounded physical challenges. Politicians have delayed or prevented solutions if their cronies don’t get contracts to fuel, maintain, or build power plants.

The movement of refugees has further strained equipment. Lebanon, Jordan, Iraq, and Egypt, already struggling, have each taken in hundreds of thousands of Syrian refugees since 2011. The frazzled governor of Erbil, Nawzad Hadi Mawlood, told me that Iraq’s northern Kurdistan—home to four million Kurds—has taken in almost two million displaced Iraqis who fled the Islamic State since 2014, as well as more than a hundred thousand refugees fleeing the war in neighboring Syria since 2011. Kurdistan no longer has the facilities, fuel, or funds to provide power. It averages between nine and ten hours a day, a senior technician in Kurdistan’s power company told me, although it’s worse in other parts of Iraq.

I called on the technician at his home because electricity is now such a political flashpoint that he didn’t want to be seen hosting a journalist at work. Dusk was settling in so the living room was poorly lit. His house had no electricity, either. The only thing that worked was a wall clock. “It’s battery-operated,” he told me. I asked if he knew when electricity would return. He shrugged. “When you see the lights go on,” he replied. “I only work for the power company. They don’t tell us the hours, either.”

In Erbil, as in cities across the Middle East and North Africa, the only alternatives are noisy and polluting generators that cost three to ten times state rates. “I have no generator,” the technician noted.

In Lebanon, Moustafa Baalbaki, a young software engineer, tried to help people cope with outages by developing the cell-phone app Beirut Electricity, which does what the government doesn’t: it forecasts power cuts in the capital—and sends alerts ten minutes before the power goes out.

“I spent days and nights vainly wishing for the government to fix this problem,” Baalbaki told me. He originally figured out the complicated algorithm to plan his university classes and figure out when his ailing grandmother could visit so she could take the elevator to the family’s ninth-floor apartment. It worked so well that he offered it free to other Lebanese through Apple. It had almost ten thousand downloads the first day.
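
[Blogger's note: Baalbaki's actual algorithm isn't spelled out in the essay, but the core idea (a known rotation schedule per district, plus an alert sent ten minutes before the next scheduled cut) can be sketched in a few lines of Python. Everything below, from the district names to the schedule hours and function names, is hypothetical illustration rather than the real Beirut Electricity code.]

from datetime import datetime, timedelta

# Hypothetical rotation schedule: (start_hour, end_hour) windows during which
# the state utility is assumed to cut power in each district.  The real
# districts, hours, and data sources are not described in the essay.
OUTAGE_SCHEDULE = {
    "District A": [(0, 3), (12, 15)],
    "District B": [(3, 6), (15, 18)],
    "District C": [(6, 9), (18, 21)],
}

ALERT_LEAD = timedelta(minutes=10)  # the essay's "ten minutes before the power goes out"


def next_outage(district, now):
    """Return the start time of the next scheduled cut for a district."""
    for days_ahead in (0, 1):  # check today's windows, then tomorrow's
        midnight = (now + timedelta(days=days_ahead)).replace(
            hour=0, minute=0, second=0, microsecond=0)
        for start_hour, _end_hour in OUTAGE_SCHEDULE[district]:
            start = midnight + timedelta(hours=start_hour)
            if start > now:
                return start
    raise ValueError("no scheduled outage found for " + district)


def should_alert(district, now):
    """True when the next scheduled cut begins within the alert window."""
    return timedelta(0) <= next_outage(district, now) - now <= ALERT_LEAD


if __name__ == "__main__":
    now = datetime.now()
    for district in OUTAGE_SCHEDULE:
        cut = next_outage(district, now)
        note = "send alert now" if should_alert(district, now) else "no alert yet"
        print(f"{district}: next scheduled cut at {cut:%a %H:%M} ({note})")

A real version would also have to cope with unannounced cuts and schedule changes, which is exactly the unpredictability the essay describes.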

“The government doesn’t like me much,” Baalbaki told me. “I’m really not in competition. I’m just trying to find solutions that make life tolerable in these conditions. It’s terrible that we live like this in 2017. I’d be happy to kill the app—if we’d just get twenty-four hours of electricity.” # # #

[Robin Wright is a contributing writer for The New Yorker (online) and has written for the magazine since 1988. Her first piece on Iran won the National Magazine Award for best reporting. A former correspondent for the Washington Post, CBS News, the Los Angeles Times, and the Sunday Times of London, she has reported from more than a hundred and forty countries. She is currently a joint fellow at the U.S. Institute of Peace and the Woodrow Wilson International Center for Scholars. She has also been a fellow at the Brookings Institution and the Carnegie Endowment for International Peace, as well as at Yale, Duke, Dartmouth, and the University of California, Santa Barbara. Wright's most recent book, Rock the Casbah: Rage and Rebellion Across the Islamic World (2011, 2012), was selected as the best book on international affairs by the Overseas Press Club. See her other books here. Wright received both a BA and an MA (history) from the University of Michigan at Ann Arbor; she was the first woman appointed as the sports editor of The Michigan Daily as well.]

Copyright © 2017 The New Yorker/Condé Nast Digital



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2017 Sapper's (Fair & Balanced) Rants & Raves

Saturday, May 20, 2017

Witch Way To The Truth?

For those wanting more about the temporary madness in Salem, MA in 1692, here is a link to an essay that Stacy Schiff derived from her latest book, The Witches: Salem, 1692 (2015).

To help with the reading, try the best pop song about the dark arts:


[x YouTube/RebelSongbird Channel]
"Witchcraft" (1957)
By Frank Sinatra

Thanks to Schiff, it's obvious that the real witch hunt is the frantic search for insiders "leaking" embarrassing tidbits to reporters. If this is the (fair & balanced) consideration of public madness, so be it.

[x New Yorker]
The Single Greatest Witch Hunt In American History, For Real
By Stacy Schiff


[TagCrowd cloud of the following piece of writing, created at TagCrowd.com]

It didn’t take long for our President to declare the appointment of a special counsel for the Russia inquiry “the single greatest witch hunt of a politician in American history.” Historical literacy has never been for everyone. Even the ancients complained of ignorance about the past and inaccuracies on the page. The greatest witch hunt in American history, of course, occurred in 1692, not 2017. It’s worth revisiting, as it happens to offer a few lessons about name-calling, special prosecutors, and abuses of power. Strictly speaking, the Salem witch trials were less a hunt than a free-for-all. Beginning with three more or less usual suspects, they ended in a colony-wide epidemic. Fingers pointed in every direction as friends and families accused one another. By some counts as many as seven hundred witches flew about Massachusetts. A special court prosecuted the cases according to the law of the land. Nineteen innocent men and women hanged. Over several days, a twentieth would be crushed under stones, for contempt of court.

Behind those witchcraft prosecutions—not Massachusetts’s first, but forever its most infamous—stood the colony’s best-educated men. The political élite had reason to embrace the trials. Together they had recently sent a royal governor packing, in a political coup; they had a fledgling administration to support. At its head sat a barely literate man, rude and reckless, a rascally treasure hunter installed by a beleaguered group of purists eager to safeguard their privileges and padlock their ranks. A weak, absent administrator, he had little interest in governing. He far preferred glorious deeds involving sunken treasures and Indian scalps. He was without political experience; he threw tantrums; he bullied and insulted elected officials. His supporters worried about legitimacy and strained to broadcast proficiency. Having earlier incited a mob to overturn the government, they needed to prove their law-and-order credentials. Political concerns outweighed all else. Close-knit and inbred, those men constituted as much a “real family” as a fraternity. Their business interests coincided. They moved in lock step.

Why was there no twenty-first victim of the Salem witch trials? The initial attempts to object to the proceedings proved dangerous. The skeptic was a marked man; he could count on being rewarded with a witchcraft accusation. Early on, a Baptist minister warned that the court stood in danger of convicting innocents. He was offered a choice between a jail sentence and a crushing fine. He would not be heard from again.

Only after eight frenzied months did sane men finally speak up. Establishment figures, they broke ranks with reluctance. Thomas Brattle, a thirty-four-year-old, Harvard-educated merchant, and among the wealthiest men in the colony, prefaced his remarks with a near apology: he would prefer to bite off his fingertips than cast aspersions on authority. Men were not infallible, however. And when they erred it was essential to take a stand. Sometimes silence was unconscionable. Brattle could no longer bear the government’s “ignorance and folly”; he balked at the proceedings, remarkable for irregularities of all kinds. Were they to continue, he warned, they would spell the colony’s ruin. In one of the most eloquent have-you-no-decency documents in history, Brattle asked how anyone involved in the trials would be able to “look back upon these things without the greatest of sorrow and grief imaginable.” He anticipated a stain on New England, one that ages would not remove.

Diplomatic though he was, Brattle also registered his dissent anonymously, in a letter that circulated privately, probably later than we would like to believe. The original is nowhere to be found. Integrity wins no popularity contests; at first blush it bears a resemblance to disloyalty. It is not easy to comment on the emperor’s wardrobe. It is infinitely easier to sully the reputations of others, to divert attention with a delusional narrative and to trample accountability. President Trump, in more than one Oval Office tweet, has suggested that any wrongdoing lies with those who give information to reporters, and he has urged his government to find the “leakers.” That sounds curiously like witch-hunting to me. # # #

[Stacy M. Schiff is a Pulitzer Prize-winning nonfiction author and columnist for both The New Yorker and The New York Times. She won the 2000 Pulitzer Prize for Biography or Autobiography for Vera, a biography of Vera Nabokov, wife and muse of Vladimir Nabokov. Her biography Cleopatra: A Life was published in 2010, and her most recent book is The Witches: Salem, 1692 (2015). Schiff received a BA (history) from Williams College as well as a DLit, honoris causa, from her alma mater.]

Copyright © 2017 The New Yorker/Condé Nast Digital



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2017 Sapper's (Fair & Balanced) Rants & Raves

Friday, May 19, 2017

Impeachment? Did Someone Whisper "Impeachment"?

So many words have been uttered/whispered in the unsettled week just passed. The I-word has been prominent in the musings of the aggrieved: Impeachment. The basic source is Article II, Section 4 of the Constitution of the United States of America. In our history, only two POTUSes have been impeached: Andrew Johnson (1868) and William Clinton (1998) were impeached by the House of Representatives, acting as a grand jury, and both were acquitted by the Senate, acting as a trial jury. A third POTUS, The Trickster (1974), resigned before the process could be implemented. Today — carrying the distaff banner — The Jillster offers a brief explanation of the inclusion of impeachment in the US Constitution. The basis of her explication comes from two unofficial sources: James Madison's Notes of Debates in the Federal Convention of 1787 (published in 1840), supplemented by the notes of Robert Yates (published in 1821). One of the first actions of the Constitutional Convention was to proceed without an official record (minutes) of its discussions, which were held in camera. If this is (fair & balanced) true originalism, so be it.

[x New Yorker]
How Impeachment Ended Up In The Constitution
By The Jillster (Jill Lepore)


[TagCrowd cloud of the following piece of writing, created at TagCrowd.com]

On the morning of Friday, July 20, 1787, delegates to the Constitutional Convention, in Philadelphia, addressed the question of whether or not a President could be impeached while in office. A king might be beheaded, a Prime Minister toppled. What fate could befall a terrible President? Charles Pinckney, of South Carolina, and Gouverneur Morris, of Pennsylvania, moved to strike out a proposed phrase stipulating that the President could be removed “on impeachment and conviction for malpractice or neglect of duty.” Morris thought that if a President committed crimes, he wouldn’t be reëlected, and that would be that, since no other solution accorded with the separation of powers. “Who,” he wondered, “shall impeach?” The irascible George Mason, of Virginia, found this argument absurd. “Shall any man be above justice?” Mason asked. “Above all, shall that man be above it who can commit the most extensive injustice?” It was as good a question then as it is now.

The delegates had been debating the manner of electing a President, wrangling over the weighty matters of the Electoral College and proportionate representation. It was a hot summer. They were tired and ornery, and a good many of them had grown impatient with compromise. “One objection against electors was the danger of their being corrupted by the candidates, and this furnished a peculiar reason in favor of impeachments whilst in office,” Mason pointed out. If the Electoral College were to stand, Mason thought, impeachment ought not be a matter open to compromise: it was an essential remedy. He wanted to know, “Shall the man who has practised [original spelling] corruption, and by that means procured his appointment in the first instance, be suffered to escape punishment by repeating his guilt?”

Benjamin Franklin, the oldest delegate, at eighty-one, ventured his wisdom. He favored impeachment, too, but cited fairness. No man ought to be convicted by hearsay or denied a fair trial. Why not “provide in the Constitution for the regular punishment of the executive, where his misconduct should deserve it, and for his honorable acquittal, where he should be unjustly accused”?

In that case, Morris suggested, the Constitution ought to enumerate a list of impeachable offenses and define them, one by one. Right off the top of his head, James Madison could think of a lot of good reasons to impeach a President. He ticked off this list: “He might lose his capacity after his appointment. He might pervert his administration into a scheme of peculation or oppression. He might betray his trust to foreign powers.” (To peculate is to embezzle.) It’s a very good list. Members of Congress might want to consult it.

Nevertheless, a few delegates pressed on with objections to the idea of impeaching a sitting President. Rufus King, of Massachusetts, along with Pinckney, worried that the independence of the executive branch would be lost if the threat of impeachment were wielded by the legislative branch and held over the President “like a rod.” But King’s fellow Massachusetts delegate, Elbridge Gerry, disagreed, arguing that no decent President had anything to fear from members of Congress who represented the interests of the people: “A good magistrate will not fear them,” Gerry said. “A bad one ought to be kept in fear of them.” He hoped people might remember that as a maxim.

Pinckney and King tried to pass a motion to table the discussion, but that failed, and the matter went to a vote on the question, “Shall the executive be removable on impeachments?” Every state delegation except those of South Carolina and Massachusetts voted “aye.” Even Gouverneur Morris had come around and changed his mind. After all, he said, in this new government, “The people are the king.” But the President is only a man, as true and as honorable as the best of us, or as false and dishonorable as the worst. # # #

[Jill Lepore is the David Woods Kemper '41 Professor of American History at Harvard University as well as the chair of the History and Literature Program. She also is a staff writer at The New Yorker. Her latest books are The Story of America: Essays on Origins (2012), Book of Ages: The Life and Opinions of Jane Franklin (2013), and The Secret History of Wonder Woman (2014). Lepore earned a BA (English) from Tufts University, an MA (American culture) from the University of Michigan, and a PhD (American studies) from Yale University.]

Copyright © 2017 The New Yorker/Condé Nast Digital



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2017 Sapper's (Fair & Balanced) Rants & Raves

Thursday, May 18, 2017

Elementary, Isn't It?

Let's see — we have investigated the corrosive effects of anger and pathological incompetence thus far in ye olde blog. Today, neuroscience writer Maria Konnikova takes a look at empathy ("the ability to walk in another person's shoes") as an admirable trait. And she finds a surprising source of empathetic behavior: Sherlock Holmes. The fictional detective of stories and novels set in late-19th- and early-20th-century England, Holmes was notable for acute observation, forensic science, and logical reasoning in solving an array of crimes. Konnikova holds that Holmes was a cognitive empath. If this is (fair & balanced) neuroscientific imagining, so be it.

[x Aeon]
The Empathy Machine
By Maria Konnikova


[TagCrowd cloud of the following piece of writing, created at TagCrowd.com]

What’s the first thing you think of when you hear the name Sherlock Holmes? It might be a deerstalker, a pipe or a violin, or shady crimes in the foggy streets of London. Chances are, it’s not his big, warm heart and his generous nature. In fact, you might think of him as a cold fish — the type of man who tells his best friend, who is busy falling in love, that it ‘is an emotional thing, and whatever is emotional is opposed to that true cold reason which I place above all things’. Perhaps you might be influenced by recent adaptations that have gone so far as to call Holmes a ‘sociopath’.

Not the empathetic sort, surely? Or is he?

Let’s dwell for a moment on Silver Blaze (1892), Arthur Conan Doyle’s story of the gallant racehorse who disappeared, and his trainer who was found dead, just days before a big race. The hapless police are stumped, and Sherlock Holmes is called in to save the day. And save the day he does – by putting himself in the position of both the dead trainer and the missing horse. Holmes speculates that the horse is ‘a very gregarious creature’. Surmising that, in the absence of its trainer, it would have been drawn to the nearest town, he finds horse tracks, and tells Watson which mental faculty led him there. ‘See the value of imagination.... We imagined what might have happened, acted upon that supposition, and find ourselves justified.’

Holmes takes an imaginative leap, not only into another human mind, but into the mind of an animal. This perspective-taking, being able to see the world from the point of view of another, is one of the central elements of empathy, and Holmes raises it to the status of an art.

Usually, when we think of empathy, it evokes feelings of warmth and comfort, of being intrinsically an emotional phenomenon. But perhaps our very idea of empathy is flawed. The worth of empathy might lie as much in the ‘value of imagination’ that Holmes employs as it does in the mere feeling of vicarious emotion. Perhaps that cold rationalist Sherlock Holmes can help us reconsider our preconceptions about what empathy is and what it does.

Though the scientific literature on empathy is complex, a recent review in Nature Neuroscience by a team of researchers from Harvard and Columbia including Jamil Zaki and Kevin Ochsner has distilled the phenomenon into three central stages. The first stage is ‘experience sharing’, or feeling someone else’s emotions as if they were your own – scared when they are scared, happy when they are happy, and so on. The second stage is ‘mentalising’, or consciously considering those states and their sources, and trying to work through understanding them. The final stage is ‘prosocial concern’, or being motivated to act – wanting, for example, to reach out to someone in pain. However, you don’t need all three to experience empathy. Instead, you can view these as three points on an empathetic continuum: first, you feel; then, you feel and you understand; and finally, you feel, understand, and are compelled to act on your understanding. It seems that the defining thing here is the feeling that accompanies all those stages.

‘Sympathy’ is an idea with a deep history — in ancient Greek, sympatheia means, literally, ‘with suffering’ — but ‘empathy’ is a newcomer to popular use. The word was coined by the British cognitive psychologist Edward Titchener as late as 1909: ‘Not only do I see gravity and modesty and pride and courtesy and stateliness,’ he wrote, ‘but I feel or act them in the mind’s muscle. This is, I suppose, a simple case of empathy, if we may coin that term as a rendering of Einfühlung.’ To Titchener, empathy was a kind of ‘feeling into’ someone else’s emotional state.

Soon, the word was being used by therapeutically minded psychologists such as Carl Rogers, the American psychologist and a founder of the humanist approach, who wrote in his book Client-Centred Therapy (1951) that therapists needed to ‘live the attitudes of the other’. But while the term grew quickly in currency — the psychoanalyst Stanley Olinick termed it a ‘buzz word’ in 1984 — it remained for a long time relatively amorphous and fluid in definition and intention.

In 1986, the psychologist Lauren Wispé tried to pin down the notion of empathy in a systematic way. ‘Of course,’ she wrote, ‘the important questions are why individuals are moved to sympathy or empathy, under what conditions, and for whom.’ Despite her commitment to a fresh, objective look at the concept, she defined empathy from the start as being based on a feeling, a compulsion: we are moved, under the right conditions and with the right people at hand. The possibility that we might not be moved, that we might instead choose to think and act in the interests of another without the attendant emotional push isn’t considered.

But is this necessarily correct? The basis of empathy is being able to see things from someone else’s point of view. Empathy lets us ‘walk a mile in another man’s shoes’, look at the world through the eyes of another, or any number of other now-clichéd phrases. But while that perspective-taking seems intimately tied to the emotion of the thing — you walk in someone’s shoes to feel their pain, look through their eyes to understand their feelings — it need not be. As recent research suggests, there are times when becoming too emotionally involved actually stifles our empathetic capacity.

What would it look like if we were to imagine a personality that was deeply empathetic — and yet wholly unemotional? This person would be, I think, just like that emotionless paragon we invoked earlier: Sherlock Holmes, the world’s greatest fictional detective. Holmes is cold and logical. Holmes is detached. As he explains when Watson remarks on the attractiveness and saintliness of a certain young lady, ‘It is of the first importance not to allow your judgment to be biased by personal qualities.’ He explains the importance of leaving his own feelings out of his calculations: ‘A client is to me a mere unit, a factor in a problem. The emotional qualities are antagonistic to clear reasoning. I assure you that the most winning woman I ever knew was hanged for poisoning three children for their insurance-money, and the most repellent man of my acquaintance is a philanthropist who has spent nearly a quarter of a million upon the London poor.’

Holmes, it seems, is a mere problem-solving machine, hardly human at all. But he is also a man of inordinate creativity of thought. He refuses to stop at facts as they appear to be. He plays out many possibilities, maps out various routes, lays out myriad alternative realities in order to light upon the correct one. His is the opposite of hard, linear, A-to-B reasoning. If he were to stick to such an approach, he would be no better than an Inspector Lestrade or a detective Gregson — those Scotland Yard dullards who approach crime in a linear fashion, without his sparkle and imagination.

In fact, his success stems from the very non-linearity and imaginative nature of his thinking, his ability to engage the hypothetical just as he might the physical here-and-now. Think of The Valley of Fear (1915), Conan Doyle’s final Sherlock Holmes novel, in which Inspector MacDonald, or Mac, as Holmes affectionately calls him, bungles along with the obvious leads — looking for a missing bicyclist, following up with hotels and stations, and generally doing everything an energetic detective might be expected to. Holmes instead asks to spend a night in the room of the crime. Why? Musing in the atmosphere where the crime was committed helps him to see the world as the criminal did, to think as he might have done. Imagination is central to his reasoning powers.

So Holmes is an expert at the very thing that makes empathy possible in the first place — seeing the world from another’s point of view. He is entirely capable of understanding someone else’s internal state, mentalising and considering that state, and exhibiting prosocial concern. Indeed, he is a master of it. At the end of The Adventure of the Noble Bachelor (1892), it is Holmes, not Watson, who best understands the motivations of the bachelor in question. Watson remarks wryly that ‘his conduct was certainly not gracious’. And Holmes replies with a smile: ‘Ah, Watson, perhaps you would not be very gracious either, if, after all the trouble of wooing and wedding, you found yourself deprived in an instant of wife and of fortune. I think that we may judge Lord St Simon very mercifully and thank our stars that we are never likely to find ourselves in the same position.’

No doubt Holmes would argue that his lack of emotion gives him a certain freedom from prejudice, as much as a lack of warmth. And recent research bears this out. Most of us start from a place of deep-rooted egocentricity: we take things as we see them, and then try to expand our perspectives to encompass those of others. But we are not very good at it. The notion, known as egocentric anchoring and adjustment, has been studied extensively by the psychologists Nicholas Epley of the University of Chicago and Thomas Gilovich of Cornell University. Even when we know that someone’s background is different from our own, and that we should be wary of assuming we can understand their situation as though it were our own, we still can’t shake off our own preconceptions in judging them. The more cognitively strained we are (the more we have going on mentally), the worse we become at adjusting our egocentric views to fit someone else’s picture of the world. Gilovich describes this as ‘satisficing’. We do a little work to adjust our perspectives to another’s point of view, but not much. We are ‘satisfied’ with something that merely ‘suffices’. Our neural networks might be mirroring another’s suffering, but largely because we worry how it would feel for us.

Not so Holmes. Because he has worked hard to dampen his initial emotional reactions to people, he becomes more complete in his adjustment, more able to imagine reality from an alternative perspective. Ironically, he ends up as a less egocentric and more accurate reflection of what someone else is thinking or experiencing at any given point.

Just think how precise are Holmes’s insights into people’s characters, their whims, their motivations and inner states. He strives for clarity and openness to evidence in his every encounter. As he writes in his treatise on observation, ‘By a man’s fingernails, by his coat-sleeve, by his boots, by his trouser-knees, by the callosities of his forefinger and thumb, by his expression, by his shirt-cuffs — by each of these things a man’s calling is plainly revealed.’ In our own attempts to understand others, we might think such minutiae below us — why bother with such petty concerns when there are emotions, feelings, lives at stake? — but in ignoring those petty details, we lose crucial evidence. We miss the signs of difference that enable us to walk in those shoes we don’t deign to look at closely. We lose the raw material for future creative thought. And are we being more or less empathetic when we do so? Empathy it seems, is not simply a rush of fellow-feeling, for this might be an entirely unreliable gauge of the inner world of others.

Simon Baron-Cohen, professor of developmental psychopathology at Cambridge and famous for his work on autism, distinguishes between two elements of empathy. There is affective empathy, the emotional part. And there is cognitive empathy, or the ability to think oneself into another person’s mind. Based on having an effective theory of mind, this cognitive empathy provides an important counterbalance to the emotional. But must the two always go together? Can we imagine an emotionless, purely cognitive, empathy?

The question is not a new one. In their 1963 study of empathy and birth order, the psychologists Ezra Stotland and Robert Dunn distinguished the ‘logical’ and the ‘emotional’ part of empathising with similar and dissimilar others. They understood the first as an exercise in cognitive perspective-taking, and the latter as an instance of non-rational emotional contagion. More recently, Baron-Cohen has described how individuals with Autism Spectrum Disorder might not be able to understand or mentalise, yet some are fully capable of empathising (in the emotional sense) once someone’s affective state is made apparent to them — a sign, it seems, that the two elements are somewhat independent.

Physiological studies seem to support this, too. In 2009, a team of psychologists from the University of Haifa found that patients with ventromedial prefrontal damage showed consistent selective deficits in cognitive empathy and theory of mind — that is the cognitive aspects of empathy — while their emotional empathy and emotional recognition ability remained intact. Conversely, patients with lesions in the inferior frontal gyrus of the brain demonstrated remarkable deficits in emotional empathy and the recognition of emotion – but their cognitive empathy remained on a par with healthy controls. Are both of these groups, then, empathetic in their own way — the one emotionally, and the other, cognitively so?

For most of us, the dissociation between cognitive and emotional aspects of empathy is unlikely to be so extreme. Nor indeed is this the case for Holmes: Conan Doyle is quick to show us that his hero has his own sympathies, but they are well-controlled, even hidden. He is quite prepared to cover up for a well-intentioned criminal, saying: ‘I had rather play tricks with the law of England than with my own conscience.’ And his friendship with Watson provokes the occasional crack in his cool façade. ‘You’re not hurt, Watson? For God’s sake, say that you are not hurt!’ he exclaims in a rare outpouring of emotion when his friend has been shot during The Adventure of the Three Garridebs (1924).

Feelings are not entirely absent from Holmes’s empathic calculus, but they are not allowed to drive his actions. Instead, he acts only if his cognition should support the emotional outlay. And if it doesn’t? The emotion is dismissed. It’s not about the feelings for Holmes, but about the perspective-taking, the hypothetical departure from self and into a world of possibility that is the root of imagination and inspired reason. In short, it’s about the creative departure from your own mind – whatever the motivation behind that departure happens to be. In sterilising his empathy, Holmes actually makes it more powerful: a reasoned end, rather than a flighty impulse. As Watson remarks: ‘Grit in a sensitive instrument, or a crack in one of his own high-power lenses, would not be more disturbing than a strong emotion in a nature such as his.’

In speaking of empathy, psychologists such as Daniel Batson, professor of social psychology at the University of Kansas, and Frans de Waal, professor of primate behaviour at Emory University, have invoked its evolutionary value as a skill for social animals, whether human or otherwise. The so-called mirror neurons — motors that fire mimetically in our brains when we observe someone doing or experiencing something – seem to point to the deep evolutionary origins of empathy. Not only do we learn from mirroring what others do. Such mirroring also helps smooth social interaction, helps us to help one another, and helps us to overcome the hurdles that would stymie our societies if we did not have such strong pro-social inclinations.

All that makes perfect sense. But might not the other, colder part of empathy — cognitive empathy, or theory of mind — be equally adaptive in evolutionary terms? The ability to see the world from another set of eyes, to experience things vicariously, at multiple levels, is training ground for such feats of imagination and reason that allow a Holmes to solve almost any crime, an Einstein to imagine a reality unlike any that we’ve experienced before (in keeping with laws unlike any we’ve come up with before), and a Picasso to make art that differs from any prior conception of what art can be.

There is a profound cognitive leap that we are able to make. It starts with egocentricity and the world ‘as it is to me’. It lands on other-centredness and the world ‘as it is for you’. Divorce empathy from emotion — let’s call it a sterilised empathy — and you have the seedbed of logical reasoning and creative thought. Empathy and creativity share an important, even essential feature: to be creative, just as to be empathetic, we must depart from our own point of view. We must see things not as they are but as they might be. And the value of that ability extends far beyond the simple fact that some of our neurons light up when we see someone else suffering — or that we feel compelled to help when we commiserate with another human being, be he alive or fictional.

Sterilised empathy might not be sterilised so much as expanded, from an emotional ability to an essential element in creativity and problem-solving. The emotional element in empathy is itself a limited one. It is selective and often prejudicial — we tend to empathise more with people whom we know or perceive to be like us, or simply when we have more mental space to bother. Empathy can be all the more powerful and creative in its cognitive form when it is independent of context and emotional outpouring.

Sherlock Holmes might be described as cold, it’s true. But who would you like on your side when it comes to being given a fair say, to being helped when that help is truly needed, to knowing that someone will go above and beyond the call of duty for your sake, no matter who you are or what you might have done? I, for one, would choose the cool-headed Holmes, who understands the limits of human emotions, and who seeks to ‘represent justice,’ so far as his ‘feeble powers allow’. # # #

[Russian-born Maria Konnikova came to the States with her parents at age 4. Konnikova is a contributor to The New Yorker (online), where she writes a weekly blog focusing on psychology and science. She is the author of Mastermind: How to Think Like Sherlock Holmes (2013) and The Confidence Game (2016). Konnikova received a BA, magna cum laude (psychology and creative writing), from Harvard University and a PhD (psychology) from Columbia University.]

Copyright © 2017 Aeon Media Group



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2017 Sapper's (Fair & Balanced) Rants & Raves

Wednesday, May 17, 2017

Puzzled By Today's News? Here's The Answer

The distaff parade of contributors continues today in this blog. The full title of today's post is "What Know-It-Alls Don’t Know, Or The Illusion Of Competence," and this blogger exercised editorial arrogance in amending the essay title because the short version comports perfectly with the current Washington, DC Klown Show. The issue of competence is at the root of all of the nonsense. If the mess in Washington, DC provides (fair & balanced) proof of irrationality, ineptness or just plain stupidity in the Oval Office, so be it.

[x Aeon]
The Illusion Of Competence
By Kate Fehlhaber



[TagCrowd cloud of the following piece of writing, created at TagCrowd.com]

One day in 1995, a large, heavy middle-aged man robbed two Pittsburgh banks in broad daylight. He didn’t wear a mask or any sort of disguise. And he smiled at surveillance cameras before walking out of each bank. Later that night, police arrested a surprised McArthur Wheeler. When they showed him the surveillance tapes, Wheeler stared in disbelief. ‘But I wore the juice,’ he mumbled. Apparently, Wheeler thought that rubbing lemon juice on his skin would render him invisible to videotape cameras. After all, lemon juice is used as invisible ink so, as long as he didn’t come near a heat source, he should have been completely invisible.

Police concluded that Wheeler was not crazy or on drugs – just incredibly mistaken.

The saga caught the eye of the psychologist David Dunning at Cornell University, who enlisted his graduate student, Justin Kruger, to see what was going on. They reasoned that, while almost everyone holds favourable views of their abilities in various social and intellectual domains, some people mistakenly assess their abilities as being much higher than they actually are. This ‘illusion of confidence’ is now called the ‘Dunning-Kruger effect’, and describes the cognitive bias to inflate self-assessment.

To investigate this phenomenon in the lab, Dunning and Kruger designed some clever experiments. In one study [PDF], they asked undergraduate students a series of questions about grammar, logic and jokes, and then asked each student to estimate his or her score overall, as well as their relative rank compared to the other students. Interestingly, students who scored the lowest in these cognitive tasks always overestimated how well they did — by a lot. Students who scored in the bottom quartile estimated that they had performed better than two-thirds of the other students!

This ‘illusion of confidence’ extends beyond the classroom and permeates everyday life. In a follow-up study, Dunning and Kruger left the lab and went to a gun range, where they quizzed gun hobbyists about gun safety. Similar to their previous findings, those who answered the fewest questions correctly wildly overestimated their knowledge about firearms. Outside of factual knowledge, though, the Dunning-Kruger effect can also be observed in people’s self-assessment of a myriad of other personal abilities. If you watch any talent show on television today, you will see the shock on the faces of contestants who don’t make it past auditions and are rejected by the judges. While it is almost comical to us, these people are genuinely unaware of how much they have been misled by their illusory superiority.

Sure, it’s typical for people to overestimate their abilities. One study found that 80 per cent of drivers rate themselves as above average — a statistical impossibility. And similar trends have been found when people rate their relative popularity and cognitive abilities. The problem is that when people are incompetent, not only do they reach wrong conclusions and make unfortunate choices but, also, they are robbed of the ability to realise their mistakes. In a semester-long study of college students, good students could better predict their performance on future exams given feedback about their scores and relative percentile. However, the poorest performers showed no recognition, despite clear and repeated feedback that they were doing badly. Instead of being confused, perplexed or thoughtful about their erroneous ways, incompetent people insist that their ways are correct. As Charles Darwin wrote in The Descent of Man (1871): ‘Ignorance more frequently begets confidence than does knowledge.’

Interestingly, really smart people also fail to accurately self-assess their abilities. As much as D- and F-grade students overestimate their abilities, A-grade students underestimate theirs. In their classic study, Dunning and Kruger found that high-performing students, whose cognitive scores were in the top quartile, underestimated their relative competence. These students presumed that if these cognitive tasks were easy for them, then they must be just as easy or even easier for everyone else. This so-called ‘imposter syndrome’ can be likened to the inverse of the Dunning-Kruger effect, whereby high achievers fail to recognise their talents and think that others are equally competent. The difference is that competent people can and do adjust their self-assessment given appropriate feedback, while incompetent individuals cannot.

And therein lies the key to not ending up like the witless bank robber. Sometimes we try things that lead to favourable outcomes, but other times — like the lemon juice idea — our approaches are imperfect, irrational, inept or just plain stupid. The trick is to not be fooled by illusions of superiority and to learn to accurately reevaluate our competence. After all, as Confucius reportedly said, real knowledge is knowing the extent of one’s ignorance. # # #

[Kate Fehlhaber is the editor-in-chief of Knowing Neurons. She received a BA, honors (neuroscience) from Scripps College (CA) and a PhD from the University of California at Los Angeles (neuroscience).]

Copyright © 2017 Aeon Media Group



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2017 Sapper's (Fair & Balanced) Rants & Raves