Tuesday, September 23, 2014

And Now A Word About The National Felony League

This blogger concurs with Grantland's Louisa Thomas: replace the F-word in NFL with Felony. The National Felony League: if the Foo $hits, wear it. If this is (fair & balanced) exposure of a national disgrace, so be it.

[x Grantland]
Together We Make Football
By Louisa Thomas

On January 1, 2005, in the dark, early hours of New Year’s Day, police responded to a call from the girlfriend of Denver Broncos defensive back Willie Middlebrooks. She told them that she and Middlebrooks had fought earlier that night, and he was still angry when he came home. According to the police report, he grabbed her by the hair and then tried to choke her twice, once so forcefully that he lifted her off the floor. She was treated for injuries at a local hospital; he would later plead guilty to a misdemeanor assault charge. The Broncos traded Middlebrooks to the San Francisco 49ers later that summer. “The 49ers did their research and found out that Willie is a high-character guy who was in a bad situation,” Middlebrooks’s agent told the San Jose Mercury News. “Sometimes guys just need a change of scenery.”

On Valentine’s Day 2005, Tennessee Titans cornerback Samari Rolle hit his wife, Danisha, opening a gash over her left eye that required three stitches. Three weeks later, the Ravens signed him to a six-year deal worth $30.5 million. After Rolle pleaded guilty to assaulting his wife, the NFL fined him one game’s paycheck but let him play.

On April 26, 2005, Brad Hopkins, left tackle for the Tennessee Titans, pleaded guilty to assaulting his wife, Ellen. According to the police report, Hopkins became angry and choked her because she refused to stop talking to an insurance agent about adding a car to their coverage. The NFL suspended him for one game.

On August 28, 2005, police responded to a 911 call from the home of Tasha and Kevin Williams. They found Tasha, a Louisiana Tech University senior who had married Kevin, a Vikings All-Pro defensive tackle, earlier that summer, with two lacerations on her left forearm and blood on her white shirt. She told police that Kevin had pushed her when he saw that she wasn’t wearing her wedding ring. After she hit him with her cell phone, he threw her across the bed and into a nightstand, then jumped on her. She grabbed a knife, which he wrestled from her. Police noticed that Kevin was drunk and “not fully aware or really caring about what was going on.” He pleaded guilty to disorderly conduct and was sentenced to a $1,000 fine and one year of probation. The NFL did not suspend him.

Nine NFL players were arrested on domestic assault charges in 2005, Paul Tagliabue’s last full year as NFL commissioner. Among those nine arrests, Hopkins’s single-game penalty was the only suspension the league handed out that year. The following year, Tagliabue was succeeded as commissioner by Roger Goodell.

Domestic violence is not a new problem in the NFL.

After TMZ released the tape that showed the heavy left hook that Ray Rice landed on Janay Palmer (then his fiancée, now his wife) in a casino elevator in February, Roger Goodell defended his decision to replace the two-game suspension he had originally issued to Rice with an indefinite suspension by saying that the video was “starkly different” from what he’d been told. The Ravens explained their decision to cut Rice, whom they had vigorously supported, by saying that “seeing that video changed everything.” Football analysts expressed their maiden moral outrage about domestic violence by saying, The video probably shouldn’t have mattered, but it changed everything. The public turned a minor protest against Goodell’s decision to suspend the All-Pro running back into overwhelming pressure. The video changed everything.

The video should have changed nothing. The facts remained unchanged: Rice punched Janay Palmer before half-lifting, half-dragging her limp body out of the elevator. Ray Rice is a 212-pound football player. He can bench 400 pounds. The apparent inability of the public to picture just how awful the scene looked without actually watching a video of it amounted to a failure of imagination and a failure of empathy and will.

But no one could escape the video, and no one could deny its power. It did change things, and some things for the better. People who had never cared about domestic violence were forced to pay attention. Victims of domestic violence who felt isolated or helpless began to share their stories or seek help. The number of calls to the National Domestic Violence Hotline shot up 84 percent two days after the video’s release. After Janay Rice’s angry defense of her husband on social media, there was a widespread conversation about the complex psychology of victims of domestic violence. Those inclined to let the victim shoulder some of the blame had to backtrack. Hours after releasing Rice, the Ravens deleted a tweet from May that said, “Janay Rice says she deeply regrets the role that she played the night of the incident.”

The video left no doubt about what had happened. The punch was no longer “alleged.” The possibility, “hypothetically speaking,” that Janay had done something to somehow provoke the attack — always the most reprehensible suggestion — was shown to be impossible.

The video from the elevator wasn’t entertainment — but it was a spectacle. The site that released it, TMZ, reports on celebrities, making news of disgrace. It was aired and reaired; it was embedded in story after story. My Twitter and Facebook feeds were filled with little else. What I saw when I watched the video was repellent — and yet, confronted with it, I kept watching. Most others did too. The medium galvanized us; the experience was visceral and physical. It was personally involving. People didn’t have to see what happened in that elevator to believe it, but seeing it made people feel it. What Rice did was “disgusting,” “shocking,” “sickening.” Even as the video made people more aware of the widespread problem of domestic assault, it became possible to treat Ray Rice’s case as singular. The video made the incident seem sensational, extraordinary. Rice needed to be run out of town, but the Panthers’ Greg Hardy, who was convicted this summer of assaulting his girlfriend and making threats against her life (he is appealing the verdict), and the 49ers’ Ray McDonald, who is being investigated on allegations that he assaulted his pregnant fiancée, were treated with less urgency. The grainy image of a stocky man in an elevator coldcocking a woman was so specific that it encouraged people to respond to the incident as if it were unique.

The video did something else. Its existence turned what had been a serious but abstract problem into a scandal. It brought greater awareness to the NFL’s long history of tolerating high rates of domestic violence, but it didn’t prompt most people to explore why. The story became a kind of conspiracy, talked about in excited tones. Had Goodell seen the tape? If he hadn’t, why hadn’t he? Was he incompetent? Who knew what, and when? Was there a conspiracy here? Who was lying? What else was the league hiding? Conveniently, these were questions with answers. This was a crime that could be solved. The problem had a clean solution: Get rid of Roger Goodell.

Goodell should go. He should have the grace to recognize that whatever profits he has reaped for the NFL, his leadership hurts the sport. His tenure has been marked by hypocrisy, obfuscation, and negligence, and he should resign. If he doesn’t, then he should be fired.

But Roger Goodell isn’t what’s really wrong with football.

One hundred eighty-seven million Americans describe themselves as fans of the NFL. That’s 60 percent of the country’s population. Think about that for a second. The NFL makes about $9.5 billion in annual revenue, and Goodell has set a target of $25 billion by 2027. So far, damaging controversies have only helped ratings: 20.8 million people watched Thursday night’s game between the Ravens and Steelers — a 108 percent increase over last year’s "Thursday Night Football" opener; 22.2 million watched Sunday night’s game between the Bears and the 49ers, making "Sunday Night Football" the most-watched broadcast of the week.

Americans watch football for many reasons — for the memory of the ball in their hands, for the sight of a Hail Mary, for the fantasy leagues, for beer and chicken wings, for the adrenaline rush that comes when they see a wide receiver soar for a catch. Football encourages some deep tremor of romance about what it means to be a man — even, it should be said, among the sport’s many female fans. Save for the military — with which it has a symbiotic relationship — the NFL is the biggest and strongest exponent of American masculinity.

And integral to that notion of American masculinity is violence. Football is our culture’s great spectacle of violence, our version of the gladiatorial games of ancient Rome. You can see signs of football’s celebration of amped-up manhood in the pageantry of our own bread and circuses: the military jet flyovers, the Built Ford Tough commercials, the shiny uniforms, the amplified crunching sound of hard hits, the big-knotted ties, and the pregame show special effects that seem like something out of "Transformers 12." You can see it in the silver gladiator mask that Terrell Suggs wore during the pregame introductions when the Ravens played the Steelers last Thursday. But those are only symptoms. Get rid of the truck commercials, get rid of the gun salutes, and you’d still have the violence on the field. Get rid of the gladiator mask, and you’d still have Suggs.

Two years ago, Suggs’s girlfriend (now wife), Candace Williams, sought a protective order against him, accusing him of punching her and dragging her alongside a car. When the order was granted, Suggs was required to hand over his guns, including an AK-47, because of “reasonable grounds to believe the person seeking a protective order has been abused.” The 2012 accusations were not the first Williams made against Suggs. In 2009, Williams had accused him of spilling bleach on her and her son. He had previously given her, she said, “busted lips, broken nose, black eyes, bruises.” The accusations against Suggs were extensively reported both in 2009 and 2012. The NFL never suspended him. The public didn’t make much of a protest. This February, two days after Rice knocked Janay Palmer unconscious, Suggs signed a four-year deal with $16 million in guaranteed money. On Thursday before the game, as he danced on the field in his gladiator helmet, the crowd — 71,000, a sellout — went wild. An NFL house ad appeared on the CBS telecast saying, “Why do we love football?”

There are 1,696 active players in the NFL. Even if, as FiveThirtyEight’s Benjamin Morris found, NFL players are arrested on domestic assault charges at rates that are, relative to income level, “downright extraordinary,” very few of them will ever beat women. Most of them are good guys trying to do a job. Still, the job they do is part of a culture of aggression. Football is a pantomime of war, down to the pseudo-military tactics. But it is not a pantomime of violence. It is actual violence.

I’m not just talking about the injuries that players inflict on each other — the torn ligaments and compound fractures, or the smaller, persistent injuries that lead to chronic pain and pill addictions and make it hard for them just to sit on the floor and play with their kids. I’m not even talking about their head injuries, the repeated blows that are slowly deforming their brains, or the fact that even if no one dies, that doesn’t mean that death isn’t hastened. (Even the league is now admitting that one in three former players will have cognitive problems at “notably younger ages” than the average population. One symptom of CTE happens to be increased aggression.) The real problem is that infliction of pain is romanticized and ritualized. Hitting is the point. Inflicting injury is nominally avoided — but hurting the other team helps. “It’s a bully division,” Arizona’s general manager, Steve Keim, told Grantland’s Robert Mays earlier this year, “so we had to add our number of bullies to our defense.” He meant that as a good thing.

I get it. I didn’t blink at first when I read Keim’s words. I smile instinctively when I see a hit. The pirates of Seattle’s secondary routinely amaze me. I have come to love a good road-grading offensive line. I see it and I respond to football instinctively. I feel it. It taps into some dark and thrilling part of me, the sight of those magnificent athletes trying to make contact or elude it. I wish I could say that feeling is harmless, that it allows for a release of my most dangerous instincts without putting me in contact with actual danger, that it allows me to desire dominance without turning me into some kind of would-be dictator. Watching football connects me to friends and to strangers. It helps me lose myself in something bigger, something almost transcendent. It reminds me of my father, and of afternoons spent outside in the backyard learning to throw a spiral. The acrobatics of the best make me catch my breath in awe. It is just so much fun to watch.

I wish I could say that it is a substitute for violence, that it releases and diffuses that domineering, competitive instinct latent in human nature, and leaves us with some measure of self-respect — some awareness of courage and strength. But I think I’m lying to myself. Because when I’m honest, I can see that within the culture of football, as a woman, I’m not respected. The women I see are cheerleaders, sideline reporters, WAGs. I hear men talk, and I know that when they use the word “girl,” it’s shorthand for something weak.

Domestic violence is not a football problem; it is a societal problem. One in every four women will be a victim of domestic violence in her lifetime. The he-said, she-said nature makes it hard to gather evidence. Domestic violence is one of the most complex and intractable problems that our legal system faces, and it remains a great taboo: Only one-quarter of physical assaults are reported to the police, and often victims don’t want to prosecute. Entangled personal histories and an understandable desire for privacy can make these cases hard. Sometimes women don’t want to cooperate, believing any punishment would harm them as well. Sometimes women throw punches. Sometimes they just want to move on. It can be hard to know exactly what happened. There usually isn’t a tape.

Domestic violence does not happen on a football field. It happens in bedrooms, cars, parking lots, elevators. Intimate-partner violence and sexual assault are epidemic in the military. They are pervasive in Silicon Valley, on college campuses, in small Alaskan towns. They exist in all countries and in all times. Getting rid of football would do nothing to change this.

And yet there are connections between a culture that sidelines women and disrespects them, a culture that disrespects women and tolerates violence toward them, and a culture that tolerates violence toward them and commits violence toward them. Nearly half — 48 percent — of all arrests for violent crimes among NFL players are arrests for domestic violence.

Men have worried that masculinity was under threat for as long as football has been around. The sport as we know it, after all, began during an era and in a class so nervous about decline that there was a condition, neurasthenia, to describe men’s anxiety. The easiest way to prove you were a man was to adopt an attitude of aggression. Those who were vulnerable or different were, and are, not merely unwelcome. It’s as if they were contagious. It is as if they were dangerous.

On Sunday, I turned on the Chiefs and Broncos just in time to see Kansas City tight end Anthony Fasano make a ridiculous juggling catch, somehow maintaining enough awareness and enough body control to collect the ball as he crashed to the ground. My eyes widened with goofy surprise and I made an inarticulate, happy shout. It was just awe-inspiring to watch.

And then I thought of a story involving the Chiefs that briefly dominated the news cycle two years ago, before it was forgotten. It was a story that seemed certain to force the league to change, but nothing happened. In 2012, Jovan Belcher, a Kansas City Chiefs linebacker, became the sixth NFL player to commit suicide in two years. Before he drove to the Arrowhead Stadium parking lot and shot himself, he killed his girlfriend, with whom he had a child. It didn’t happen on the Chiefs’ field, but it was damn close. It is hard not to think of Belcher as a casualty of the game. But he was also the killer of his child’s mother.

The NFL calls itself a family. If that’s the case, it’s a family of fathers and sons but not wives and daughters. It’s a family that more closely resembles the mob than a family connected by blood or love. It’s a family that protects its own by cutting others, a family that privileges loyalty over what’s right. But loyalty goes only so far in the NFL — because at some not-so-distant point, the family turns into a business. When concussions enter into it, or salary caps, or age, the family becomes about winning Sunday’s big game or about the business’s bottom line. If it’s a family, then it’s a fucked-up family.

The league can educate players about domestic violence, increase penalties, and provide continuing and intensive anger management. It can add more women to the higher ranks and put them in visible positions of power. But it won’t be enough.

Goodell does need to go. As Cris Carter said in his impassioned speech about Adrian Peterson’s alleged abuse of his son, taking a man off the field is what men will respect. It is a show of power, and men respond to power. But getting rid of Goodell won’t change the latent and virulent hostility toward those who don’t conform to the culture’s projection of masculinity, and it won’t change the sport. The violence will still be there. If we take the violence out of football, what’s left? Ω

[Louisa Thomas is a contributor to Grantland and a Fellow at the New America Foundation. She is also the author of Conscience: Two Soldiers, Two Pacifists, One Family–A Test of Will and Faith in World War I (2011). Thomas received a BA from Harvard University and was awarded a James B. Conant Prize for exemplary writing in her senior year.]

Copyright © 2014 ESPN Internet Ventures

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Monday, September 22, 2014

On Today's Menu: Scarified Chicken?

Tom Tomorrow (Dan Perkins) replays the cacophony of craven fearmongers in the wake of Muslim jihadist beheadings of hostage-journalists after videos made it to YouTube. O, how they love to be scarified. KFC could make a killing with scarified chicken on its current menu. If this is (fair & balanced) mass hysteria, so be it.

[x This Modern World]
Building Blocks Of War
By Tom Tomorrow (Dan Perkins)

[Dan Perkins is an editorial cartoonist better known by the pen name "Tom Tomorrow". His weekly comic strip, "This Modern World," which comments on current events from a strong liberal perspective, appears regularly in approximately 150 papers across the U.S., as well as on Daily Kos. The strip debuted in 1990 in SF Weekly. Perkins, a longtime resident of Brooklyn, New York, currently lives in Connecticut. He received the Robert F. Kennedy Award for Excellence in Journalism in both 1998 and 2002. When he is not working on projects related to his comic strip, Perkins writes a daily political weblog, also entitled "This Modern World," which he began in December 2001. More recently, Perkins was named the winner of the 2013 Herblock Prize for editorial cartooning.]

Copyright © 2014 Tom Tomorrow (Dan Perkins)

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Sunday, September 21, 2014

Colonel Bob Has Started To Fade Away

Robert Bateman has written a farewell to an Army career of more than three decades. If this is a (fair & balanced) ave atque vale, so be it.

[x Esquire]
I Will Fight No More, Forever
By Robert L. Bateman, III

My conditions are significantly different than those of the man who first spoke those words. He was the tribal chief of a band of Nez Perce Indians, and perhaps the War Chief of several bands in the end. At the height of his fame, and his tragedy, this man retained his reason. He was named, by his opponents, "Joseph."*

Joseph and his folks did not like the deal they were offered. Perhaps because it was offered at gunpoint. Perhaps because the maker of the deal, the U.S. government, had already completely reneged on a prior deal it made four years earlier, in 1873. In fact, stories like those of Joseph and his people were one part of the reason why I became a soldier: to try to make sure bullshit like this never happened again, or at least to do my best to ensure that it never happened on my watch, because only from the inside can one effect change on that scale. But 1877 was different. That year the U.S. Army was sent in to kick Joseph's people off their land and move them to another place, and the Army went in.

That wasn't right. Then or now.

In theory, the military operation that followed should have been a walkover. Those in opposition to the U.S. government's relocation plans numbered fewer than 1,000 men, women, and children. But Joseph, well, he was different. He was a better man than I, and he was not having any of that shit. What followed made him a legend among both the American soldiers chasing him and the Native Americans since then, because Chief Joseph handed us, the U.S. Army, our ass.

No, not Little Bighorn style, there was none of that. There was no massive force-on-force fight in this campaign, so don't get me wrong here. Joseph was entirely on the defensive and retreating the whole time. But his masterful use of delaying tactics, surprise, even temporary fortifications, kept the U.S. Army so left-footed that we could never gain contact with the main body of Joseph's people. Across Oregon, Washington, Idaho, Wyoming and Montana, they fought a rearguard action of 1,100+ miles and they did it so well that they earned a place in history. But eventually they found themselves between a rock and a hard place. There were just too many of us. There was nowhere to go, and so to preserve his people, Joseph surrendered.

"Hear me, my chiefs! I am tired; my heart is sick and sad. From where the sun now stands, I will fight no more forever."**

Maybe he said that, maybe he did not. But that is what happened.

I served the nation under Reagan, Bush I, Clinton, Bush II, and Obama. I have been shot at, mortared, rocketed, and knocked off my feet so many times that it does not matter. I know Egypt and Israel, Iraq, Afghanistan, and half a dozen other countries so well that they perversely feel like home in a way. My friends now hail from 22 countries, and I can say "give me a beer" in eight languages. But like Chief Joseph, despite the fact that war could continue, I have decided.

I will fight no more, forever.

And so, gentle readers, from now on I am just "Bob." I have retired from the Army. I have retired from our wars, conflicts, semi-conflicts and brawls. I will not be training young men to fight, and then seeing their names in granite. I will not share smokes with Colonels, telling them the ins-and-outs of where they will be, only to read later how they bought it. My days counting the dead in my memory banks are done, I hope. So then, too, is my service.

I admit that it is a difficult time. I first thought of myself in relation to a rank when I was 18 years old. Now I am 47. In my adult life I have never been anything but a soldier, a servant to the nation and a defender of all of our people. So now the question leaps, "Who am I?"

Now I am not "Captain Bateman." Nor am I "Major B" or "Colonel Bob." None of these apply anymore... I am just, well, just Bob. Which raises the question, for me and hundreds of thousands like me, "Who am I?" Sure, I am Lieutenant Colonel (Retired) Robert Lake Bateman, III, emphasis on the "Retired," to the Army, my home these past decades, but to the world?

That identity, that 'self' that I maintained for nearly three decades, is difficult to release. That is saying something, actually, because compared to many of my fellow soldiers, I have it easy. Over the past 30 years I forged other identities: Public Speaker, University Professor, Author, and yes, sometimes Writer for magazines like Esquire. A lot of that, hell, almost all of that, was a bit different for an infantry officer of my age and grade. But I was having fun and doing my job and so there was no conflict. I was all those things in my spare time, but I was a soldier first, and always.

Now, of course, I must reconfigure, because "always" is over now, and I have retired.

Some of this may strike you as Shakespearean, essentially "Much Ado About Nothing." I would agree, but then I think, maybe not. I mean, do you have any idea how fundamentally this sort of thing changes a soldier's life? "Becoming" a civilian? No? Well, think about this personal factoid for a second: For the first time in my adult life I have to choose my clothes.

Yeah, OK, you might see that as freedom. Me, I see it as an annoying distraction.

For almost 30 years I did not need to worry about the color of a tie, the cut of my pleats, the style of my shoes, the quality of my suit, or how my hair looked...if I had hair on my head. None of that mattered, at all, to those who looked at me in uniform. They saw "Airborne" and "Infantry" and "Air Assault" and "Ranger," right there on my uniform. That told them all they needed to know in their initial visual inspection. They could decide later, based upon my comments and conduct, if I was worth a shit, but my clothes did not matter. Now, it appears, they do.

Clothes are just the first part of the transition. Civilians know that one should not say "FUCK" as a part of normal conversation, for example. But for me, 29 years of conditioning make it a part of my lexicon so deeply embedded that I might as well try to cut out the words "and" and "the." The same applies to about 19 other words soldiers use that I can think of off the top of my head which just do not go down well in civil society. It ain't easy, being nice.

Then there is the issue of politics.

You may have noticed that in all the years I've been tag-team writing with [Esquire's Charles P.] Pierce I have never, once, said a bad thing about a sitting elected official. There is a good reason for that: you don't want the guys with the guns opining on political issues. We have seen, in other nations, how that turns out. But now, for the first time since I became politically aware, I am allowed...and I am not sure what to do. In this I feel a little like a lost puppy. I know what I know, but I don't know what I am supposed to do about it, if you follow my meaning.

When you are a soldier, just being a soldier takes up your whole existence. When you are no longer a soldier, well, there is a whole lot of figuring out to be done.

Of course this means I can say what I want...

*Somewhat obviously that was not his given name in his own language.

**This may have been a literary after-the-fact creation. In fact, to my ears it sounds too good to be true. But this is what you will see in most places. Ω

[Robert L. Bateman became a freelance contributor to Esquire magazine in 2013 after retiring as a Lieutenant Colonel from a 31-year career in the U.S. Army. Bateman received a BA (international relations and history) from the University of Delaware. He also received an MA from The Ohio State University. He is a distinguished graduate of the NATO Defense College as well. Bateman is the author of Digital War: A View From The Front Lines (1999) and No Gun Ri: A Military History of the Korean War Incident (2002).]

Copyright © 2014 Hearst Communications, Inc.

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Saturday, September 20, 2014

Roll Over Emily Dickinson — The Zekester Would Like A Final Pickup In 2032... 2014, Not So Much

Ah, the dilemma for this blogger who was planning to take a flu shot this afternoon. Dr. Zeke has sworn off flu shots at his age (57), so what's a blogger (older at 73) to do? Probably, the blogger will go in for the flu shot... for the last time on September 20, 2014. Dr. Zeke routinely refuses prostate surgery and this blogger did the same thing earlier this week in a conference with his urologist (known to the blogger as Uro-Boy, a doppelgänger of "Doogie Howser, MD"). Most of the other suggestions from Dr. Zeke have been done by this blogger. If this is (fair & balanced) end-of-life planning, so be it.

[x The Atlantic]
Why I Hope To Die At 75
By Ezekiel J. Emanuel

Seventy-five.

That’s how long I want to live: 75 years.

This preference drives my daughters crazy. It drives my brothers crazy. My loving friends think I am crazy. They think that I can’t mean what I say; that I haven’t thought clearly about this, because there is so much in the world to see and do. To convince me of my errors, they enumerate the myriad people I know who are over 75 and doing quite well. They are certain that as I get closer to 75, I will push the desired age back to 80, then 85, maybe even 90.

I am sure of my position. Doubtless, death is a loss. It deprives us of experiences and milestones, of time spent with our spouse and children. In short, it deprives us of all the things we value.

But here is a simple truth that many of us seem to resist: living too long is also a loss. It renders many of us, if not disabled, then faltering and declining, a state that may not be worse than death but is nonetheless deprived. It robs us of our creativity and ability to contribute to work, society, the world. It transforms how people experience us, relate to us, and, most important, remember us. We are no longer remembered as vibrant and engaged but as feeble, ineffectual, even pathetic.

By the time I reach 75, I will have lived a complete life. I will have loved and been loved. My children will be grown and in the midst of their own rich lives. I will have seen my grandchildren born and beginning their lives. I will have pursued my life’s projects and made whatever contributions, important or not, I am going to make. And hopefully, I will not have too many mental and physical limitations. Dying at 75 will not be a tragedy. Indeed, I plan to have my memorial service before I die. And I don’t want any crying or wailing, but a warm gathering filled with fun reminiscences, stories of my awkwardness, and celebrations of a good life. After I die, my survivors can have their own memorial service if they want—that is not my business.

Let me be clear about my wish. I’m neither asking for more time than is likely nor foreshortening my life. Today I am, as far as my physician and I know, very healthy, with no chronic illness. I just climbed Kilimanjaro with two of my nephews. So I am not talking about bargaining with God to live to 75 because I have a terminal illness. Nor am I talking about waking up one morning 18 years from now and ending my life through euthanasia or suicide. Since the 1990s, I have actively opposed legalizing euthanasia and physician-assisted suicide. People who want to die in one of these ways tend to suffer not from unremitting pain but from depression, hopelessness, and fear of losing their dignity and control. The people they leave behind inevitably feel they have somehow failed. The answer to these symptoms is not ending a life but getting help. I have long argued that we should focus on giving all terminally ill people a good, compassionate death—not euthanasia or assisted suicide for a tiny minority.

I am talking about how long I want to live and the kind and amount of health care I will consent to after 75. Americans seem to be obsessed with exercising, doing mental puzzles, consuming various juice and protein concoctions, sticking to strict diets, and popping vitamins and supplements, all in a valiant effort to cheat death and prolong life as long as possible. This has become so pervasive that it now defines a cultural type: what I call the American immortal.

I reject this aspiration. I think this manic desperation to endlessly extend life is misguided and potentially destructive. For many reasons, 75 is a pretty good age to aim to stop.

What are those reasons? Let’s begin with demography. We are growing old, and our older years are not of high quality. Since the mid-19th century, Americans have been living longer. In 1900, the life expectancy of an average American at birth was approximately 47 years. By 1930, it was 59.7; by 1960, 69.7; by 1990, 75.4. Today, a newborn can expect to live about 79 years. (On average, women live longer than men. In the United States, the gap is about five years. According to the National Vital Statistics Report, life expectancy for American males born in 2011 is 76.3, and for females it is 81.1.)

In the early part of the 20th century, life expectancy increased as vaccines, antibiotics, and better medical care saved more children from premature death and effectively treated infections. Once cured, people who had been sick largely returned to their normal, healthy lives without residual disabilities. Since 1960, however, increases in longevity have been achieved mainly by extending the lives of people over 60. Rather than saving more young people, we are stretching out old age.

The American immortal desperately wants to believe in the “compression of morbidity.” Developed in 1980 by James F. Fries, now a professor emeritus of medicine at Stanford, this theory postulates that as we extend our life spans into the 80s and 90s, we will be living healthier lives—more time before we have disabilities, and fewer disabilities overall. The claim is that with longer life, an ever smaller proportion of our lives will be spent in a state of decline.

Compression of morbidity is a quintessentially American idea. It tells us exactly what we want to believe: that we will live longer lives and then abruptly die with hardly any aches, pains, or physical deterioration—the morbidity traditionally associated with growing old. It promises a kind of fountain of youth until the ever-receding time of death. It is this dream—or fantasy—that drives the American immortal and has fueled interest and investment in regenerative medicine and replacement organs.

But as life has gotten longer, has it gotten healthier? Is 70 the new 50?

Not quite. It is true that compared with their counterparts 50 years ago, seniors today are less disabled and more mobile. But over recent decades, increases in longevity seem to have been accompanied by increases in disability—not decreases. For instance, using data from the National Health Interview Survey, Eileen Crimmins, a researcher at the University of Southern California, and a colleague assessed physical functioning in adults, analyzing whether people could walk a quarter of a mile; climb 10 stairs; stand or sit for two hours; and stand up, bend, or kneel without using special equipment. The results show that as people age, there is a progressive erosion of physical functioning. More important, Crimmins found that between 1998 and 2006, the loss of functional mobility in the elderly increased. In 1998, about 28 percent of American men 80 and older had a functional limitation; by 2006, that figure was nearly 42 percent. And for women the result was even worse: more than half of women 80 and older had a functional limitation. Crimmins’s conclusion: There was an “increase in the life expectancy with disease and a decrease in the years without disease. The same is true for functioning loss, an increase in expected years unable to function.”

This was confirmed by a recent worldwide assessment of “healthy life expectancy” conducted by the Harvard School of Public Health and the Institute for Health Metrics and Evaluation at the University of Washington. The researchers included not just physical but also mental disabilities such as depression and dementia. They found not a compression of morbidity but in fact an expansion—an “increase in the absolute number of years lost to disability as life expectancy rises.”

How can this be? My father illustrates the situation well. About a decade ago, just shy of his 77th birthday, he began having pain in his abdomen. Like every good doctor, he kept denying that it was anything important. But after three weeks with no improvement, he was persuaded to see his physician. He had in fact had a heart attack, which led to a cardiac catheterization and ultimately a bypass. Since then, he has not been the same. Once the prototype of a hyperactive Emanuel, suddenly his walking, his talking, his humor got slower. Today he can swim, read the newspaper, needle his kids on the phone, and still live with my mother in their own house. But everything seems sluggish. Although he didn’t die from the heart attack, no one would say he is living a vibrant life. When he discussed it with me, my father said, “I have slowed down tremendously. That is a fact. I no longer make rounds at the hospital or teach.” Despite this, he also said he was happy.

As Crimmins puts it, over the past 50 years, health care hasn’t slowed the aging process so much as it has slowed the dying process. And, as my father demonstrates, the contemporary dying process has been elongated. Death usually results from the complications of chronic illness—heart disease, cancer, emphysema, stroke, Alzheimer’s, diabetes.

Take the example of stroke. The good news is that we have made major strides in reducing mortality from strokes. Between 2000 and 2010, the number of deaths from stroke declined by more than 20 percent. The bad news is that many of the roughly 6.8 million Americans who have survived a stroke suffer from paralysis or an inability to speak. And many of the estimated 13 million more Americans who have survived a “silent” stroke suffer from more-subtle brain dysfunction such as aberrations in thought processes, mood regulation, and cognitive functioning. Worse, it is projected that over the next 15 years there will be a 50 percent increase in the number of Americans suffering from stroke-induced disabilities. Unfortunately, the same phenomenon is repeated with many other diseases.

So American immortals may live longer than their parents, but they are likely to be more incapacitated. Does that sound very desirable? Not to me.

The situation becomes of even greater concern when we confront the most dreadful of all possibilities: living with dementia and other acquired mental disabilities. Right now approximately 5 million Americans over 65 have Alzheimer’s; one in three Americans 85 and older has Alzheimer’s. And the prospect of that changing in the next few decades is not good. Numerous recent trials of drugs that were supposed to stall Alzheimer’s—much less reverse or prevent it—have failed so miserably that researchers are rethinking the whole disease paradigm that informed much of the research over the past few decades. Instead of predicting a cure in the foreseeable future, many are warning of a tsunami of dementia—a nearly 300 percent increase in the number of older Americans with dementia by 2050.

Half of people 80 and older with functional limitations. A third of people 85 and older with Alzheimer’s. That still leaves many, many elderly people who have escaped physical and mental disability. If we are among the lucky ones, then why stop at 75? Why not live as long as possible?

Even if we aren’t demented, our mental functioning deteriorates as we grow older. Age-associated declines in mental-processing speed, working and long-term memory, and problem-solving are well established. Conversely, distractibility increases. We cannot focus and stay with a project as well as we could when we were young. As we move slower with age, we also think slower.

It is not just mental slowing. We literally lose our creativity. About a decade ago, I began working with a prominent health economist who was about to turn 80. Our collaboration was incredibly productive. We published numerous papers that influenced the evolving debates around health-care reform. My colleague is brilliant and continues to be a major contributor, and he celebrated his 90th birthday this year. But he is an outlier—a very rare individual.

American immortals operate on the assumption that they will be precisely such outliers. But the fact is that by 75, creativity, originality, and productivity are pretty much gone for the vast, vast majority of us. Einstein famously said, “A person who has not made his great contribution to science before the age of 30 will never do so.” He was extreme in his assessment. And wrong. Dean Keith Simonton, at the University of California at Davis, a luminary among researchers on age and creativity, synthesized numerous studies to demonstrate a typical age-creativity curve: creativity rises rapidly as a career commences, peaks about 20 years into the career, at about age 40 or 45, and then enters a slow, age-related decline. There are some, but not huge, variations among disciplines. Currently, the average age at which Nobel Prize–winning physicists make their discovery—not get the prize—is 48. Theoretical chemists and physicists make their major contribution slightly earlier than empirical researchers do. Similarly, poets tend to peak earlier than novelists do. Simonton’s own study of classical composers shows that the typical composer writes his first major work at age 26, peaks at about age 40 with both his best work and maximum output, and then declines, writing his last significant musical composition at 52. (All the composers studied were male.)

This age-creativity relationship is a statistical association, the product of averages; individuals vary from this trajectory. Indeed, everyone in a creative profession thinks they will be, like my collaborator, in the long tail of the curve. There are late bloomers. As my friends who enumerate them do, we hold on to them for hope. It is true, people can continue to be productive past 75—to write and publish, to draw, carve, and sculpt, to compose. But there is no getting around the data. By definition, few of us can be exceptions. Moreover, we need to ask how much of what “Old Thinkers,” as Harvey C. Lehman called them in his 1953 Age and Achievement, produce is novel rather than reiterative and repetitive of previous ideas. The age-creativity curve—especially the decline—endures across cultures and throughout history, suggesting some deep underlying biological determinism probably related to brain plasticity.

We can only speculate about the biology. The connections between neurons are subject to an intense process of natural selection. The neural connections that are most heavily used are reinforced and retained, while those that are rarely, if ever, used atrophy and disappear over time. Although brain plasticity persists throughout life, we do not get totally rewired. As we age, we forge a very extensive network of connections established through a lifetime of experiences, thoughts, feelings, actions, and memories. We are subject to who we have been. It is difficult, if not impossible, to generate new, creative thoughts, because we don’t develop a new set of neural connections that can supersede the existing network. It is much more difficult for older people to learn new languages. All of those mental puzzles are an effort to slow the erosion of the neural connections we have. Once you squeeze the creativity out of the neural networks established over your initial career, they are not likely to develop strong new brain connections to generate innovative ideas—except maybe in those Old Thinkers like my outlier colleague, who happen to be in the minority endowed with superior plasticity.

Maybe mental functions—processing, memory, problem-solving—slow at 75. Maybe creating something novel is very rare after that age. But isn’t this a peculiar obsession? Isn’t there more to life than being totally physically fit and continuing to add to one’s creative legacy?

One university professor told me that as he has aged (he is 70) he has published less frequently, but he now contributes in other ways. He mentors students, helping them translate their passions into research projects and advising them on the balance of career and family. And people in other fields can do the same: mentor the next generation.

Mentorship is hugely important. It lets us transmit our collective memory and draw on the wisdom of elders. It is too often undervalued, dismissed as a way to occupy seniors who refuse to retire and who keep repeating the same stories. But it also illuminates a key issue with aging: the constricting of our ambitions and expectations.

We accommodate our physical and mental limitations. Our expectations shrink. Aware of our diminishing capacities, we choose ever more restricted activities and projects, to ensure we can fulfill them. Indeed, this constriction happens almost imperceptibly. Over time, and without our conscious choice, we transform our lives. We don’t notice that we are aspiring to and doing less and less. And so we remain content, but the canvas is now tiny. The American immortal, once a vital figure in his or her profession and community, is happy to cultivate avocational interests, to take up bird watching, bicycle riding, pottery, and the like. And then, as walking becomes harder and the pain of arthritis limits the fingers’ mobility, life comes to center around sitting in the den reading or listening to books on tape and doing crossword puzzles. And then …

Maybe this is too dismissive. There is more to life than youthful passions focused on career and creating. There is posterity: children and grandchildren and great-grandchildren.

But here, too, living as long as possible has drawbacks we often won’t admit to ourselves. I will leave aside the very real and oppressive financial and caregiving burdens that many, if not most, adults in the so-called sandwich generation are now experiencing, caught between the care of children and parents. Our living too long places real emotional weights on our progeny.

Unless there has been terrible abuse, no child wants his or her parents to die. It is a huge loss at any age. It creates a tremendous, unfillable hole. But parents also cast a big shadow for most children. Whether estranged, disengaged, or deeply loving, they set expectations, render judgments, impose their opinions, interfere, and are generally a looming presence for even adult children. This can be wonderful. It can be annoying. It can be destructive. But it is inescapable as long as the parent is alive. Examples abound in life and literature: Lear, the quintessential Jewish mother, the Tiger Mom. And while children can never fully escape this weight even after a parent dies, there is much less pressure to conform to parental expectations and demands after they are gone.

Living parents also occupy the role of head of the family. They make it hard for grown children to become the patriarch or matriarch. When parents routinely live to 95, children must caretake into their own retirement. That doesn’t leave them much time on their own—and it is all old age. When parents live to 75, children have had the joys of a rich relationship with their parents, but also have enough time for their own lives, out of their parents’ shadows.

But there is something even more important than parental shadowing: memories. How do we want to be remembered by our children and grandchildren? We wish our children to remember us in our prime. Active, vigorous, engaged, animated, astute, enthusiastic, funny, warm, loving. Not stooped and sluggish, forgetful and repetitive, constantly asking “What did she say?” We want to be remembered as independent, not experienced as burdens.

At age 75 we reach that unique, albeit somewhat arbitrarily chosen, moment when we have lived a rich and complete life, and have hopefully imparted the right memories to our children. Living the American immortal’s dream dramatically increases the chances that we will not get our wish—that memories of vitality will be crowded out by the agonies of decline. Yes, with effort our children will be able to recall that great family vacation, that funny scene at Thanksgiving, that embarrassing faux pas at a wedding. But the most-recent years—the years with progressing disabilities and the need to make caregiving arrangements—will inevitably become the predominant and salient memories. The old joys have to be actively conjured up.

Of course, our children won’t admit it. They love us and fear the loss that will be created by our death. And a loss it will be. A huge loss. They don’t want to confront our mortality, and they certainly don’t want to wish for our death. But even if we manage not to become burdens to them, our shadowing them until their old age is also a loss. And leaving them—and our grandchildren—with memories framed not by our vivacity but by our frailty is the ultimate tragedy.

Seventy-five. That is all I want to live. But if I am not going to engage in euthanasia or suicide, and I won’t, is this all just idle chatter? Don’t I lack the courage of my convictions?

No. My view does have important practical implications. One is personal and two involve policy.

Once I have lived to 75, my approach to my health care will completely change. I won’t actively end my life. But I won’t try to prolong it, either. Today, when the doctor recommends a test or treatment, especially one that will extend our lives, it becomes incumbent upon us to give a good reason why we don’t want it. The momentum of medicine and family means we will almost invariably get it.

My attitude flips this default on its head. I take guidance from what Sir William Osler wrote in his classic turn-of-the-century medical textbook, The Principles and Practice of Medicine: “Pneumonia may well be called the friend of the aged. Taken off by it in an acute, short, not often painful illness, the old man escapes those ‘cold gradations of decay’ so distressing to himself and to his friends.”

My Osler-inspired philosophy is this: At 75 and beyond, I will need a good reason to even visit the doctor and take any medical test or treatment, no matter how routine and painless. And that good reason is not “It will prolong your life.” I will stop getting any regular preventive tests, screenings, or interventions. I will accept only palliative—not curative—treatments if I am suffering pain or other disability.

This means colonoscopies and other cancer-screening tests are out—and before 75. If I were diagnosed with cancer now, at 57, I would probably be treated, unless the prognosis was very poor. But 65 will be my last colonoscopy. No screening for prostate cancer at any age. (When a urologist gave me a PSA test even after I said I wasn’t interested and called me with the results, I hung up before he could tell me. He ordered the test for himself, I told him, not for me.) After 75, if I develop cancer, I will refuse treatment. Similarly, no cardiac stress test. No pacemaker and certainly no implantable defibrillator. No heart-valve replacement or bypass surgery. If I develop emphysema or some similar disease that involves frequent exacerbations that would, normally, land me in the hospital, I will accept treatment to ameliorate the discomfort caused by the feeling of suffocation, but will refuse to be hauled off.

What about simple stuff? Flu shots are out. Certainly if there were to be a flu pandemic, a younger person who has yet to live a complete life ought to get the vaccine or any antiviral drugs. A big challenge is antibiotics for pneumonia or skin and urinary infections. Antibiotics are cheap and largely effective in curing infections. It is really hard for us to say no. Indeed, even people who are sure they don’t want life-extending treatments find it hard to refuse antibiotics. But, as Osler reminds us, unlike the decays associated with chronic conditions, death from these infections is quick and relatively painless. So, no to antibiotics.

Obviously, a do-not-resuscitate order and a complete advance directive indicating no ventilators, dialysis, surgery, antibiotics, or any other medication—nothing except palliative care even if I am conscious but not mentally competent—have been written and recorded. In short, no life-sustaining interventions. I will die when whatever comes first takes me.

As for the two policy implications, one relates to using life expectancy as a measure of the quality of health care. Japan has the third-highest life expectancy, at 84.4 years (behind Monaco and Macau), while the United States is a disappointing No. 42, at 79.5 years. But we should not care about catching up with—or measure ourselves against—Japan. Once a country has a life expectancy past 75 for both men and women, this measure should be ignored. (The one exception is increasing the life expectancy of some subgroups, such as black males, who have a life expectancy of just 72.1 years. That is dreadful, and should be a major focus of attention.) Instead, we should look much more carefully at children’s health measures, where the U.S. lags, and shamefully: in preterm deliveries before 37 weeks (currently one in eight U.S. births), which are correlated with poor outcomes in vision, with cerebral palsy, and with various problems related to brain development; in infant mortality (the U.S. is at 6.17 infant deaths per 1,000 live births, while Japan is at 2.13 and Norway is at 2.48); and in adolescent mortality (where the U.S. has an appalling record—at the bottom among high-income countries).

A second policy implication relates to biomedical research. We need more research on Alzheimer’s, the growing disabilities of old age, and chronic conditions—not on prolonging the dying process.

Many people, especially those sympathetic to the American immortal, will recoil and reject my view. They will think of every exception, as if these prove that the central theory is wrong. Like my friends, they will think me crazy, posturing—or worse. They might condemn me as being against the elderly.

Again, let me be clear: I am not saying that those who want to live as long as possible are unethical or wrong. I am certainly not scorning or dismissing people who want to live on despite their physical and mental limitations. I’m not even trying to convince anyone I’m right. Indeed, I often advise people in this age group on how to get the best medical care available in the United States for their ailments. That is their choice, and I want to support them.

And I am not advocating 75 as the official statistic of a complete, good life in order to save resources, ration health care, or address public-policy issues arising from the increases in life expectancy. What I am trying to do is delineate my views for a good life and make my friends and others think about how they want to live as they grow older. I want them to think of an alternative to succumbing to that slow constriction of activities and aspirations imperceptibly imposed by aging. Are we to embrace the “American immortal” or my “75 and no more” view?

I think the rejection of my view is literally natural. After all, evolution has inculcated in us a drive to live as long as possible. We are programmed to struggle to survive. Consequently, most people feel there is something vaguely wrong with saying 75 and no more. We are eternally optimistic Americans who chafe at limits, especially limits imposed on our own lives. We are sure we are exceptional.

I also think my view conjures up spiritual and existential reasons for people to scorn and reject it. Many of us have suppressed, actively or passively, thinking about God, heaven and hell, and whether we return to the worms. We are agnostics or atheists, or just don’t think about whether there is a God and why she should care at all about mere mortals. We also avoid constantly thinking about the purpose of our lives and the mark we will leave. Is making money, chasing the dream, all worth it? Indeed, most of us have found a way to live our lives comfortably without acknowledging, much less answering, these big questions on a regular basis. We have gotten into a productive routine that helps us ignore them. And I don’t purport to have the answers.

But 75 defines a clear point in time: for me, 2032. It removes the fuzziness of trying to live as long as possible. Its specificity forces us to think about the end of our lives and engage with the deepest existential questions and ponder what we want to leave our children and grandchildren, our community, our fellow Americans, the world. The deadline also forces each of us to ask whether our consumption is worth our contribution. As most of us learned in college during late-night bull sessions, these questions foster deep anxiety and discomfort. The specificity of 75 means we can no longer just continue to ignore them and maintain our easy, socially acceptable agnosticism. For me, 18 more years with which to wade through these questions is preferable to years of trying to hang on to every additional day and forget the psychic pain they bring up, while enduring the physical pain of an elongated dying process.

Seventy-five years is all I want to live. I want to celebrate my life while I am still in my prime. My daughters and dear friends will continue to try to convince me that I am wrong and can live a valuable life much longer. And I retain the right to change my mind and offer a vigorous and reasoned defense of living as long as possible. That, after all, would mean still being creative after 75. Ω

[Ezekiel J. Emanuel (b. 1957) is the Vice Provost for Global Initiatives, a Levy University Professor, and Chair of the Department of Medical Ethics and Health Policy at the University of Pennsylvania. From 1997 to 2011, he was chair of the Department of Bioethics at the National Institutes of Health. Emanuel received a B.A. from Amherst College and subsequently received his M.Sc. (biochemistry) from Oxford University. He simultaneously studied for an M.D. and a Ph.D. (political philosophy) from Harvard University and received both degrees in the late 1980s. Emanuel also received the Toppan Dissertation Prize for the best Harvard political science dissertation of 1988.]

Copyright © 2014 The Atlantic Monthly Group



Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Friday, September 19, 2014

In The Age Of Instant Replay, NFL Commissioner Roger Goodell Is Guilty Of Late Hits On Everyone!

The National Football League has hit a rough patch, and the fumbles (cases of domestic violence and child abuse) just keep rolling out. An aggrieved nation turns its angry eyes on the NFL Commissioner; Roger Goodell is judge, jury, and executioner in the autumn circus known as pro football. The teams in the league that must juggle the hot potato of abusers engage in a comic opera of "play on" or "suspended for two games" or "suspended for life." If this is a (fair & balanced) mirror of our age, so be it.

[x The Nation]
What Happened To Janay Palmer In The Elevator?
By The Deadline Poet (Calvin Trillin)

Tag Cloud of the following piece of writing

created at TagCrowd.com

Absent video, how could Goodell
Be expected for certain to tell?
So he figured that, not feeling well
From the flu, say, this mademoiselle
Might have suffered a brief fainting spell,
From which slowly but surely she fell.

And then her husband dragged her out of the
elevator, dumped her on the floor, and walked away. Ω

[Calvin Trillin began his career as a writer for Time magazine. Since July 2, 1990, as a columnist at The Nation, Trillin has written his weekly "Deadline Poet" column: humorous poems about current events. Trillin has written considerably more pieces for The Nation than any other single person. A native of Kansas City, MO, Trillin received his BA from Yale College in 1957. He served in the army, and then joined Time.]

Copyright © 2014 The Nation



Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves

Thursday, September 18, 2014

Roll Over Patriotism — Corporate Inversions Have Become The Last Refuge Of Scoundrels

Burger King has moved its corporate HQ to Canada with the evident support of its largest shareholder: St. Warren Buffett (the smiling patron saint of Big Business). The reason? To protect Burger King profits from U.S. corporate income taxes and to avoid the implications of an increased minimum wage in the U.S. As they say in Quebec, Laissez les bons temps et bénéfices rouler! (Let the good times & profits roll!) If this is (fair & balanced) repudiation of avarice, so be it.

[x HuffPo]
Inversions Are Revealing The Ugly Face Of Shareholder Value
By Ralph Gomory

Tag Cloud of the following piece of writing

created at TagCrowd.com

Something strange is happening to familiar American companies: Burger King has become Canadian, Pfizer seems to be trying to be British, and Walgreens has backed away from becoming Swiss only because of the outcry over its plan for a new nationality. Seeing what our companies are willing to do to escape paying income tax, people are beginning to wonder just how American our American companies are.

And when, in addition, these corporate actions are praised, and are described as what American companies should do, or even must do, people begin to wonder whether something is seriously wrong, and they are right to wonder.

Inversions may or may not be important in themselves, but understanding the forces that drive them reveals a surprising amount about the causes of two major problems: extreme income inequality and stagnating wages in America.

The driving force behind inversions was described very clearly by George Will in a recent Washington Post column. Writing in defense of inversions, Will stated, "A publicly held corporation's responsibility is to its shareholders; its fiduciary duty is to maximize the value of their holdings."

From this statement it follows immediately that it is a corporation's "fiduciary duty" to do inversions, or, for that matter, anything else, if it will lift the share price.

But George Will's statement, although it is widely believed, is not accurate. There is no legal obligation, no "fiduciary duty" requiring corporations to focus on maximizing shareholder value. The reality is quite different; the reality is that shareholders and their managements have freely chosen this corporate direction. And, as we will see below, there is considerable evidence that this direction has been devastating for the country.

This focus on shareholder value is relatively new. As late as 1981, the Business Roundtable, an extremely influential organization of major corporate CEOs, listed six constituencies it needed to consider in making corporate decisions. The six constituencies were: customers, employees, communities, society at large, suppliers, and shareholders. About these constituencies, it wrote:

"Carefully weighing the impacts of decisions and balancing different constituent interests — in the context of both near term and long-term effects — must be an integral part of the corporation's decision-making and management process."

But over the course of the 1980s, our corporations changed direction in a major way. Today both the Business Roundtable and most corporations would accept George Will's statement as an obvious truth.

Given this current and widely accepted focus on shareholder value, it is both reasonable and illuminating to ask who actually benefits from that focus. Who are the shareholders? Who benefits when the stock price goes up?

Certainly corporate leaders themselves are among the beneficiaries as they are now compensated mainly by massive stock options. But although very visible, and very highly compensated, they are still a small group. Where does most of the benefit of increased stock price go? Who owns the stock in American companies?

While it is true and often stated that a good percentage of Americans have some stake in the stock market, most Americans have very little. Here is the actual pattern of stock ownership:

Roughly one-third of the stock market is owned by the richest one percent of Americans; roughly two-thirds is owned by the top five percent (a group that includes that richest one percent); and the remaining one-third is spread thinly across the other 95 percent of Americans.

Given this concentration of share ownership, the goal of maximizing shareholder value means that our great corporations are currently dedicated to making the rich richer. And that is in fact what is happening.

Since 1980, despite ups and downs, the country has grown at a reasonable pace. However, almost all of that growth has gone to the top one percent. This is entirely different from the three decades that preceded 1980. During that period the country also grew steadily, but the rich and the rest grew together.

This changed outcome has shown itself in two ways. The first, and most visible, is that CEO pay went up by a factor of about 10. The second, and far more important, outcome is that wages have stood still.

There is a close connection between these two results. Holding down wages is enormously valuable to shareholders. It is a gain in profitability that far outweighs the cost of the increased compensation to top management. Today, with their grants of stock options, shareholders have motivated top management to hold down wages, and management, aided by globalization and the threat or the reality of offshoring, has succeeded in doing just that. Top management's very visible increase in compensation reflects their alignment with the shareholders in gaining from the resulting increase in profitability and share price.

This outcome has now become a system problem. Today's corporate leaders are not only motivated by stock options to put shareholders first; they also know that if they do not, they will not survive. It is the shareholders who elect the directors and ultimately control the corporation's direction. Today's remote shareholders, and the hedge funds and other financial firms that usually represent them, have no interest in a corporation other than in its profits and its stock price.

Our corporations do not have to be run this way [PDF]. They were not run this way in the past and even today there are many examples of companies both here and abroad that are run differently and run successfully.

But above all let us not make the easy mistake of simply attacking inversions. Inversions are only the highly visible sign of something much bigger. The real issue is not about inversions; it is about the goals of American corporations, and the fact that their present goals are destructive for America. Ω

[Ralph E. Gomory graduated from Williams College, studied at Cambridge University, and received his Ph.D. in mathematics from Princeton University. He has been a research mathematician for IBM and after retiring from IBM, Gomory became President of the Alfred P. Sloan Foundation. After 18 years as President of the Sloan Foundation, Gomory became a Research Professor at New York University's Stern School of Business.]

Copyright © 2014 The Huffington Post



Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves