I first confronted death and dying with the terminal cancer illness of my father, Bob Sapper. My father faced death bravely and I pray for the grace under pressure to do the same. If this is (fair & balanced) thanatology, so be it.
[x NYTimes]
Dr. Kübler-Ross, Who Changed Perspectives on Death, Dies at 78
By HOLCOMB B. NOBLE
Elisabeth Kübler-Ross, the psychiatrist whose pioneering work in counseling terminally ill patients helped to revolutionize the care of the dying, enabling people all over the world to die more peacefully and with greater dignity, died Tuesday at her home in Scottsdale, Ariz. She was 78.
Family members told The Associated Press she died of natural causes.
A series of strokes had debilitated her, but as she neared her own death she appeared to accept it, as she had tried to help so many others to do. She seemed ready to experience death, saying: "I'm going to dance in all the galaxies."
Dr. Kübler-Ross was credited with ending centuries-old taboos in Western culture against openly discussing and studying death. She set in motion techniques of care directed at making death less dehumanizing and psychologically painful for patients, for the professionals who attend them and the loved ones who survive them.
She accomplished this largely through her writings, especially the 1969 best-seller "On Death and Dying," which is still in print around the world; through her lectures and tape recordings; through her research into what she described as the five stages of death, based on thousands of interviews with patients and health-care professionals; and through her own groundbreaking work in counseling dying patients.
She was a powerful intellectual force behind the creation of the hospice system in the United States through which special care is now provided for the terminally ill. And she helped to turn thanatology, the study of physical, psychological and social problems associated with dying, into an accepted medical discipline.
"Dr. Elisabeth Kübler-Ross was a true pioneer in raising the awareness among the physician community and the general public about the important issues surrounding death, dying and bereavement," said Dr. Percy Wootton, president of the American Medical Association. He said much of her work was a basis for the A.M.A.'s attempts to encourage the medical profession to improve the care patients received at the end of life.
The A.M.A. was one of her early supporters, though many of its members at first vigorously opposed her and attempted to ostracize her.
Florence Wald of the Yale School of Nursing said that before Dr. Kübler-Ross's research, "doctors and nurses had been simply avoiding the problem of death and focusing on patients who could get better." She said Dr. Kübler-Ross's "willingness and skill in getting patients to talk about their impending death in ways that helped them" set a profoundly important example for nurses everywhere.
In the later part of her career, she embarked on research to verify the existence of life after death, conducting, with others, thousands of interviews with people who recounted near-death experiences, particularly those declared clinically dead by medical authorities but who were then revived. Her prestige generated widespread interest in such research and attracted followers who considered her a saint.
But this work aroused deep skepticism in medical and scientific circles and damaged her reputation. Her claims that she had evidence of an afterlife saddened many of her colleagues, some of whom believed that she had abandoned rigorous science and had succumbed to her own fears of death.
"For years I have been stalked by a bad reputation," she said in her 1997 autobiography "The Wheel of Life: A Memoir of Living and Dying." "Actually, I have been pursued by people who have regarded me as the Death and Dying Lady. They believe that having spent more than three decades in research into death and life after death qualifies me as an expert on the subject. I think they miss the point. The only incontrovertible fact of my work is the importance of life. I always say that death can be one of the greatest experiences ever. If you live each day of your life right, then you have nothing to fear."
Whatever scientists feel about her view of life after death, they continue to be influenced by her methods of caring for the terminally ill. Before "On Death and Dying," terminally ill patients were often left to face death in a miasma of loneliness and fear because doctors, nurses and families were generally poorly equipped to deal with death.
Dr. Kübler-Ross changed all that. By the 1980's, the study of the processes and treatment of dying had become a routine part of medical and health-care education in the United States. "On Death and Dying" became an indispensable manual, both for professionals and for family members. Many doctors and counselors have relied on it to learn to cope with the loss of their patients and to face their own mortality.
Her early childhood may have been the "instigator," as she put it, in shaping her career. Weighing barely two pounds at birth, she was the first of triplets born to Ernst and Emma Villiger Kübler on July 8, 1926, in Zurich, Switzerland.
She might not have lived, she wrote, "if it had not been for the determination of my mother," who thought a sick child must be kept close to her parents in the intimate environment of the home, not at a hospital.
But there were moments in her childhood in a farm village when she saw death as both moving and frightening. A friend of her father who was dying after a fall from a tree invited neighbors into his home and, with no sign of fear as death approached, asked them to help his wife and children save their farm. "My last visit with him filled me with great pride and joy," she said.
Later, a schoolmate died of meningitis. Relatives or friends of the child were with her night and day, and when she died, her school was closed and half the village attended the funeral.
"There was a feeling of solidarity, of common tragedy shared," Dr. Kübler-Ross said. By contrast, when she was 5, she was "caged" in a hospital with pneumonia, allowed to see her parents only through a glass window, with "no familiar voice, touch, odor, not even a familiar toy." She believed that only her vivid dreams and fantasies enabled her to survive.
By the sixth grade, she wanted to be a physician. But her father, she said, saw only two possibilities in life: "his way and the wrong way." "Elisabeth," he said, "you will work in my office. I need an intelligent secretary."
"No thank you," she said, and her father's face flushed with anger.
"Then you can spend the rest of your life as a maid."
"That's all right with me," she replied.
When she finished school, she worked at various jobs and began her lifelong involvement with humanitarian causes. She volunteered at Zurich's largest hospital to help refugees from Nazi Germany. And when World War II ended, she hitchhiked through nine war-shattered countries, helping to open first-aid posts and working on reconstruction projects, as a cook, mason and roofer.
In Poland, her visit to the Majdanek concentration camp narrowed her professional goal: she would become a psychiatrist to help people cope with death.
Back in Switzerland, she enrolled at the University of Zurich medical school, receiving her degree in 1957. Within a year she had come to the United States; married Dr. Emanuel K. Ross, an American neuropathologist she had met at the University of Zurich; begun her internship at Community Hospital in Glen Cove, N.Y.; and become a research fellow at Manhattan State Hospital on Ward's Island in New York City.
There she was appalled by what she called routine treatment of dying patients: "They were shunned and abused," she wrote, "sometimes kept in hot tubs with water up to their necks and left for 24 hours at a time."
After badgering her supervisors, she was allowed to develop programs under which the patients were given individual care and counseling.
In 1962, she became a teaching fellow at the University of Colorado School of Medicine in Denver. A small woman, who spoke with a heavy German accent and was shy, despite extraordinary inner self-confidence, she was highly nervous when asked to fill in for a popular professor and master lecturer. She found the medical students rude, paying her scant attention and talking to one another as she spoke.
But the hall became noticeably quieter when she brought out a 16-year-old patient who was dying of leukemia, and asked the students to interview her. Now it was they who seemed nervous. When she prodded them, they would ask the patient about her blood counts, chemotherapy or other clinical matters.
Finally, the teenager exploded in anger, and began posing her own questions: What was it like not to be able to dream about the high-school prom? Or going on a date? Or growing up? "Why," she demanded, "won't people tell you the truth?" When the lecture ended, many students had been moved to tears.
"Now you're acting like human beings, instead of scientists," Dr. Kübler-Ross said.
Her lectures began to draw standing-room-only audiences of medical and theology students, clergymen and social workers — but few doctors.
In 1965, she became an assistant professor in psychiatry at the University of Chicago Medical School, where a group of theology students approached her for help in studying death. She suggested a series of conversations with dying patients, who would be asked their thoughts and feelings; the patients would teach the professionals. At first, staff doctors objected.
Avoiding the subject entirely, particularly when treating the young, physicians and therapists would meet a dying child's questions with comments like, "Take your medicine, and you'll get well," Dr. Kübler-Ross said.
In "On Death and Dying," her account of the seminars on dying that she conducted at Chicago, she asked: What happens to a society when "its young medical student is admired for his research and laboratory work while he is at a loss for words when a patient asks him a simple question?"
She said children instinctively knew that the answers they received about their prognoses were lies, and this made them feel punished and alone. Children were often better at coping with imminent death than adults, she said, and she told of 9-year-old Jeff, who, though weakened by leukemia, asked to leave the hospital, go home and ride his bicycle one more time.
The boy's father, tears in his eyes, put the training wheels back on the bike at his son's request, and Dr. Kübler-Ross kept the boy's mother from helping him ride. Jeff came back after a spin around the block in final triumph, the psychiatrist said, and then gave the bicycle to his younger brother.
To bring public pressure for change in hospitals' treatment of the dying, she agreed to a request by Life magazine in 1969 to interview one of her seminar patients, Eva, who felt her doctors had treated her coldly and arrogantly. The Life article prompted one physician, encountering Dr. Kübler-Ross in a hospital corridor, to remark: "Are you looking for your next patient for publicity?"
The hospital said it wanted not to be famous for its dying patients but rather for those it saved, and ordered its doctors not to cooperate further. The lecture hall for her next seminar was empty.
"Although humiliated," she said, "I knew they could not stop everything that had been put in motion by the press." The hospital switchboard was overwhelmed with calls in reaction to the Life article; mail piled up and she was invited to speak at other colleges and universities.
Not that this helped Eva much. Dr. Kübler-Ross said she looked in on her years later and found her lying naked on a hospital bed, unable to speak, with an overhead light glaring in her eyes. "She pressed my hand as a way of saying hello, and pointed her other hand up toward the ceiling. I turned the light off and asked a nurse to cover Eva. Unbelievably, the nurse hesitated, and asked, 'Why?'" Dr. Kübler-Ross covered the patient herself. Eva died the next day.
"The way she died, cold and alone, was something I could not tolerate," Dr. Kübler-Ross said. Gradually, the medical profession came to accept her new approaches to treating the terminally ill.
From her patient interviews, Dr. Kübler-Ross identified five stages many patients go through in confronting their own deaths. Often denial is the first stage, when the patient is unwilling or unable to face his predicament. As his condition worsens and denial is impossible, the patient displays anger — the "why me?" stage. This is followed by a bargaining period ("Yes, I'm going to die, but if I diet and exercise, can I do it later?"). When the patient sees that bargaining won't work, depression often sets in. The final stage is acceptance, a passive period in which the patient is ready to let go.
Not all dying patients follow the same progression, said Dr. Kübler-Ross, but most experience two or more of these stages. Moreover, she found, people who are experiencing traumatic change in their lives, such as a divorce, often experience similar stages.
Another conclusion she reached was that an untraumatic acceptance of death came easiest to those who could look back and feel that they had lived honestly and had not wasted their lives.
In later years, Dr. Kübler-Ross's insistence that she could prove the existence of a serene afterlife drew fire from scientists and many lecture appearances were canceled. The center she built in California in the late 1970's burned, and the police suspected arson. She set up another center in 1984 in Virginia to care for children with AIDS; that center also was burned, in 1994, and arson was again suspected. After the second fire, she moved to Scottsdale, Ariz., to be near her son, Kenneth, a freelance photographer.
That year, when Dr. Ross was dying, he moved to a condominium in Scottsdale near Dr. Kübler-Ross, even though they were divorced. She and their son, Kenneth, cared for him. In addition to the son, Dr. Kübler-Ross is survived by a daughter, Barbara Ross, a clinical psychologist, of Wausau, Wis.; her brother Ernst, of Surrey, England; and her triplet sisters, Erika and Eva of Basel, Switzerland.
As Dr. Kübler-Ross awaited her own death, in a darkened room at her home in Arizona, she acknowledged that she was in pain and ready for her life to end. But she said, "I know beyond a shadow of a doubt that there is no death the way we understood it. The body dies, but not the soul."
Copyright © 2004 The New York Times Company
Wednesday, August 25, 2004
Elisabeth Kübler-Ross, MD, RIP
ANOTHER Modest Proposal
Another 527 Group? How about Dummies For Bush? W has defined himself as a dumbass: stupid, but cunning. His malapropisms abound and astound. W is proud of his mispronunciation of "nuclear." In his mind (using the term loosely), there is honor in mediocrity. Frat-boy jocularity passes for wit. I hate to think what Dr. Samuel Johnson would have said of W had W been born in another place (London) at another time (the 18th century). The good Dr. Johnson would have dismissed W as a babbling idiot. Now, who would join (and support) Dummies For Bush? They're everywhere. Anyone with a W bumper strip or a W lapel pin is a prime candidate. Not only does W define himself as a dummy; all of his supporters define themselves as dummies, too. This cognitive underclass elected W in 2000. May the Almighty have mercy on our souls if they give W 4 more years. If this is (fair & balanced) dismay, so be it.
[x Jerusalem Post]
In praise of mediocrity
by Bret Stephens
Friday, July 26, 2002 -- Poor Roman Hruska. In April 1970, the Nebraska Republican took to the floor of the Senate to defend the nomination of G. Harrold Carswell, a federal judge on the Fifth Circuit Court of Appeals, to the United States Supreme Court. It was a trying time. Questions had been raised about Carswell's commitment to civil rights. Betty Friedan testified that he was a sexist. The judge's rulings tended to be overturned on appeal, so he was considered an intellectual lightweight. And the administration was hurting, too. A defeat for Carswell would be the second consecutive rejection by the Senate of a Nixon Supreme Court nominee (the first was Clement Haynsworth), something the president could ill afford in the run-up to the midterm elections.
In this atmosphere, Hruska delivered the remark that would become his epitaph. "It has been held against this nominee that he is mediocre," he told the Senate chamber. "Even if he is mediocre there are a lot of mediocre judges and people and lawyers. They are entitled to a little representation, aren't they, and a little chance? We can't have all Brandeises, Cardozos and Frankfurters and stuff like that there."
Needless to say, Hruska's defense did not do much to help Carswell, who went down in a 51-45 vote and was later indicted for soliciting a male prostitute in a Florida shopping mall. Hruska's reputation fared little better. At the time of the nomination, the Nebraskan had been a senator for 16 years and held seats on both the appropriations and judiciary committees; he was known by colleagues as a "workhorse," a "senator's senator," and a contender for the GOP leadership. Yet becoming the champion of the mediocre man also made him the epitome of one, and he never lived it down. When he died, in 1999, every obituary of him reprinted the comment, about which Hruska himself was said to be much abashed.
WHICH IS A PITY, because Hruska was on to something.
Nobody, of course, wants to be thought of as mediocre, least of all as intellectually mediocre. Most of us, however, are. The typical person has an intelligence quotient somewhere in the vicinity of 100; very few of us have IQs that go much higher than 120. And yet, as in Garrison Keillor's mythical Lake Wobegon, we have persuaded ourselves that we're all "above average." We may not, after all, be all that clever. Yet the belief that we are, even if self-deceiving, is essential if we're to run our race in life in the hope of actually getting somewhere.
This is especially important today, in the West, where the work we do requires greater mental exertions than physical ones, and where success in our careers seems to be so closely correlated with having brains to spare.
In The Bell Curve: Intelligence and Class Structure in American Life, published in 1994, social scientists Richard Herrnstein and Charles Murray noted how "Modern societies identify the brightest youths with ever-increasing efficiency and then guide them into fairly narrow educational and occupational channels. These channels are increasingly lucrative and influential, leading to the development of a distinct stratum in the social hierarchy."
Compounding this trend, Herrnstein and Murray argued, is the fact that the members of this stratum tend to marry one another, passing on not just their looks, but also their smarts, to their children. The result was a kind of self-perpetuating and socially impregnable class of natural aristocrats.
On the other end of the spectrum, however, is a cognitive underclass that also intermarries, and also passes on its lower-grade intellectual traits along with the problems that go with them: 48% of those living in poverty come from the bottom fifth of the intelligence distribution, as do 66% of high-school dropouts and 62% of those in jail. Peering into the future, Murray and Herrnstein espied a two-tier society in which the upper tier would do "whatever is necessary to preserve the mansions on the hills from the menace of the slums below," if necessary by imposing a "custodial state" upon the lower tier.
IT'S A FRIGHTENING scenario, inspired, I suspect, by present-day realities in places like Mexico and Brazil. It also comes disturbingly close to describing Europe.
Today, "Europe" exists principally on two levels. There is the Europe of nations: Spaniards, Italians, Germans, Belgians, French, etcetera. And there is the Europe of Europeans: a rarified, English-speaking elite that moves comfortably between Paris and London and Frankfurt and Brussels.
Europeans in the first category tend to be fairly stationary, wedded to (often unionized) jobs as machinists or cabbies or petty bureaucrats, and reactionary in their politics.
Europeans in the second category are highly mobile, cultured and smart. Domestically, their politics are integrationist; they are the Europe of open borders, pan-European legislation and the euro. Internationally, their politics are liberal-multilateralist; they believe in "global solutions," the goodness of the UN, the efficacy of foreign aid and so on.
Dividing the groups are educational credentials, reflecting a system in which children are quickly streamed according to their cognitive abilities. In Germany, for example, children who test poorly typically end up in Hauptschule, which ends in the eighth grade and usually leads to some form of vocational training. Their slightly better qualified peers go to Realschule, which ends in the 10th grade, while the best qualified are sent to a 13-year Gymnasium, leading to university education.
The upshot is that by the time most Europeans are 20, their life chances have pretty much been determined, and the crossover possibilities are slim. For many, of course, this bifurcation merely reflects the truth of who they'll ever be. But anyone who's spent time in Europe and encountered train conductors fluent in four languages, or barmen reading Dickens between pulls at the tap, knows how much human capital is wasted in the bargain. Europe, too, has its late bloomers, those whose real talents simply couldn't be gauged in a grade-school exam. But their potential goes untapped.
Meanwhile, their elite counterparts move in a very different direction: to the European Commission, to the law, to the civil services of their home countries, or to multinational companies such as McKinsey or JP Morgan. These are each, in their way, mandarinates, in which advancement is won by way of tests, connections, performance reviews and the like: essentially, an extension of the educational systems from which they came. Taken together, they reinforce the mentality that working within existing systems, rather than creating new ones, is the surest path to success. And so the corporatist mentality perpetuates itself.
FORTUNATELY, matters are otherwise in the United States.
In his first annual message to Congress, Abraham Lincoln made the essential point: "There is not," he said, "of necessity, any such thing as the free hired laborer being fixed to that condition for life.... The prudent, penniless beginner in the world, labors for wages awhile, saves a surplus with which to buy tools or land for himself; then labors on his own account another while, and at length hires another new beginner to help him. This is the just, and generous, and prosperous system, which opens the way to all - gives hope to all, and consequent energy, and progress, and improvement of condition to all."
Lincoln was speaking at a time when America was a predominantly agricultural society, when sheer sweat still counted for a lot. And yet the message contains the basic premise of American-style capitalism: that, much more than smarts, hard work, prudence, initiative and independence ultimately are what determine success and failure in America. Even today, that remains largely true.
Consider the state of American education. In 1983, the National Commission on Excellence in Education issued a report entitled A Nation at Risk. It warned of a "rising tide of mediocrity that threatens our very future as a nation and as a people." Among other "indicators of risk," it noted that average verbal SAT scores had fallen by over 50 points between 1963 and 1980; that international comparisons of student achievement had American children falling behind in almost every discipline; that 13% of all 17-year-olds were functionally illiterate. The report also insisted that "education is the major foundation for the future strength of this country," and that without full-fledged educational reform "our once unchallenged preeminence in commerce, industry, science and technological innovation" would be overtaken.
But as Roman Hruska's old friend Bob Dole might have said: Whatever.
Nearly 20 years after the report was issued, American preeminence in commerce, industry, science and technological innovation - and just about everything else, for that matter - remains unchallenged. This is not because America's educational system has turned a corner; by several indicators it has become worse. Rather, it's because educational achievement turns out not to be the great predictor of success that it is in Europe.
Graduates of Chicago, Harvard, and Yale will in all likelihood do well in life, and do so in fields such as medicine and law, where a high IQ and a decent education are essential. Yet that does not mean the more poorly schooled are at too serious a disadvantage. H. Lee Scott, Jr., the president and CEO of Wal-Mart, is a graduate of Pittsburg State University; AT&T's Michael Armstrong attended Ohio's Miami University; GE's Jack Welch went to the University of Massachusetts. All three men are surely bright, but no less important are their immense drive, basic horse sense and willingness to take risks. None of these virtues is easily acquired by attending a top-flight school; indeed, such schooling may even discourage them.
Then too, beyond entry-level offers, few US employers care very much about academic qualifications. What matters is a positive attitude toward work, "people skills," and the ability to get the job done. And in a predominantly service economy, a mediocre IQ works just fine, especially since the advent of widespread computing technology diminishes the need for all but basic numeracy. Being, say, a franchise manager, or a personal trainer, or a sales executive does not require a better-than-average intellect. Yet each of these jobs can command a comfortable middle-class income.
Finally, political success in America is an area where intellectual brilliance counts for relatively little. Roman Hruska may have been an intellectual mediocrity, but he was a four-term senator with a significant record of legislative success. Gerald Ford was famously unbright, but his decency carried him far and did the country good. Ditto for Franklin Roosevelt ("a second-class intellect, but a first-class temperament," as Oliver Wendell Holmes, Jr. so famously put it), Ronald Reagan, and, perhaps, the office's current occupant. By contrast, America's most brilliant presidents - from Herbert Hoover the wonder boy to Jimmy Carter the nuclear engineer - have notoriously been among America's worst.
IN THE NICOMACHEAN ETHICS, Aristotle famously defines intelligence as the sum of moral virtues. By contrast, mere cleverness - IQ, in today's parlance - does not even rank as a virtue, since it contributes neither to goodness nor wisdom, and may be employed toward opposite ends.
The greatness of the United States lies in the fact that, over time, it has tended to place a higher value on ordinary decency than on extraordinary cleverness. The Soviet Union, after all, richly rewarded its greatest talents, as does Europe today. By contrast, America has thrived because it created an environment in which intellectual mediocrities could also prosper, in which their limited capacities for intellectual development would not stand in the way of their ambition so long as they were willing to play by the rules and cultivate the right habits of mind and heart.
In his commencement speech at Yale last year, President George W. Bush offered graduates the following wisdom: "To those of you who received honors, awards and distinctions, I say, well done. And to the C students - I say, you, too, can be president of the United States."
Amen.
Bret Stephens is editor of the Jerusalem Post.
Copyright © 2002 Jerusalem Post. All Rights Reserved