Sunday, September 05, 2010

Hit Or Myth?

WTF? We use more than 10% of our brains? It's better to repress anger? Low self-esteem isn't a problem? Human memory isn't a video cam? Hypnosis isn't a trance? The polygraph test can lie? Opposites don't attract? Schizophrenics don't have multiple personalities? Full moons don't cause crazy behavior? And most insanity pleas aren't successful? Would our films and TV shows misinform us? If this is (fair & balanced) disillusionment, so be it.

(For the version of this article with citations of sources as endnotes, click here.)


[x Skeptic]
Top Ten Myths Of Popular Psychology
By Scott O. Lilienfeld, Steven Jay Lynn, John Ruscio, and [the late] Barry L. Beyerstein


Virtually every day, the news media, television shows, films, and the Internet bombard us with claims regarding a host of psychological topics: psychics, out-of-body experiences, recovered memories, and lie detection, to name a few. Even a casual stroll through our neighborhood bookstore reveals dozens of self-help, relationship, recovery, and addiction books that serve up generous portions of advice for steering our paths along life’s rocky road. Yet many popular psychology sources are rife with misconceptions. Indeed, in today’s fast-paced world of information overload, misinformation about psychology is at least as widespread as accurate information. Self-help gurus, television talk show hosts, and self-proclaimed mental health experts routinely dispense psychological advice that is a bewildering mix of truths, half-truths, and outright falsehoods. Without a dependable tour guide for sorting out psychological myth from reality, we’re at risk of becoming lost in a jungle of “psychomythology.”

In our new book, 50 Great Myths of Popular Psychology: Shattering Widespread Misconceptions About Human Nature (2010), we examine in depth 50 widespread myths in popular psychology (along with approximately 250 other myths and “mini-myths”), present research evidence demonstrating that these beliefs are fictional, explore their ramifications in popular culture and everyday life, and trace their psychological and sociological origins. Here, pace David Letterman, we present (in no particular order) our own candidates for the top 10 myths of popular psychology.

Myth #1: We Only Use 10% of Our Brains

Whenever those of us who study the brain venture outside the Ivory Tower to give public lectures, one of the questions we’re most likely to encounter is, “Is it true that we only use 10% of our brains?” The look of disappointment that usually follows when we respond, “Sorry, I’m afraid not,” suggests that the 10% myth is one of those hopeful truisms that refuses to die because it would be so nice if it were true. In one study, when asked “About what percentage of their potential brain power do you think most people use?” a third of psychology majors answered 10%. Remarkably, one survey revealed that even 6% of neuroscientists agreed with this claim! The pop psychology industry has played a big role in keeping this myth alive. For example, in his book How to Be Twice as Smart (1983), Scott Witt wrote that “If you’re like most people, you’re using only ten percent of your brainpower.”

There are several reasons to doubt that 90% of our brains lie silent. At a mere 2–3% of our body weight, our brain consumes over 20% of the oxygen we breathe. It’s implausible that evolution would have permitted the squandering of resources on a scale necessary to build and maintain such a massively underutilized organ. Moreover, losing far less than 90% of the brain to accident or disease almost always has catastrophic consequences. Likewise, electrical stimulation of sites in the brain during neurosurgery has failed to uncover any “silent areas.”

How did the 10% myth get started? One clue leads back about a century to psychologist William James, who once wrote that he doubted that average persons achieve more than about 10% of their intellectual potential. Although James talked in terms of underdeveloped potential, a slew of positive thinking gurus transformed “10% of our capacity” into “10% of our brain.” In addition, in calling a huge percentage of the human brain “silent cortex,” early investigators may have fostered the mistaken impression that what scientists now call “association cortex” — which is vitally important for language and abstract thinking — had no function. In a similar vein, early researchers’ admissions that they didn’t know what 90% of the brain did probably fueled the myth that it does nothing. Finally, although one frequently hears claims that Albert Einstein once explained his own brilliance by reference to the 10% myth, there’s no evidence that he ever uttered such a statement.

Myth #2: It’s Better to Express Anger Than to Hold It In

If you’re like most people, you believe that releasing anger is healthier than bottling it up. In one survey, 66% of undergraduates agreed that expressing pent-up anger — sometimes called “catharsis” — is an effective means of reducing one’s risk for aggression. A host of films stoke the idea that we can tame our anger by “letting off steam” or “getting things off our chest.” In the 2003 film "Anger Management," after the meek hero (Adam Sandler) is falsely accused of “air rage” on a flight, a judge orders him to attend an anger management group run by Dr. Buddy Rydell (Jack Nicholson). At Rydell’s suggestion, Sandler’s character plays dodgeball with schoolchildren and throws golf clubs. Dr. Rydell’s advice echoes the counsel of many self-help authors. John Lee suggested that rather than “holding in poisonous anger,” it’s better to “Punch a pillow or a punching bag.” Some psychotherapies encourage clients to scream or throw balls against walls when they become angry. Proponents of “primal scream therapy” believe that psychologically troubled adults must discharge the emotional pain produced by infant trauma, often by screaming at the top of their lungs.

Yet more than 40 years of research reveals that expressing anger directly toward another person or indirectly toward an object actually turns up the heat on aggression. In an early study, people who pounded nails after someone insulted them were more critical of that person. Moreover, playing aggressive sports like football results in increases in aggression, and playing violent videogames like "Manhunt," in which participants rate bloody assassinations on a 5-point scale, is associated with heightened aggression. Research suggests that expressing anger is helpful only when it’s accompanied by constructive problem-solving designed to address the source of the anger.

Why is this myth so popular? In all likelihood, people mistakenly attribute the fact that they feel better after expressing anger to catharsis, rather than to the fact that anger usually subsides on its own after a while.

Myth #3: Low Self-Esteem Is a Major Cause of Psychological Problems

Many popular psychologists have long maintained that low self-esteem is a prime culprit in generating unhealthy behaviors, including violence, depression, anxiety, and alcoholism. From Norman Vincent Peale’s 1952 The Power of Positive Thinking onward, self-help books proclaiming the virtues of self-esteem have become regular fixtures in bookstores. In his best-seller, The Six Pillars of Self-Esteem (1994), Nathaniel Branden insisted that one “cannot think of a single psychological problem — from anxiety and depression, to fear of intimacy or of success, to spouse battery or child molestation — that is not traceable to the problem of low self-esteem.”

The self-esteem movement has found its way into mainstream educational practices. Some athletic leagues award trophies to all schoolchildren to avoid making losing competitors feel inferior. One elementary school in California prohibited children from playing tag because the “children weren’t feeling good about it.” Moreover, the Internet is chock full of educational products intended to boost children’s self-esteem. One book, Self-Esteem Games (1998), contains 300 activities to help children feel good about themselves, such as repeating positive affirmations emphasizing their uniqueness.

But there’s a fly in the ointment: Research shows that low self-esteem isn’t strongly associated with poor mental health. In a comprehensive review, Roy Baumeister and his colleagues canvassed over 15,000 studies linking self-esteem to just about every conceivable psychological variable. They found that self-esteem is minimally related to interpersonal success, and not consistently related to alcohol or drug abuse. Moreover, they discovered that although self-esteem is positively associated with school performance, better school performance appears to contribute to high self-esteem rather than the other way around. Perhaps most surprising of all, they found that “low self-esteem is neither necessary nor sufficient for depression.”

Myth #4: Human Memory Works Like a Video Camera

Despite the sometimes all-too-obvious failings of everyday memory, surveys show that many people believe that their memories operate very much like videotape recorders. About 36% of us believe that our brains preserve perfect records of everything we’ve experienced. In one survey of undergraduates, 27% agreed that memory operates like a tape recorder. Even most psychotherapists agree that memories are fixed more or less permanently in the mind.

It’s true that we often recall extremely emotional events, sometimes called flashbulb memories because they seem to have a photographic quality. Nevertheless, research shows that even these memories wither over time and are prone to distortions. Consider an example from Ulric Neisser and Nicole Harsch’s study of memories regarding the disintegration of the space shuttle Challenger. A student at Emory University provided the first description 24 hours after the disaster, and the second account two and a half years later.

Description 1. “I was in my religion class and some people walked in and started talking about [it]. I didn’t know any details except that it had exploded and the schoolteacher’s students had all been watching which I thought was so sad. Then after class I went to my room and watched the TV program talking about it and I got all the details from that.”

Description 2. “When I first heard about the explosion I was sitting in my freshman dorm room with my roommate and we were watching TV. It came on a news flash and we were both totally shocked. I was really upset and I went upstairs to talk to a friend of mine and then I called my parents.”

Clearly, there are striking discrepancies between the two memories. Neisser and Harsch found that about one-third of students’ reports contained large differences across the two time points. Similarly, Heike Schmolck and colleagues compared participants’ ability to recall the 1995 acquittal of former football star O. J. Simpson 3 days after the verdict and again many months later. After 32 months, 40% of the memory reports contained “major distortions.”

Today, there’s broad consensus among psychologists that memory isn’t reproductive — it doesn’t duplicate precisely what we’ve experienced — but reconstructive. What we recall is often a blurry mixture of accurate and inaccurate recollections, along with what jells with our beliefs and hunches. Indeed, researchers have created memories of events that never happened. In the “shopping mall study,” Elizabeth Loftus created a false memory in Chris, a 14-year-old boy. Loftus instructed Chris’s older brother to present Chris with a false story of being lost in a shopping mall at age 5, and she instructed Chris to write down everything he remembered. Initially, Chris reported very little about the false event, but over a two-week period, he constructed a detailed memory of it. A flood of similar studies followed, showing that in 18–37% of participants, researchers can implant false memories of such events as serious animal attacks, knocking over a punchbowl at a wedding, getting one’s fingers caught in a mousetrap as a child, witnessing a demonic possession, and riding in a hot air balloon with one’s family.

Myth #5: Hypnosis Is a Unique “Trance” State Differing in Kind from Wakefulness

Popular movies and books portray the hypnotic trance state as so powerful that otherwise normal people will commit an assassination ("The Manchurian Candidate"); commit suicide ("The Garden Murders"); perceive only a person’s internal beauty ("Shallow Hal"); and (our favorite) fall victim to brainwashing by alien preachers who use messages embedded in sermons ("Invasion of the Space Preachers"). Survey data show that public opinion resonates with these media portrayals: 77% of college students endorsed the statement that “hypnosis is an altered state of consciousness, quite different from normal waking consciousness,” and 44% agreed that “A deeply hypnotized person is robot-like and goes along automatically with whatever the hypnotist suggests.”

But research shows that hypnotized people can resist and even oppose hypnotic suggestions, and won’t do things that are out of character, like harming people they dislike. In addition, hypnosis bears no more than a superficial resemblance to sleep: Brain wave studies reveal that hypnotized people are wide awake. What’s more, individuals can be just as responsive to suggestions administered while they’re exercising on a stationary bicycle as they are following suggestions for sleep and relaxation. In the laboratory, we can reproduce all of the phenomena that laypersons associate with hypnosis (such as hallucinations and insensitivity to pain) using suggestions alone, with no mention of hypnosis. Evidence of a distinct trance unique to hypnosis would require physiological markers that accompany subjects’ responses to the suggestion to enter a trance. Yet no consistent evidence of this sort has emerged.

Hypnosis appears to be only one procedure among many for increasing people’s responses to suggestions.

Myth #6: The Polygraph Test Is an Accurate Means of Detecting Lies

Have you ever told a lie? If you answered “no,” you’re lying. College students admit to lying in about one in every three social interactions, and people in the community in about one in every five. Not surprisingly, investigators have long sought foolproof means of detecting falsehoods. In the 1920s, psychologist William Moulton Marston invented the first polygraph or so-called “lie detector” test, which measured systolic blood pressure to detect deception. He later created one of the first female cartoon superheroes, Wonder Woman, who could compel villains to tell the truth by ensnaring them in a magic lasso. For Marston, the polygraph was the equivalent of Wonder Woman’s lasso: an infallible detector of the truth.

A polygraph machine plots physiological activity — such as skin conductance, blood pressure, and respiration — on a continuously running chart. Contrary to the impression conveyed in such movies as "Meet the Parents," the machine isn’t a quick fix for telling whether someone is lying, although the public’s desire for such a fix almost surely contributes to the polygraph’s popularity. In one survey of introductory psychology students, 45% believed that the polygraph “can accurately identify attempts to deceive.” Yet interpreting a polygraph chart is notoriously difficult.

For starters, there are large differences among people in their levels of physiological activity. An honest examinee who tends to sweat a lot might mistakenly appear deceptive, whereas a deceptive examinee who tends to sweat very little might mistakenly appear truthful. Moreover, as David Lykken noted, there’s no evidence for a “Pinocchio response,” that is, an emotional or physiological reaction uniquely indicative of deception. If a polygraph chart shows more physiological activity when the examinee responds to questions about a crime than to irrelevant questions, at most this difference tells us that the examinee was more nervous at those moments. Yet this difference could be due to actual guilt, indignation or shock at being unjustly accused, or the realization that one’s responses to questions about the crime could lead to being fired, fined, or imprisoned. Thus, polygraph tests suffer from a high rate of “false positives” — innocent people whom the test deems guilty. As a consequence, the “lie detector” test is misnamed: It’s really an arousal detector. Conversely, some individuals who are guilty may not experience anxiety when telling lies. For example, psychopaths are notoriously immune to fear and may be able to “beat” the test in high pressure situations, although the research evidence for this possibility is mixed.
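
To see why false positives pile up, it helps to run the base-rate arithmetic. Here is a minimal sketch in Python; every figure in it (the size of the screening pool, the share of genuine liars, the hit and false-alarm rates) is a hypothetical assumption chosen for illustration, not a finding from polygraph research:

```python
# Back-of-the-envelope base-rate arithmetic (all numbers hypothetical).
examinees = 1000          # people screened
liars = 10                # actually deceptive examinees
truth_tellers = examinees - liars

sensitivity = 0.80        # assumed share of liars the chart flags
false_alarm_rate = 0.15   # assumed share of nervous-but-honest examinees flagged

true_positives = liars * sensitivity                  # 8 liars caught
false_positives = truth_tellers * false_alarm_rate    # ~149 honest people flagged

flagged = true_positives + false_positives
print(f"Flagged as deceptive: {flagged:.0f}")
print(f"Honest share of those flagged: {false_positives / flagged:.0%}")
```

Under these made-up but not unreasonable assumptions, roughly 95% of the people the chart brands as liars are in fact telling the truth, which is just what one would expect from an arousal detector applied to a mostly innocent population.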

Were he still alive, William Moulton Marston might be disappointed to learn that researchers have yet to develop the psychological equivalent of Wonder Woman’s magic lasso. For at least the foreseeable future, the promise of a perfect lie detector remains the stuff of comic book fantasy.

Myth #7: Opposites Attract

The notion that “opposites attract” is a standard part of our cultural landscape. Films, novels, and TV sitcoms overflow with stories of diametrical opposites falling passionately in love. The 2007 smash hit comedy "Knocked Up" is perhaps Hollywood’s latest installment in its seemingly never-ending parade of wildly mismatched romantic pairings. Most of us are convinced that people who are opposite from each other in their personalities, beliefs, and looks tend to be attracted to each other. Lynn McCutcheon found that 77% of undergraduates agreed that opposites attract in relationships. This belief is also widespread in pockets of the Internet dating community. On one site called “Soulmatch,” Harville Hendrix, Ph.D. (described as a “relationships expert”) states that “It’s been my experience that only opposites attract because that’s the nature of reality. The great myth in our culture is that compatibility is the grounds for a relationship — actually, compatibility is grounds for boredom.”

On the contrary, research suggests that Hendrix has gotten his myths precisely backward. When it comes to interpersonal relationships, opposites don’t attract. Dozens of studies demonstrate that people with similar personality traits are more likely to be attracted to and hang out with each other than people with dissimilar personality traits. For example, people with a Type A personality style, who are hard-driving, competitive, and time-conscious, prefer dating partners who have a Type A personality. Similarity in personality traits predicts not only initial attraction, but marital stability and happiness. Similarity on the personality trait of conscientiousness seems to be especially important for marital satisfaction. So if you’re a hopelessly messy person, you’re best off finding someone who isn’t a total neat freak. The “like attracts like” conclusion also extends to our attitudes and values. The more similar someone’s attitudes (for example, political views) are to ours, the more we tend to like that person.

Myth #8: People with Schizophrenia Have Multiple Personalities

A prevalent misconception is that schizophrenia is the same thing as “split personality” or “multiple personality disorder.” A popular bumper sticker, for example, reads: “I was schizophrenic once, but we’re better now.” The schizophrenia-multiple personality misconception is widespread. In one survey, 77% of introductory psychology students agreed that “a schizophrenic is someone with a split personality.” The 2000 comedy film, "Me, Myself, and Irene," starring Jim Carrey, features a man supposedly suffering from schizophrenia. Yet he actually suffers from a split personality, with one personality who’s mellow and another who’s aggressive.

In fact, schizophrenia differs sharply from the diagnosis of dissociative identity disorder (DID), once called multiple personality disorder. Unlike people with schizophrenia, people with DID supposedly harbor two or more distinct “alters” — personalities or personality states — within them at the same time. Robert Louis Stevenson’s 1886 novel, The Strange Case of Dr. Jekyll and Mr. Hyde, is probably the best known illustration of multiple personality in popular literature. Nevertheless, many psychologists find the assertion that DID patients possess distinct and fully formed personalities to be doubtful. It’s far more likely that these patients are displaying different, but exaggerated, aspects of a single personality.

The schizophrenia-DID myth probably stems in part from confusion in terminology. Swiss psychiatrist Eugen Bleuler coined the term “schizophrenia,” meaning “split mind,” in the early 20th century, and many writers soon misinterpreted Bleuler’s definition. By schizophrenia, Bleuler meant that people suffer from a “splitting” within and between their psychological functions, especially emotion and thinking. For most of us, what we feel and think at one moment corresponds to what we feel and think at the next. Yet in the severe psychotic disorder of schizophrenia, these linkages are ruptured. As Bleuler observed, people with schizophrenia don’t harbor more than one co-existing personality; they possess a single personality that’s been shattered.

Regrettably, many people in the general public don’t appreciate the fact that schizophrenia is often a profoundly disabling condition associated with a heightened risk for suicide, clinical depression, anxiety disorders, substance abuse, unemployment, and homelessness. As Irving Gottesman noted, “everyday misuse of the terms schizophrenia or schizophrenic to refer to the foreign policy of the United States, the stock market, or any other disconfirmation of one’s expectations does an injustice to the enormity of the public health problems and profound suffering associated with this most puzzling disorder of the human mind.”

Myth #9: Full Moons Cause Crimes and Craziness

Once every 29.53 days on average, an event of rather trivial astronomical significance occurs. But according to some writers, it’s an event of enormous psychological significance. What is it? A full moon. Over the decades, authors have linked the full moon to a host of phenomena: strange behaviors, psychiatric hospital admissions, suicides, traffic accidents, crimes, heavy drinking, dog bites, births, crisis calls to emergency rooms, violence by hockey players... the list goes on and on.

The word “lunatic” derives from the Latin term luna, or moon. Legends of werewolves and vampires, terrifying creatures that supposedly often emerged during full moons, date back at least to the ancient Greeks, and were popular in Europe during much of the Middle Ages. In 19th-century England, some lawyers used a “not guilty by reason of the full moon” defense to acquit clients of crimes committed during full moons.

Even today, the notion that the full moon is tied to strange occurrences — the “Lunar Effect” or “Transylvania Effect” — is deeply embedded in popular culture. One study revealed that up to 81% of mental health professionals believe in the lunar effect, and a study of nurses demonstrated that 69% believe that full moons are associated with an increase in patient admissions. In 2007, Brighton, England, instituted a policy to place more police officers on the beat during full moon nights.

Psychiatrist Arnold Lieber popularized the idea of a correlation between the full moon and behavior. For Lieber, the lunar effect stems mostly from the fact that the human body is four-fifths water. Because the moon affects the tides of the earth, he reasoned, it’s plausible that the moon would also affect the brain, which is, after all, part of the body. Yet as astronomer George Abell noted, a mosquito sitting on your arm would exert a more powerful gravitational force on your body than would the moon. Furthermore, the moon’s tides are influenced not by its phase — that is, by how much of it is visible from earth — but by its distance from earth. Indeed, during a “new moon,” the phase at which the moon is invisible to us on earth, it exerts just as much gravitational influence as it does during a full moon.
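
Abell’s comparison is easy to check with rough numbers. The sketch below is ours, not the authors’; the masses and distances are assumptions chosen for illustration, and since Lieber’s argument runs through the tides, the physically meaningful lunar quantity is the differential (tidal) pull across the body, which falls off with the cube of distance:

```python
# Rough check of the mosquito-versus-moon comparison.
# Point-mass approximations throughout; every value is an illustrative assumption.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_MOON = 7.35e22     # mass of the moon, kg
R_MOON = 3.84e8      # average earth-moon distance, m
PERSON = 70.0        # assumed body mass, kg
HEIGHT = 1.7         # assumed body height, m (sets the tidal "stretch")
MOSQUITO = 2.5e-6    # assumed mosquito mass, kg (about 2.5 mg)
D_MOSQ = 0.01        # assumed mosquito-to-body distance, m

moon_pull = G * M_MOON * PERSON / R_MOON**2         # direct lunar pull
mosquito_pull = G * MOSQUITO * PERSON / D_MOSQ**2   # direct mosquito pull
# Tidal (differential) pull across the body, the quantity that drives tides:
moon_tidal = 2 * G * M_MOON * PERSON * HEIGHT / R_MOON**3

print(f"Moon, direct pull:  {moon_pull:.1e} N")     # ~2e-3 N
print(f"Moon, tidal pull:   {moon_tidal:.1e} N")    # ~2e-11 N
print(f"Mosquito, direct:   {mosquito_pull:.1e} N") # ~1e-10 N
```

Under these assumptions the mosquito’s pull exceeds the moon’s tidal pull several times over (though not the moon’s direct pull); either way, the forces involved are many orders of magnitude too small to matter to a brain.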

In 1985, two psychologists reviewed all available research evidence on the lunar effect, and found no evidence that the full moon is related to much of anything — crimes, suicides, psychiatric problems, psychiatric hospital admissions, or calls to crisis centers. Later investigators examined whether the full moon is linked to suicides, psychiatric hospital admissions, dog bites, or emergency room visits, and came up empty-handed.

What psychologists term the “fallacy of positive instances” may help to explain the persisting popularity of belief in the lunar effect. When an event confirms our hunches, we tend to take special note of it and recall it. In contrast, when an event disconfirms our hunches, we tend to ignore or reinterpret it. So when there’s a full moon and something out of the ordinary happens, say, a surge of admissions to the local psychiatric hospital, we’re likely to remember it and tell others about it. In contrast, when there’s a full moon and nothing unusual happens, we typically overlook or discount it. In one study, psychiatric hospital nurses who believed in the lunar effect wrote more notes about patients’ strange behavior during a full moon than did nurses who didn’t believe in the lunar effect. The nurses attended more to events that confirmed their hunches, which in turn probably bolstered those hunches.

Myth #10: A Large Proportion of Criminals Successfully Use the Insanity Defense

After giving a speech on the morning of March 30, 1981, President Ronald Reagan emerged from the Washington Hilton hotel. Seconds later, six shots rang out. One hit a Secret Service agent, one hit a police officer, another hit the President’s press secretary James Brady, and another hit the President himself. The would-be assassin was a delusional 26-year-old man named John Hinckley, who had fallen in love from a distance with actress Jodie Foster and become convinced that by killing the President he could make Foster reciprocate his feelings. In 1982, following a trial featuring dueling psychiatric experts, the jury found Hinckley not guilty by reason of insanity. The jury’s decision triggered an enormous public outcry; an ABC News poll revealed that 76% of Americans objected to the verdict.

Surveys show that most Americans believe that criminals often use the insanity defense as a loophole to escape punishment. One study revealed that the average layperson believes that the insanity defense is used in 37% of felony cases, and that this defense is successful 44% of the time. This survey also demonstrated that the average layperson believes that 26% of insanity acquittees are set free, and that these acquittees spend only about 22 months in a mental hospital following their trials. Many politicians share these perceptions. One study revealed that politicians in Wyoming believed that 21% of accused felons had used the insanity defense, and that they were successful 40% of the time. In 1973, President Richard Nixon made the abolition of the insanity defense the centerpiece of his effort to fight crime.

Yet these perceptions of the insanity defense are wildly inaccurate. Data indicate that this defense is raised in less than 1% of criminal trials and that it’s successful only about 25% of the time. For example, in the state of Wyoming between 1970 and 1972, a grand total of 1 (!) accused felon successfully pled insanity. Members of the general public also overestimate how many insanity acquittees are set free; the true proportion is only about 15%. Moreover, the average insanity acquittee spends between 32 and 33 months in a psychiatric hospital, considerably longer than the public estimates. In fact, criminals acquitted on the basis of an insanity verdict typically spend at least as long in an institution (such as a psychiatric hospital) as criminals who are convicted.
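
The gap between perception and reality is stark enough to be worth quantifying. Using only the percentages quoted above (and reading “less than 1%” generously as 1%), a toy calculation of ours shows the size of the overestimate:

```python
# Perceived vs. actual rates of successful insanity pleas,
# using the figures quoted in the text (a toy calculation, nothing more).
perceived_raised, perceived_success = 0.37, 0.44
actual_raised, actual_success = 0.01, 0.25   # "<1%" treated as 1%, an upper bound

perceived = perceived_raised * perceived_success   # acquittals per felony case, as imagined
actual = actual_raised * actual_success            # acquittals per felony case, at most

print(f"Perceived insanity acquittals per 100 felony cases: {perceived * 100:.1f}")
print(f"Actual insanity acquittals per 100 felony cases:   <= {actual * 100:.2f}")
print(f"Overestimation factor: at least {perceived / actual:.0f}x")
```

By the public’s own numbers, successful insanity pleas should occur in roughly 16 of every 100 felony cases; the data put the true figure at no more than about 1 in 400, an overestimate by a factor of 65 or more.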

How did these misperceptions of the insanity defense arise? We Americans live increasingly in a “courtroom culture.” Between Court TV, "CSI," "Law and Order," and CNN’s "Nancy Grace," we’re continually inundated with information about the legal system. Nevertheless, this information can be deceptive, because the media devote considerably more coverage to legal cases in which the insanity defense is successful, like Hinckley’s, than to those in which it isn’t. As is so often the case, the best antidote to public misperception is accurate knowledge. Lynn and Lauren McCutcheon found that a brief fact-based report on the insanity defense, compared with a news program on crime featuring this defense, produced a significant decrease in undergraduates’ misconceptions about it. These findings give us cause for hope, as they suggest that it may take only a small bit of information to overcome misinformation.

We can all be fooled by psychomythology, largely because so many popular misconceptions dovetail with our intuitive hunches. As a consequence, we must turn to scientific reasoning, which is a set of safeguards against the tendency to confirm our initial beliefs, to evaluate whether the claims of the pop psychology industry pass muster. The good news is that by continually scrutinizing and questioning popular psychology claims with scientific thinking and scientific evidence, we can come to a better understanding of our mental worlds and make better everyday life decisions. Ω

[Scott O. Lilienfeld is a Professor of Psychology at Emory University, editor-in-chief of the Scientific Review of Mental Health Practice, and past president of the Society for a Science of Clinical Psychology. His principal areas of interest include personality disorders, psychiatric classification, evidence-based practice in clinical psychology, and science and pseudoscience. Lilienfeld received his B.A. in Psychology from Cornell University and his Ph.D. in Clinical Psychology from the University of Minnesota.

Steven Jay Lynn is a Professor of Psychology at Binghamton University (SUNY), the director of the Psychological Clinic and the Center for Evidence-Based Therapy, and a diplomate in clinical and forensic psychology (ABPP). He is the author of more than 270 books, chapters, and articles on science versus pseudoscience, hypnosis, memory, dissociation, and psychological trauma. Lynn received a B.A. in Psychology from the University of Michigan-Ann Arbor and his Ph.D. in Clinical Psychology from Indiana University-Bloomington.

John Ruscio is an Associate Professor of Psychology at The College of New Jersey. His interests include quantitative methods for social and behavioral science research and characteristics distinguishing science from pseudoscience. Ruscio received a B.A. in psychology from the University of Massachusetts-Amherst and both an M.A. and a Ph.D. in Social/Developmental Psychology from Brandeis University.

Barry L. Beyerstein was Professor of Psychology at Simon Fraser University, and an internationally recognized expert on myths about brain functioning. Barry passed away in 2007 at the age of 60, and we dedicate this article to his memory and extraordinary contributions to skepticism. Beyerstein received his B.A. from Simon Fraser University and a Ph.D. in Experimental and Biological Psychology from the University of California at Berkeley.]

Copyright © 2010 Skeptic


Creative Commons License
Sapper's (Fair & Balanced) Rants & Raves by Neil Sapper is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. Based on a work at sapper.blogspot.com. Permissions beyond the scope of this license may be available here.

Copyright © 2010 Sapper's (Fair & Balanced) Rants & Raves